
Why indexing volume decreases

May 24, 2019, 5:37 PM

Reasons why the number of indexed pages decreases: 1. the website's positioning is inaccurate; 2. the website has undergone a major redesign; 3. the website's content has little value; 4. the external links are of low quality. When building a website, avoid changing the URL structure arbitrarily, and do not publish low-value content.


Getting a website indexed is a problem every webmaster must solve. I have explained before why Baidu might not index a site at all, but in practice the total number of indexed pages can also decline gradually, which is clearly movement in the wrong direction. Everyone who builds a website has a traffic model in mind: indexed pages are directly proportional to traffic. Once indexing declines, you should examine whether the problem lies with the website itself or with your usual optimization methods.

I believe many people run into this problem; today I also saw someone asking about it in a QQ group, so I will share the reasons for a decline in website indexing here.

Four reasons for a decrease in indexed pages:

First, the website's positioning is inaccurate

While a site is in operation, you may suddenly find that what you are publishing has drifted from the site's theme, so you make changes. Every modification, however small, causes fluctuations in Baidu's database; the result of those fluctuations is a redistribution of weight, which affects rankings. The webmaster is left anxiously second-guessing Baidu.

If you frequently trigger Baidu's review mechanism, the site will be placed directly into an observation period. No matter how hard you work, you will not see results immediately, and during that time a drop in the total number of indexed pages is normal.

Second, the website has been significantly redesigned

A redesign inevitably changes the site's structure, whether to standardize all of the site's content or to present better content to users; redesign is an important way for many webmasters to reinvent their sites. After a redesign, if the internal links have not yet been updated to the new URLs, the spider will hit many 404 pages when it re-crawls. The crawl path is interrupted, less fresh content gets crawled, and the number of indexed pages drops.
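The 404 problem described above can be checked before a spider ever finds it. As a minimal sketch (the page paths below are hypothetical), compare the internal links your pages still emit against the paths that actually resolve after the redesign:

```python
def find_dead_links(existing_pages, internal_links):
    """Return internal links pointing at paths that no longer exist
    after a redesign, i.e. the 404s a crawling spider would hit."""
    return sorted(set(internal_links) - set(existing_pages))

# Hypothetical example: the redesign moved posts under /blog/
live = {"/blog/post-1/", "/blog/post-2/", "/about/"}
links = ["/blog/post-1/", "/old/post-2.html", "/about/", "/old/post-3.html"]
print(find_dead_links(live, links))  # → ['/old/post-2.html', '/old/post-3.html']
```

Any path the function reports is a link you should update before (or while) the spider re-crawls.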

The advice here is: never change the URL structure arbitrarily. Even if you follow up with 301 redirects immediately, you will still take some loss; it is only a question of how much. The current best practice is the site-revision tool on Baidu Webmaster Platform: let the old and new content coexist for a while before setting up the 301s.
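When URLs must change anyway, the coexistence period should end with permanent redirects in place. A small sketch that generates nginx-style 301 rules from an old-to-new URL map (the paths are made up for illustration):

```python
def make_301_rules(url_map):
    """Generate nginx rewrite rules that send each old path to its new
    location with a permanent (301) redirect."""
    return "\n".join(
        f"rewrite ^{old}$ {new} permanent;"
        for old, new in sorted(url_map.items())
    )

# Hypothetical mapping from pre-redesign URLs to the new structure
print(make_301_rules({
    "/old/post-1.html": "/blog/post-1/",
    "/old/post-2.html": "/blog/post-2/",
}))
```

This is a sketch only: nginx treats the pattern as a regular expression, so in real rules metacharacters such as the dot in `.html` should be escaped.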

Third, the content has no value

For SEO, content is still king. If much of your website's content is scraped from other sites without any modification (posted with even the source page's text background color intact, mixed in with useless div tags), the formatting hurts readability. Users open the page and close it almost immediately, or give up within ten seconds. That user experience is terrible.
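A sketch of the cleanup the paragraph above implies: stripping the inline styles (such as carried-over background colors) and empty div wrappers that come along when content is pasted from another page. This uses naive regular expressions purely for illustration; real cleanup should go through a proper HTML parser.

```python
import re

def strip_pasted_formatting(html: str) -> str:
    """Remove inline style attributes and whitespace-only <div> wrappers
    from pasted HTML. A rough sketch, not a robust HTML cleaner."""
    # Drop inline style attributes (carried-over colors, fonts, etc.)
    html = re.sub(r'\sstyle="[^"]*"', "", html)
    # Collapse <div> wrappers that contain only whitespace
    html = re.sub(r"<div[^>]*>\s*</div>", "", html)
    return html

pasted = '<p style="background:#ffffff">copied text</p><div>  </div>'
print(strip_pasted_formatting(pasted))  # → <p>copied text</p>
```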

There is a saying in SEO circles: if users don't like your website, search engines won't like it either! Even if worthless content gets indexed, it will gradually be dropped later.

Fourth, the external links are of low quality

The role of external links has diminished. The obvious sign: in the past, simply accumulating external links, even low-quality ones, could push a site onto the first page of the rankings, but external links are clearly no longer as effective as they were. Still, it is undeniable that external links remain an important bridge for spiders to reach your content, and the higher their quality, the more spiders favor them.

If the overall weight of the platform hosting an external link falls, the search engine will lower its score for a link it once considered high quality and treat it as a link of only average quality. As the whole web ecosystem develops, the bar for indexing a page is bound to keep rising, and pages whose weight is not high enough will inevitably be eliminated.

The above are the author's four points of experience on declining website indexing. When building a website, weigh them against your own situation. Keep a calm mind and choose an optimization method that suits you; then, whether indexing drops or the site is even demoted in rank, you can stay unmoved even if Mount Tai collapses in front of you.

The above is the detailed content of why indexing volume decreases. For more information, please follow other related articles on the PHP Chinese website!

