Google’s Mueller on crawl rates for large and small websites

Google’s John Mueller asked the SEO community why they miss the URL submission tool. One person said that small websites are at a disadvantage because larger websites are crawled more often. Mueller responded with insights into how Google crawls websites.

Are large websites crawled more often?

It is plausible that popular, frequently linked websites are crawled more often. Links are central to the crawling process, since Googlebot discovers pages by following links from page to page.

It is therefore not unreasonable to assume that popular websites are crawled more often than less popular websites.
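As a rough illustration of that link-to-link walk, here is a minimal breadth-first crawler sketch in Python. It is not Googlebot’s actual implementation, which adds robots.txt handling, politeness delays, and scheduling at enormous scale; the seed URL is a placeholder.

```python
# Minimal sketch of link-to-link crawling: fetch a page,
# queue its links, repeat. Illustrative only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, max_urls=20):
    """Breadth-first walk, stopping after max_urls discovered URLs."""
    queue, seen = deque([seed]), {seed}
    while queue and len(seen) < max_urls:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip unreachable pages
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        print("crawled:", url)


crawl("https://example.com/")  # placeholder seed URL
```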

Here is John Mueller’s original question:

“I’ve seen people looking forward to the URL submission tool coming back. I don’t have any news, but I’d like to know more about what you’re missing without it.

Let me know which URLs you’re missing the tool for and how you’ve used it in general. Many thanks!”

And one publisher replied:

“@JohnMu,
You know that crawlers don’t visit small sites as often as large ones. We therefore rely on the URL submission tool to get key pages updated and indexed faster. If you remove access to the tool, you give preference to large sites and hurt small ones.”

John Mueller answered that crawling is independent of a website’s size.

“Crawling is independent of the size of a website. Some websites have a ton of (useless) URLs, and luckily we don’t crawl much from them. If you have an example from your website that is having problems, you can add it to the form.”

The publisher who replied noted that some publishing platforms don’t update their sitemaps automatically.

John Mueller suggested updating the platform so that the sitemap is regenerated automatically, the simpler solution.

Mueller’s answer:

“There are still websites that don’t use sitemaps? Seems like a much easier solution than manually submitting each new or updated URL …

… Sounds like something that needs to be fixed in this case :). Manual submissions are never scalable. Make sure it works automatically.

Creating a sitemap file automatically seems like a minimal baseline for any serious website, imo.”
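To illustrate Mueller’s point, here is a minimal sketch of automatic sitemap generation in Python. It is not tied to any particular platform; the URL list and output path are placeholders for whatever a CMS would supply on each publish.

```python
# Minimal sketch: write a sitemap.xml with one <url> entry per page.
from datetime import date
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def write_sitemap(urls, path="sitemap.xml"):
    """Write a sitemap file listing each URL with a lastmod date."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        # lastmod tells crawlers when the page last changed; a real
        # platform would use the page's actual edit date, not today.
        ET.SubElement(url, f"{{{NS}}}lastmod").text = date.today().isoformat()
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)


# Hypothetical page list; a CMS would regenerate this on every publish.
write_sitemap([
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/latest-post",
])
```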

Big site versus popular site

I think what was overlooked in the above exchange is what the publisher meant by a “big site” being crawled more often. John Mueller answered the question literally, in terms of the number of pages a site contains.

It is not unreasonable to assume that the publisher was actually referring to a more popular site having an advantage over a less popular site with fewer links.

Still, Mueller’s point about larger (and sometimes more popular) websites carrying loads of useless URLs is a fair one: smaller websites are easier and more efficient to crawl. And judging by crawler logs, Google visits smaller, less popular websites quite often.
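Publishers who want to check this on their own site can count Googlebot requests in a standard access log with a rough sketch along these lines. The log path and combined log format are assumptions; adjust them for your server.

```python
# Rough sketch: tally Googlebot requests per day from an access log.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # assumed location and format
# combined log format: host - - [10/Oct/2021:13:55:36 +0000] "GET /path HTTP/1.1" ...
DAY = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

hits_per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        # User-agent matching is a rough filter; a rigorous check also
        # verifies the crawler via reverse DNS, since the string can be spoofed.
        if "Googlebot" not in line:
            continue
        match = DAY.search(line)
        if match:
            hits_per_day[match.group(1)] += 1

for day, hits in sorted(hits_per_day.items()):
    print(f"{day}: {hits} Googlebot requests")
```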

If days or weeks go by without Google discovering some of your pages, a sitemap can help. Slow discovery could also indicate deeper problems with content quality or with the site’s links.
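Building on the log sketch above, a publisher could cross-check the sitemap against Googlebot’s requests to spot pages that have not been crawled yet. The file locations and log format are again assumptions.

```python
# Sketch: list sitemap URLs that Googlebot has not requested yet.
import re
from urllib.parse import urlparse
from xml.etree import ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
REQUEST_PATH = re.compile(r'"\w+ (\S+)')  # path from "GET /path HTTP/1.1"

# Paths of every URL listed in the sitemap (assumed local file).
sitemap_paths = {
    urlparse(loc.text).path
    for loc in ET.parse("sitemap.xml").iter(f"{NS}loc")
}

# Paths Googlebot actually requested, per the access log.
crawled_paths = set()
with open("/var/log/nginx/access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" in line:
            match = REQUEST_PATH.search(line)
            if match:
                crawled_paths.add(match.group(1))

for path in sorted(sitemap_paths - crawled_paths):
    print("not yet crawled:", path)
```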
