SEO Diagnosis: How to Read the GSC Crawl Stats Report

When it comes to SEO ROI, the first step is always discoverability. Simply put, if you aren’t found in search engines, people won’t come to your site. Traffic is step two; the step before that is ensuring your site is technically sound for crawling. To that end, Google, in its effort to be as transparent as it can, provides site owners with the GSC “Crawl Stats” report.

How to Access the GSC Crawl Stats Report

The crawl stats report in GSC lives under the “Legacy tools and reports” section.

As demonstrated, crawl stats belongs to the legacy reports and tools section.

Crawl stats and some of the other reports sit here because Google hasn’t yet figured out replacements for them in the new Search Console.

In any event, once you’re in the report, you’ll see something like below:

A snapshot of the three primary metrics available inside the GSC crawl stats report.

What Do the Crawl Stats Metrics Tell You?

The primary purpose of the crawl stats report is to show how regularly, and how well, Google is crawling your website. The report also surfaces averages for each metric (as seen in the image above); the predominant reason is that if site owners feel their site is getting too many requests, they can ask Google for a reduced crawl rate.

Beyond that, the metrics also help with certain diagnostic scenarios, which I’ll cover one by one.

Metric 1: Pages Crawled per Day

Perhaps the most critical of the three, pages crawled per day, as the name suggests, shows how many pages Google crawled each day over the past 90 days.
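If you want to sanity-check this metric against your own data, you can count Googlebot hits per day in your server’s access log. Below is a minimal sketch assuming common log format; the log lines, IPs, and user-agent check are illustrative examples (in production you’d also verify Googlebot by reverse DNS, not just by user agent):

```python
import re
from collections import Counter

# Captures the date portion of a common-log-format timestamp, e.g. 14/Mar/2021
LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(log_lines):
    """Return a Counter mapping date string -> number of Googlebot requests."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:  # crude UA check; verify IPs for real audits
            continue
        match = LOG_DATE.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

# Hypothetical sample log lines:
sample = [
    '66.249.66.1 - - [14/Mar/2021:06:25:24 +0000] "GET / HTTP/1.1" 200 5123 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [14/Mar/2021:06:26:02 +0000] "GET /blog HTTP/1.1" 200 7311 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [14/Mar/2021:06:27:11 +0000] "GET / HTTP/1.1" 200 5123 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample))  # Counter({'14/Mar/2021': 2})
```

Comparing these daily counts to the GSC graph is a quick way to confirm the report reflects what your server actually sees.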

Why Should You Pay Attention to It?

  1. This metric is an excellent indicator of consistency, and of how friendly your site is to Googlebot. Clean code, mobile-friendliness, and following development best practices make it easy for Googlebot to understand the pages on your site.
  2. For example, say you’ve made significant technical improvements to your website; over time, you should see slight increases in your crawl rate. If you don’t, it’s a sign that you need to investigate further. Perhaps another issue is preventing Google from fully deciphering your pages.
    • Two other common scenarios where you should expect the rate to go up are when you’ve mass-published a lot of new content, or opened up new pages/paths in robots.txt.
  3. Conversely, a sudden drop is a signal that something went wrong:
    • Your site may have been idle for a long stretch, hinting to Google that there isn’t a steady flow of fresh content.
    • Something detrimental occurred on the code side of things.
    • Or something more dramatic (while easily fixable) happened, like disallowing additional pages/paths in robots.txt.
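To make the robots.txt point concrete, here’s a hypothetical example (the paths are made up): opening up a previously blocked path tends to push pages crawled per day up, while adding a new Disallow pushes it down.

```text
User-agent: *
Disallow: /admin/
# Disallow: /guides/   <- this line used to block /guides/; removing it
#                         opens the path to crawling and, typically,
#                         raises pages crawled per day over time
```

If your graph dropped right after a deploy, diffing the live robots.txt against the previous version is one of the fastest checks you can do.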

Metric 2: Kilobytes Downloaded per Day

This metric shows how many kilobytes Googlebot downloaded per day from the pages on your site. Typically, a high number is considered healthy, as it can be construed as Google giving more credence to your site; however, that reading is a little fallacious unless you also consider how much time it takes to download a page.

You see, if this number is consistently high, but so is the time it takes to download a page, it’s not a good sign. If anything, it shows that Googlebot is struggling to fetch your landing pages or blog posts.
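One rough way to combine the two metrics is to derive a per-page size and a ms-per-KB cost from the three GSC averages. This is a sketch of that arithmetic; the input numbers are made-up illustrations, not real GSC data, and any threshold you apply to the result is your own judgment call:

```python
def crawl_efficiency(pages_per_day, kb_per_day, ms_per_page):
    """Derive average page size (KB) and a rough cost in ms per KB
    from the three averages shown in the GSC crawl stats report."""
    kb_per_page = kb_per_day / pages_per_day
    ms_per_kb = ms_per_page / kb_per_page  # time cost per kilobyte fetched
    return kb_per_page, ms_per_kb

# Hypothetical site: 120 pages/day, 4,800 KB/day, 400 ms per page.
kb_per_page, ms_per_kb = crawl_efficiency(120, 4800, 400)
print(kb_per_page)  # 40.0 -> average page is 40 KB
print(ms_per_kb)    # 10.0 -> Googlebot spends ~10 ms per KB fetched
```

If kilobytes per day climbs but ms-per-KB climbs with it, you’re in the “high volume, high struggle” scenario described above rather than a genuinely healthy trend.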

Metric 3: Time Spent Downloading a Page (in Milliseconds)

Despite what the name might suggest, this metric illustrates the time it takes for Googlebot to complete an HTTP request. Read that again, so you thoroughly understand that it does not measure site speed. The name is a misnomer: one might assume it covers general page speed, but it only measures how long the bot takes to complete the said HTTP request.

Ideally, you’re looking for a low number here, as it means Googlebot spends less of its time waiting on requests and more of it crawling and indexing.

A Few Other “Crawl” Related FAQs

As It Relates to Googlebot, How Do Crawl Rate, Crawl Budget, and Crawl Demand Differ From Each Other?

1. Crawl Rate is the frequency with which Googlebot sends requests to your site, per second, during a crawl; for example, 10 requests per second. Simply put, it’s the rate of crawling.
2. Crawl Budget is the number of URLs Googlebot can and wants to crawl on your site before it risks overloading your servers.
3. Crawl Demand is how much Googlebot wants to crawl your website. It depends on the industry and the nature of the site. As a hypothetical example (all else being equal), Google will probably have a higher crawl demand for a large government website reporting on Covid-19 than for my site.
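The “10 requests per second” idea in the crawl rate definition can be made concrete with a toy rate limiter. This is not how Googlebot is implemented, just a sketch of what a polite crawler capping itself at `max_rps` requests per second looks like; the URLs are placeholders and the fetch step is stubbed out:

```python
import time

def crawl(urls, max_rps=10):
    """Visit URLs while staying at or under max_rps requests per second."""
    interval = 1.0 / max_rps  # minimum gap between consecutive requests
    fetched = []
    last = 0.0
    for url in urls:
        now = time.monotonic()
        wait = interval - (now - last)
        if wait > 0:
            time.sleep(wait)  # throttle to respect the crawl rate
        last = time.monotonic()
        fetched.append(url)  # a real crawler would issue the HTTP GET here
    return fetched

urls = [f"https://example.com/page/{i}" for i in range(5)]
start = time.monotonic()
pages = crawl(urls, max_rps=10)
elapsed = time.monotonic() - start
print(len(pages))  # 5 pages, spread over at least ~0.4 seconds
```

Crawl budget is then roughly this rate sustained for however long Googlebot decides your server can handle, and crawl demand governs whether it bothers to sustain it at all.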

Can I Influence Any of These Metrics?

Not directly, but yes, you can improve your crawl rate by posting fresh, authoritative content, maintaining XML sitemaps, consistently making technical fixes to networking, rendering, and your server, and more. The numbers provided by Google are more of an FYI.
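On the XML sitemap point, a minimal valid sitemap follows the sitemaps.org protocol. The URL and date below are placeholders; keeping `lastmod` accurate is what gives Google a freshness signal worth crawling for:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-03-14</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/</loc>
    <lastmod>2021-03-12</lastmod>
  </url>
</urlset>
```

Submit the sitemap in GSC (or reference it in robots.txt with a `Sitemap:` line) so Googlebot discovers new and updated pages sooner.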

How Can I Limit Googlebot’s Crawl Rate?

I personally haven’t needed to act on this before, but Google gives site owners a chance to request or make tweaks to the crawl rate. The recommended advice is to let Google optimize it, although you can choose the second option, as shown in the attached image, to toggle your crawl rate.
That said, if Google describes your crawl rate as “optimal,” the only way to limit or change it is by filing a request.

Generally speaking, you do not have much control over how Googlebot crawls your site. What you can control, though, is how consistently you keep earning its attention.

In the bigger scheme of things, as it relates to GSC crawl stats, what you’re looking for across all three metrics is consistent graphs over the past 90 days. Any sudden, drastic, and unexpected fluctuation means something happened that you’ll need to dig into.

And as discussed, you can request a lower or a higher crawl rate. Do note that just because you ask Google to accelerate the crawl rate doesn’t mean it will blindly comply.