5 Top Crawl Stats Insights in Google Search Console

There is one report in Google Search Console that's both insanely helpful and fairly hard to find, especially if you're just starting your SEO journey.

It's one of the most powerful tools for every SEO professional, even though you can't even access it from within Google Search Console's main interface.

I'm talking about the Crawl stats report.

In this article, you'll learn why this report is so important, how to access it, and how to use it to your SEO advantage.

How Is Your Website Crawled?

Crawl budget (the number of pages Googlebot can and wants to crawl) is crucial for SEO, especially for large websites.

If you have issues with your website's crawl budget, Google may not index some of your valuable pages.



And as the saying goes, if Google didn't index something, then it doesn't exist.

Google Search Console can show you how many pages on your site are visited by Googlebot every day.

Armed with this data, you can find anomalies that may be causing your SEO issues.

Diving Into Your Crawl Stats: 5 Key Insights

To access your Crawl stats report, log in to your Google Search Console account and navigate to Settings > Crawl stats.

Here are all the data dimensions you can examine inside the Crawl stats report:

1. Host

Imagine you have an ecommerce shop on shop.website.com and a blog on blog.website.com.

Using the Crawl stats report, you can easily see the crawl stats related to each subdomain of your website.

Unfortunately, this method doesn't currently work with subfolders.

2. HTTP Status

Another use case for the Crawl stats report is looking at the status codes of crawled URLs.

That's because you don't want Googlebot to spend resources crawling pages that aren't HTTP 200 OK. It's a waste of your crawl budget.



To see the breakdown of the crawled URLs per status code, go to Settings > Crawl Stats > Crawl requests breakdown.

Google Search Console's Crawl stats report showing a breakdown of crawled URLs per HTTP response type.

In this particular case, 16% of all requests were made for redirected pages.

If you see statistics like these, I recommend investigating further and looking for redirect hops and other potential issues.

In my opinion, one of the worst cases you can see here is a large number of 5xx errors.

To quote Google's documentation: "If the site slows down or responds with server errors, the limit goes down and Googlebot crawls less."

If you're interested in this topic, Roger Montti wrote a detailed article on 5xx errors in Google Search Console.

3. Purpose

The Crawl stats report breaks down the crawling purpose into two categories:

  • URLs crawled for Refresh purposes (a recrawl of already known pages, e.g., Googlebot revisiting your homepage to discover new links and content).
  • URLs crawled for Discovery purposes (URLs that were crawled for the first time).

This breakdown is insanely useful, and here's an example:

I recently encountered a website with ~1 million pages labeled as "Discovered – currently not indexed."

This issue was reported for 90% of all the pages on that site.

(If you're not familiar with it, "Discovered – currently not indexed" means that Google discovered a given page but didn't visit it. It's as if you discovered a new restaurant in your town but never gave it a try.)



One of the options was to wait, hoping for Google to index these pages gradually.

Another option was to look at the data and diagnose the issue.

So I logged in to Google Search Console and navigated to Settings > Crawl Stats > Crawl Requests: HTML.

It turned out that, on average, Google was visiting only 7,460 pages on that website per day.

A chart showing an ecommerce website's crawl statistics.

But here's something even more important.



Thanks to the Crawl stats report, I found that only 35% of those 7,460 URLs were crawled for discovery reasons.

That's just 2,611 new pages discovered by Google per day.

2,611 out of over a million.

It would take 383 days for Google to fully index the whole website at that pace.
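The back-of-the-envelope math behind that estimate is simple enough to sketch:

```python
# Crawl-pace estimate from the example above.
pages_total = 1_000_000      # pages reported as "Discovered – currently not indexed"
crawled_per_day = 7_460      # average daily HTML crawl requests (from the Crawl stats report)
discovery_share = 0.35       # share of requests crawled for Discovery purposes

new_pages_per_day = int(crawled_per_day * discovery_share)  # 2611
days_to_index = -(-pages_total // new_pages_per_day)        # ceiling division

print(new_pages_per_day, days_to_index)  # → 2611 383
```

Roughly a year of waiting, which is why diagnosing the crawl budget beat waiting it out.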

Finding this out was a gamechanger. All other search optimizations were postponed as we fully focused on crawl budget optimization.



4. File Type

GSC Crawl stats can be helpful for JavaScript websites. You can easily check how frequently Googlebot crawls the JS files that are required for proper rendering.

If your site is full of images and image search is crucial for your SEO strategy, this report will help a lot as well – you can see how well Googlebot can crawl your images.

5. Googlebot Type

Finally, the Crawl stats report gives you a detailed breakdown of the Googlebot type used to crawl your site.

You can find out the percentage of requests made by either Mobile or Desktop Googlebot, as well as the Image, Video, and Ads bots.
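If you want to reproduce a similar breakdown from your own server logs, a rough classification by user-agent substring works as a first pass. This is a simplification, not Google's official detection method – Google recommends verifying crawlers via reverse DNS, since user-agent strings can be spoofed:

```python
def classify_googlebot(user_agent):
    """Roughly classify a Googlebot user-agent string by crawler type.

    Substring checks only; verify via reverse DNS before trusting the result.
    """
    if "Googlebot-Image" in user_agent:
        return "Image"
    if "Googlebot-Video" in user_agent:
        return "Video"
    if "AdsBot-Google" in user_agent:
        return "Ads"
    if "Googlebot" in user_agent:
        # The smartphone crawler identifies itself with an Android device string.
        return "Mobile" if "Android" in user_agent else "Desktop"
    return "Other"

mobile_ua = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 "
             "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
print(classify_googlebot(mobile_ua))  # → Mobile
```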

Other Useful Information

It's worth noting that the Crawl stats report has invaluable information that you won't find in your server logs:

  1. DNS errors.
  2. Page timeouts.
  3. Host issues such as problems fetching the robots.txt file.

Using Crawl Stats in the URL Inspection Tool

You can also access some granular crawl data outside of the Crawl stats report, in the URL Inspection Tool.



I recently worked with a large ecommerce website and, after some initial analyses, noticed two pressing issues:

  1. Many product pages weren't indexed in Google.
  2. There was no internal linking between products. The only way for Google to discover new content was through sitemaps and paginated category pages.

A natural next step was to access server logs and check if Google had crawled the paginated category pages.

But getting access to server logs is often really difficult, especially when you're working with a large organization.

Google Search Console's Crawl stats report came to the rescue.

Let me guide you through the process I used and that you can use if you're struggling with a similar issue:

1. First, look up a URL in the URL Inspection Tool. I chose one of the paginated pages from one of the main categories of the site.

2. Then, navigate to the Coverage > Crawl report.

Google Search Console's URL Inspection Tool allows you to look up a given URL's last crawled date.

In this case, the URL was last crawled three months ago.
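Checking last-crawl dates one URL at a time gets tedious when you want to sample many category pages. The Search Console API's `urlInspection.index.inspect` method returns the same data programmatically; its response carries the last crawl timestamp under `inspectionResult.indexStatusResult.lastCrawlTime`. Here's a small helper for turning that into an age in days – the response dict below is fabricated for illustration, and in practice it would come from an authenticated API client:

```python
from datetime import datetime, timezone

def last_crawl_age_days(inspect_response, now=None):
    """Given a urlInspection.index.inspect response dict, return how many
    days ago the URL was last crawled, or None if it has no crawl record."""
    result = inspect_response.get("inspectionResult", {}).get("indexStatusResult", {})
    last_crawl = result.get("lastCrawlTime")  # RFC 3339, e.g. "2021-01-05T12:00:00Z"
    if not last_crawl:
        return None
    crawled = datetime.fromisoformat(last_crawl.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return (now - crawled).days

# Hypothetical response for a category page last crawled ~3 months earlier:
response = {"inspectionResult": {"indexStatusResult": {"lastCrawlTime": "2021-01-05T12:00:00Z"}}}
print(last_crawl_age_days(response, now=datetime(2021, 4, 5, tzinfo=timezone.utc)))  # → 89
```

Run over a sample of category URLs, this surfaces stale or never-crawled pages in one pass instead of dozens of manual lookups.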



Keep in mind that this was one of the main category pages of the website, and it hadn't been crawled for over three months!

I went deeper and checked a sample of other category pages.

It turned out that Googlebot had never visited many main category pages. Many of them are still unknown to Google.

I don't think I need to explain how crucial it is to have that information when you're working on improving any website's visibility.

The Crawl stats report allows you to look things like this up within minutes.

Wrapping Up

As you can see, the Crawl stats report is a powerful SEO tool, even though you could use Google Search Console for years without ever discovering it.

It will help you diagnose indexing issues and optimize your crawl budget so that Google can find and index your valuable content quickly, which is particularly important for large sites.

I gave you a few use cases to consider, but now the ball is in your court.



How will you use this data to improve your site's visibility?


Image Credits

All screenshots taken by author, April 2021
