Encouraging Google to love visiting our websites

How often does Google’s crawler come round and take a look at your site? How deep into the site does it go? These are quite important issues in SEO, and earlier this year Google published an interesting blog post on the subject which we can learn from.

It seems there’s something called ‘crawl budget’: the number of pages on your site that Googlebot can and wants to crawl. It’s a combination of two things: crawl rate and crawl demand.

Crawl rate is the speed at which Google can fetch pages from your site. It appears to be based on what the crawler has managed in the past, which in turn depends on your server’s response speed and the number of errors it returns. It can be manually limited in Search Console, but you wouldn’t normally want to do that.
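To get a feel for the response speed Googlebot experiences, you can time a handful of fetches yourself. Here’s a minimal sketch in Python; the example.com URL is a placeholder for your own site:

```python
import time
import urllib.request

URL = "https://www.example.com/"  # placeholder: swap in your own homepage

timings = []
for _ in range(5):
    start = time.monotonic()
    with urllib.request.urlopen(URL) as response:
        response.read()  # download the full page, as a crawler would
    timings.append(time.monotonic() - start)

print(f"average fetch time over {len(timings)} requests: "
      f"{sum(timings) / len(timings):.2f}s")
```

If the average is creeping up over time, that’s the sort of thing which will quietly throttle your crawl rate.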

Crawl demand is a function of how popular your pages are, and how often they tend to change. In the sorts of markets where we operate, it’s unlikely those are going to be very impressive, so don’t expect Googlebot to be visiting all day, every day. You can see what it’s doing in Search Console, by the way.

So what can we do to improve our ‘crawl budget’? The number one priority is making pages quick to load; if that’s been a theme this year, I make no apologies. Pages need to load in under two seconds. Also ensure you keep on top of crawl errors: again, Search Console is your friend. Finally, make sure you don’t have any ‘infinite spaces’: pages which generate endless new URLs, as some calendar add-ons can do with their never-ending ‘next month’ links (see the sketch below).
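If you do find a plugin generating endless URLs, one fix is to block that path in robots.txt so Googlebot doesn’t waste its budget there. A sketch, assuming the offending pages live under a hypothetical /calendar/ path:

```
# robots.txt — keep crawlers out of the infinite calendar space
User-agent: *
Disallow: /calendar/
```

That way the crawl budget gets spent on the pages you actually want indexed.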