UPDATE July 2015: This is a never-ending topic. For the latest ideas, we recommend you read this article on the Moz blog instead.
Your website is being hit all day long by automated services, including the “spiders” or “crawlers” from the search engines which are hoovering up everything you publish and constantly checking for changes. If you try to analyse your website traffic by looking at the log file of all visits, the real human visitors can be hard to spot.
Fortunately, services such as Google Analytics ignore most of this automated traffic by the way they record visits. There are exceptions, however, and I’ve written here in the past about how to filter out the non-human traffic which does get through. A recent example which many of you will have seen is the regular visits from a site called “semalt.com”. You don’t want this in your traffic figures.
Now Google Analytics has just announced a more general setting which will automatically filter out any visits on the IAB/ABC International Spiders & Bots List which might otherwise have got through because they execute the page code when visiting. I assumed this would include all unwelcome visitors such as semalt.com, so it should save us a lot of manual filter creation in the future. (Update: no, it doesn’t).
How to set up Bot Filtering in Google Analytics
You’ll need to do this on every “view” which you have in Google Analytics. Best practice is always to have a view called “raw data” which you never modify, plus additional views for everyday use with any filters you need. So if you do have a “raw data” view, don’t touch that one. For the others, go to “Admin”, select each “View” in turn, and under “View Settings”, tick the “Exclude all hits from known bots and spiders” box, as shown. Click “Save” and repeat for each of your other views.
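If you manage a lot of views, ticking the box one view at a time gets tedious. As a sketch only: the same checkbox is exposed as the `botFilteringEnabled` field on the view (profile) resource in the Google Analytics Management API, so something like the following could apply it across a property. This assumes you already have an authorised `service` object from the `google-api-python-client` library, and the "raw data" skip rule is just the naming convention described above, not anything the API knows about.

```python
def bot_filter_patch():
    """Request body that ticks 'Exclude all hits from known bots and spiders'."""
    return {"botFilteringEnabled": True}

def enable_bot_filtering(service, account_id, property_id, skip_names=("raw data",)):
    """Enable bot filtering on every view of a property, except any view whose
    name matches the (hypothetical) 'raw data' convention, which stays unmodified."""
    profiles = service.management().profiles().list(
        accountId=account_id, webPropertyId=property_id
    ).execute()
    for profile in profiles.get("items", []):
        if profile["name"].lower() in skip_names:
            continue  # never filter the raw-data view
        service.management().profiles().patch(
            accountId=account_id,
            webPropertyId=property_id,
            profileId=profile["id"],
            body=bot_filter_patch(),
        ).execute()
```

You would still want to check one view in the web interface afterwards to confirm the box is ticked.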
BMON Google AdWords management clients can just drop us an email and we’ll do all this for you – no charge, of course.