Having looked at the sources of their website visitors, and compared the numbers to what third-parties were claiming, one of our clients recently decided to get to the bottom of this discrepancy. One website where the company advertised had reported hundreds of clickthroughs in the past year, yet the client’s Google Analytics was reporting just a few dozen. What was going on? Interestingly, the third-party website was quite prepared to help find out. They were confident in their own data, and, well, let’s face it – not all companies are that great at using Google Analytics.
Here’s what they did (with a little help from us). The third-party website set up an advert for our client on a very obscure page …so obscure that nobody else was likely to click on it during the test day. People at our client’s company then clicked on their own advert about a dozen times each, from three different places. What do you think happened?
The third-party website reported 39 “clickthroughs”. Google Analytics reported just 3. Hat-tip to Google Analytics then, for realising that each visitor’s string of repeated clicks was in fact only one real visit.
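To see roughly how an analytics tool can collapse repeated clicks into single visits, here’s a minimal sketch. The visitor IDs, the 30-minute session window, and all the function names are illustrative assumptions of ours – this is not how Google Analytics actually works internally, just the general idea.

```python
# Hypothetical sketch: collapse repeated clicks from the same visitor
# into single visits. The 30-minute session window is an assumed value,
# not Google Analytics's real logic.
from datetime import datetime, timedelta

SESSION_WINDOW = timedelta(minutes=30)  # assumed session length

def count_visits(clicks):
    """clicks: list of (visitor_id, timestamp) tuples, in any order."""
    last_seen = {}
    visits = 0
    for visitor, ts in sorted(clicks, key=lambda c: c[1]):
        prev = last_seen.get(visitor)
        if prev is None or ts - prev > SESSION_WINDOW:
            visits += 1  # first click, or a click after a long gap: new visit
        last_seen[visitor] = ts
    return visits

# Three visitors, each clicking 13 times within a minute or so:
base = datetime(2024, 1, 1, 9, 0)
clicks = [(v, base + timedelta(seconds=i))
          for v in ("a", "b", "c") for i in range(13)]
print(len(clicks), count_visits(clicks))  # 39 clicks, but only 3 visits
```

A naive counter would report all 39 clicks; grouping by visitor and time reports 3 visits – exactly the gap our test exposed.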
So we realised that there was a problem with the third-party website’s reporting. But in real life, people don’t click on an advert a dozen times in quick succession. So we tried another experiment.
The next day, we didn’t click on the advert at all. But we did send a dozen “robots” around to “hit” the advert. Free online link checkers, site crawlers, that sort of thing. They’re readily available to use.
The third-party website dutifully reported all of them as a clickthrough. Google Analytics reported zero visits. And that’s because Google Analytics can tell the difference between human visitors to your website and “robots”.
These “robots” are all over the web. Google’s “Googlebot” probably takes a look at your website every day. And it’s one of many, many such artificial agents permanently “crawling” the web. It’s how the search engines, amongst other services, keep up with what you’re doing.
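One crude way to spot many of these “robots” is that they usually announce themselves in the User-Agent header sent with every request. The marker words and sample headers below are our own illustrative assumptions – real analytics software uses far more signals than this (JavaScript execution, known bot lists, and so on) – but it shows the principle.

```python
# Hypothetical sketch: a naive first pass at separating "robot" hits
# from human ones using the User-Agent header. The marker list is an
# illustrative assumption, not an exhaustive one.
BOT_MARKERS = ("bot", "crawler", "spider", "checklink")

def is_robot(user_agent):
    """True if the User-Agent string looks like an automated agent."""
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

hits = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "W3C-checklink/4.81 libwww-perl/5.836",
]
human_hits = [ua for ua in hits if not is_robot(ua)]
print(len(human_hits))  # only the first, human-looking hit survives
```

A counter built without a check like this – or something far more sophisticated – will happily record every crawler and link checker as a “clickthrough”.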
There, then, was the answer to the discrepancy. Over the course of the year, every time a “robot” came round and inspected the advert, the third-party website counted it as a clickthrough, even though it shouldn’t have been counted at all. Only real visits from real people should be counted – which is exactly what proper web analytics software does.
However – and perhaps surprisingly – I don’t want to suggest that in our client’s case, the third-party website was being deceptive. I doubt they’d have agreed to the client’s experiment if they were. What I’m almost certain happened is that a long time ago, some IT people were asked to set up a system which counted how many times a link was followed to an advertiser’s website. They weren’t told that the system had to be sophisticated enough to tell humans and “robots” apart. They just produced a system which did as specified – and the specification was probably written by someone who’d never even heard of a “robot”, let alone considered that it might be a problem.
Today, salespeople at the third-party website, whose knowledge of advertising is better than their knowledge of IT, are given these clickthrough statistics as accurate counts of visitors clicking on the ads. When they say “your advert got 500 clicks”, they probably genuinely believe that their website sent you 500 visitors.
You, however, should know better.