Some of you might have noticed a horrible creeping problem in Google Analytics: traffic from junk sources such as web bots and spiders. Remove this non-human traffic with a new filtering option from Google.
If you’ve ever seen traffic in your Google Analytics from a source called SEMALT, it’s total junk: some kind of semi-legitimate automated program that spiders your website looking for SEO information (to sell back to you? Or to your competitors?). For low-traffic sites or niche pages it can really throw your stats off (and there are more where that came from).
Google has countered this by adding a setting in Google Analytics: under Bot Filtering, tick the box to “Exclude all hits from known bots and spiders”. From that point forward (NOT retrospectively), Google Analytics will ignore all such traffic.
How does it know what to ignore? It uses an independently maintained list of known bots and spiders identified by the global community, so it should stay up to date as new ones appear.
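Conceptually, that kind of filtering is just matching each hit against a blocklist. Here is a minimal illustrative sketch in Python, not Google’s actual implementation: the signatures and hit fields are made-up examples standing in for the real community-maintained list.

```python
# Illustrative sketch only: filter analytics hits whose user agent or
# referrer matches a community-style bot/spider blocklist.
# These signatures are examples, not the real list Google uses.
KNOWN_BOT_SIGNATURES = [
    "semalt",      # SEO crawler commonly seen as referral spam
    "ahrefsbot",
    "bingbot",
]

def is_bot_hit(user_agent: str, referrer: str = "") -> bool:
    """Return True if the hit looks like known bot/spider traffic."""
    haystack = (user_agent + " " + referrer).lower()
    return any(sig in haystack for sig in KNOWN_BOT_SIGNATURES)

# Hypothetical hits, as a site's raw analytics log might record them.
hits = [
    {"ua": "Mozilla/5.0 (Windows NT 10.0)", "ref": "https://semalt.com/crawler"},
    {"ua": "Mozilla/5.0 (Macintosh)", "ref": "https://example.org/page"},
    {"ua": "AhrefsBot/7.0 (+http://ahrefs.com/robot/)", "ref": ""},
]

human_hits = [h for h in hits if not is_bot_hit(h["ua"], h["ref"])]
print(len(human_hits))  # only the middle, human-looking hit survives
```

The point is the same as Google’s checkbox: once the list exists, excluding the junk is cheap, and keeping the list current is the hard part that the community handles.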
Why didn’t they turn it on automatically? Because it could make a big change to some people’s traffic reports (and in SEO the relative month-on-month numbers often matter more than the absolute numbers).
So, armed with the understanding that you might see a traffic drop – but in future see only real human traffic – head into Bot Filtering in your Google Analytics settings and tell spam bots and spiders where to go…