Google understands that bot visits can muddy content analytics reports, so it has updated its reporting options to filter them out of results.

One of the biggest challenges in SEO and content marketing is measurement. Even marketers fluent in Google Analytics depend on accurate data to glean useful insights. Google seems to recognize the demand for cleaner data: in a Google+ post, Matthew Anderson announced that it will soon be possible to get a clearer picture with a new function that removes visits from spiders and bots, leaving only the visits from human searchers.

Some sites, such as Nestle, have already been given access to this functionality and found it helps them gain better insights into their websites' performance, Anderson reported.

The feature can be enabled in the Admin section of Google Analytics, under the Reporting view settings, by checking the box for "Bot Filtering." It doesn't work retroactively, but it ensures future traffic numbers don't include visits from search crawlers.
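For teams that manage many reporting views, the same setting can also be toggled programmatically. The snippet below is a minimal sketch, assuming the Google Analytics Management API (v3), where a view's `botFilteringEnabled` property corresponds to this checkbox; the account, property, and view IDs are placeholders, and the `service` object would come from an authorized `google-api-python-client` session.

```python
# Minimal sketch: toggling bot filtering via the Google Analytics
# Management API (v3). Assumes the google-api-python-client library and
# valid OAuth credentials; all IDs are placeholders.

def build_bot_filter_patch(enabled=True):
    """Build the request body that flips a view's Bot Filtering setting."""
    return {"botFilteringEnabled": enabled}

def enable_bot_filtering(service, account_id, property_id, view_id):
    """Patch a single view so future hits from known bots are excluded.

    `service` is an authorized Management API client, e.g. from
    googleapiclient.discovery.build("analytics", "v3", credentials=creds).
    """
    return service.management().profiles().patch(
        accountId=account_id,
        webPropertyId=property_id,
        profileId=view_id,
        body=build_bot_filter_patch(True),
    ).execute()

# The patch body itself is just a one-key dictionary:
print(build_bot_filter_patch(True))  # {'botFilteringEnabled': True}
```

As with the checkbox in the UI, a patch like this only affects data collected after the change; historical reports keep whatever bot traffic they already recorded.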

While it's a step in the right direction, bot filtering doesn't resolve every concern about Google's traffic counts. Brafton recently covered a Groupon experiment that found Google was misattributing up to 60 percent of the company's search traffic as direct visits. Misattribution on that scale can throw a marketing campaign off track and make it appear that SEO efforts aren't working.

It's important for marketers to look at the big picture when analyzing content analytics data: consider the results from different angles and pull from multiple sources to move forward in the right direction.

Lauren Kaye is a Marketing Editor at Brafton Inc. She studied creative and technical writing at Virginia Tech before pursuing the digital frontier and finding content marketing was the best place to put her passions to work. Lauren also writes creative short fiction, hikes in New England and appreciates a good book recommendation.