You may be basing your marketing decisions on bad data.
Have you seen a rise in traffic to your blog or landing pages?
Don’t get excited — yet.
These numbers might represent quality traffic, but they could also indicate a spam or ghost referral problem. Spammy web crawlers and ghost referral traffic are two culprits behind skewed data, and they are a growing source of inaccurate numbers in Google Analytics reports.
If a web crawler indexing your site fails to identify itself to the web server, it ends up counted as a visit. The numbers that should represent real visitors to your website get lumped in with traffic from scrapers, bots, and other web spiders.
Ghost referral traffic also mucks up your numbers, but without any actual visit to your site. Instead, spammers use scripts to send fake HTTP requests straight to Google's collection servers, and each one shows up in your Google Analytics reports as a visit to a page on your website.
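To make the mechanism concrete, here is a minimal sketch of what such a fake hit looks like, shaped like a Universal Analytics Measurement Protocol payload. The tracking ID, client ID, page, and referrer below are all placeholders, and nothing is actually sent:

```python
from urllib.parse import urlencode

# Sketch of a Universal Analytics Measurement Protocol payload.
# A spammer only needs a property's tracking ID (often guessed or
# enumerated in bulk) -- no request to the real website ever occurs.
# "UA-XXXXXX-1" and the referrer are placeholders.
hit = {
    "v": "1",                          # protocol version
    "tid": "UA-XXXXXX-1",              # target tracking ID (placeholder)
    "cid": "555",                      # arbitrary client ID
    "t": "pageview",                   # hit type
    "dp": "/fake-page",                # page that "was visited"
    "dr": "http://spam-site.example",  # fake referrer to advertise
}
payload = urlencode(hit)
print(payload)
# POSTing a payload like this to Google's /collect endpoint registers
# a pageview in the target's reports without touching their server.
```

This is why ghost referrals leave no trace in your own server logs: the only traffic involved flows between the spammer and Google.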
You can identify this type of problem by taking a closer look at the user sessions in your Google Analytics reports.

The sessions above display a 100% bounce rate and a 0-second duration. If your reports contain similar results, it may be a good idea to put a filter in place to remove these “fake” sessions.
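If you export your referral data, spotting these telltale sessions can be automated. The sketch below assumes a simple list of per-source rows with hypothetical field names (`source`, `bounce_rate`, `avg_duration`); adapt them to whatever your own export actually contains:

```python
# Sketch: flag suspicious referral sources in exported Google Analytics
# data. The field names below are assumptions -- match them to the
# columns in your own report export.
def flag_suspicious(sessions):
    """Return sources whose sessions bounce 100% with zero duration."""
    return [
        s["source"]
        for s in sessions
        if s["bounce_rate"] == 100 and s["avg_duration"] == 0
    ]

sessions = [
    {"source": "google", "bounce_rate": 42, "avg_duration": 95},
    {"source": "spam-site.example", "bounce_rate": 100, "avg_duration": 0},
]
print(flag_suspicious(sessions))  # ['spam-site.example']
```

Sources flagged this way are candidates for a referral exclusion filter; review them by hand first, since a legitimate source can occasionally show the same pattern on a small sample.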
Last summer, Google announced a new bot and spider filter to help clean up your reports.
While this filter only removes traffic from a list of identified bots, it can definitely help you distinguish web crawler traffic from human visitors. To activate this filter:

1. Sign in to Google Analytics and open the Admin panel.
2. In the View column, select the view you want to clean up and click View Settings.
3. Under Bot Filtering, check “Exclude all hits from known bots and spiders.”
4. Click Save.
After you activate this setting, you will probably notice a drop in traffic in your Google Analytics reports. While that drop can be discouraging, your numbers will be more accurate, and accurate data drives better marketing decisions.