Today I received a newsletter from Google Analytics; the first edition of the first volume read as follows:
This month, we are replacing the standard “benchmarking” report in your Google Analytics account with data shared in this newsletter. We are using this newsletter as an experiment to surface more useful or interesting data to Analytics users. Data contained here comes from all websites which have opted-in anonymous data sharing with Google Analytics. Only those website administrators which have enabled this anonymous data sharing will receive this “benchmarking” newsletter.
The first edition discussed the benchmarks by country, including Bounce Rate, Time On Site, and Goal Conversion.
There’s a huge danger in measuring your site’s performance against these benchmarks. In fact, I would argue that these aren’t benchmarks at all. Every site is different in structure and content. Every breakdown of traffic sources is different… from search to referral. Load time by country is different… unless you’re utilizing a service to cache your resources geographically. And these questions don’t even touch on language…
Are the benchmarks for countries only including visits and pageviews for sites within the country with a common language? Or are these sites being translated (which could either take longer or be translated so poorly it increases bounces)? Are the sites ecommerce sites? Blogs? Social sites? Static web pages?
Another problem exists as well. Tools such as Facebook’s Social Plugin impact bounce rates significantly because Facebook redirects site users. When a visitor lands on your site and uses the plugin before engaging in any other activity, they count as a bounce. Here’s an example from one of my clients… you can see where they installed, uninstalled, and then reinstalled the Facebook Social Plugin on their site:
My advice to clients is simply to benchmark your site against your own site… no one else’s. Is your bounce rate increasing or decreasing? Are your visitors up or down? Are the number of pageviews per visit up or down? How have you changed your design or content to impact your visitors’ experience? We notice increases in the time visitors stay on site when we embed a video… makes sense, right? But if we don’t embed a similar video each week we really can’t assume we’re doing a poor job.
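To make the “benchmark against your own site” idea concrete, here is a minimal sketch of tracking week-over-week changes in bounce rate and pages per visit. The weekly totals below are hypothetical numbers for illustration, not real analytics data:

```python
# Sketch: compare this week's metrics against last week's for the SAME site,
# rather than against external "benchmark" averages.

def site_metrics(visits, bounces, pageviews):
    """Return (bounce_rate, pages_per_visit) from one week of totals."""
    return bounces / visits, pageviews / visits

def week_over_week(prev, curr):
    """Percent change from the previous week's value to the current one."""
    return (curr - prev) / prev * 100

# Hypothetical weekly totals: (visits, bounces, pageviews)
last_week = (1000, 600, 2500)
this_week = (1100, 605, 3080)

prev_bounce, prev_ppv = site_metrics(*last_week)
curr_bounce, curr_ppv = site_metrics(*this_week)

print(f"Bounce rate: {prev_bounce:.1%} -> {curr_bounce:.1%} "
      f"({week_over_week(prev_bounce, curr_bounce):+.1f}%)")
print(f"Pages/visit: {prev_ppv:.2f} -> {curr_ppv:.2f} "
      f"({week_over_week(prev_ppv, curr_ppv):+.1f}%)")
```

The point of the sketch is that every number is compared against the same site’s own history, so a change can be tied directly to something you did that week (a redesign, an embedded video, a newsletter send).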
Two examples on this blog:
- We modified our blog design to show excerpts on our home page. As a result, bounce rate decreased since people clicked through to the post AND pages per visit increased substantially. If I simply showed you the stats without explaining that, it would leave you wondering. Or if you benchmarked us against other sites, we might look better or worse than their results.
- We launched our newsletter. We’ve been adding subscribers consistently since adding the newsletter and these visitors are returning as they read it. As a result, on days the newsletter is delivered, our number of pageviews is much higher – and our weekly average has increased close to 20%. If we’re benchmarking ourselves against other sites, do they have a newsletter? Do they publish excerpts? Do they aggregate their content socially?
Simply put, in my opinion, benchmarks don’t provide any meaningful data for me to improve my site. I also haven’t been able to make use of benchmarks with my clients’ sites. The only benchmarks that matter are the ones we record for our own site as each week passes. Unless Google can provide clearer segmentation within their benchmarks to compare sites accurately, the information is useless. Providing this information to leaders within an organization could really do some damage… I wish Google would simply abandon this product feature.