Checking Out Google Trends For Websites
I need to start this post with an important disclosure. I was on the Board of ComScore for nine years from the summer of 1999 until earlier this month. And I am still a shareholder in ComScore through the Flatiron partnerships. In no way is this post intended as a plug for ComScore. And I cannot and will not disclose any confidential information about ComScore in this post.
Now that I have that on the table, I’d like to share my impressions after a first look at Google’s new Trends For Websites service. I’ve been involved in the web measurement market for almost ten years and we use third party measurement services every day in our job as investors in Internet companies. I’ve blogged about our approach (USV’s) to web measurement in the past.
In my opinion, there is no service that gets third party web traffic measurement exactly right. We like to triangulate at Union Square Ventures. We use ComScore actively and believe their data is the most accurate of all the services. But it is not perfect and it is particularly weak at measuring web sites with small (sub 500k uniques) audiences. We also use Compete, Alexa, and Quantcast on almost a daily basis. And we will most certainly start using Google’s Trends For Websites.
There are a lot of misconceptions about third party measurement services out there. They are on display in the comments on this Techcrunch post. People say things like "ComScore’s data is not worldwide" or "Compete doesn’t have good statistical algorithms." Neither statement is true. ComScore, Compete, Hitwise, and NetRatings have been around for a long time and have invested heavily in the statistical talent that is required to get a third party measurement service right. And certainly a bunch of the leading services, particularly ComScore and NetRatings, have built large international panels and report international data.
Google’s service reminds me more of Alexa than anything else. And I don’t think it’s particularly accurate. Here’s an example. We run monthly queries of all the third party services against our entire portfolio and share that data internally in our firm. I am very familiar with the numbers on our portfolio companies, both what their own analytics and server logs are telling them and what the third party services tell them.
Here are three charts of Twitter, Etsy, and Indeed. I picked these three services because they have large enough audiences to be fairly easy for third parties to measure, they have been around a while, and they are well known.
This is what ComScore says the worldwide unique visitor trends have been over the past year:
This is what Compete says:
And this is what Google Trends says:
Since we are investors in these three companies and know what their internal numbers are saying, I can safely say that ComScore and Compete run low but are directionally correct. Google is like Alexa in that it doesn’t report absolute numbers, but even if it did, it is not directionally correct on this particular set of companies.
Once again, this story demonstrates that many in the tech blog world don’t really understand the third party measurement market. Google announces that they are offering a website traffic measurement service and everyone instantly assumes it is going to be great. I don’t think it is, at least not yet.
Mike Arrington points out in this post that, since "all the data being gathered by Google for the product is from Google users (their toolbar, for example), the data for Google’s sites would be skewed to 100% of all Internet users. It points out an inherent flaw in the product, and I’m not sure Google can easily solve it."
This is an important fact. Everyone who provides third party measurement starts out with a data set that is skewed in some way. The trick is to understand how your data set is skewed and apply statistics to take that bias out. Firms like ComScore, Compete, Hitwise, and NetRatings who sell their data have invested heavily for many years in reconciling their data to server logs and internal analytics. And that makes their data better.
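As a toy illustration of that kind of statistical correction, a panel skewed toward one demographic can be reweighted against known population shares (post-stratification weighting). The segment names and every number below are invented purely for illustration; real services calibrate against much richer data, including server logs.

```python
# Hypothetical sketch of panel bias correction via post-stratification
# weighting. All segments and numbers are invented for illustration.

# Share of each demographic segment in the measurement panel vs. the
# real online population (e.g. from an enumeration study).
panel_share = {"age_18_34": 0.55, "age_35_54": 0.30, "age_55_plus": 0.15}
population_share = {"age_18_34": 0.35, "age_35_54": 0.40, "age_55_plus": 0.25}

# Raw unique visitors to a site observed within each panel segment.
observed_uniques = {"age_18_34": 110_000, "age_35_54": 45_000, "age_55_plus": 15_000}

def reweighted_total(observed, panel, population):
    """Weight each segment's count by how under- or over-represented
    that segment is in the panel relative to the population."""
    total = 0.0
    for segment, count in observed.items():
        weight = population[segment] / panel[segment]
        total += count * weight
    return total

raw_total = sum(observed_uniques.values())
adjusted = reweighted_total(observed_uniques, panel_share, population_share)
print(f"raw: {raw_total:,.0f}  adjusted: {adjusted:,.0f}")
```

Here the young-skewing panel over-counts the site's audience; the reweighting pulls the estimate down. This is only the first step of what the commercial services do, but it shows why knowing your panel's skew matters more than the panel's raw size.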
If Google wants to be a player in this market, they’ll have to invest time and energy doing the same. It will be interesting to see if they will do that. If not, they’ll be more like Alexa. A helpful tool but not one that you can completely rely on.
I was somewhat surprised that Google didn’t try to use data from Google Analytics (where available).

I’m not a big fan of “partial” metrics. Visitors is an important number, but it’s only one metric. You could have a site with one million unique monthly visitors and five million page views, or a site with four million monthly uniques and five million page views. Based only on the unique visitor data, you’d have a very skewed view, even if it was directionally correct.

Because Alexa, Compete, and now Google are free and widely cited, they will continue to be widely used. I remain a fan of Quantcast, at least when sites are quantified rather than estimated from panel data, because it tracks so closely with the Google Analytics data. For us, Trends for Websites didn’t track nearly as closely with Google Analytics.
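The point about partial metrics is just arithmetic, and a few lines make it concrete. Both hypothetical sites below serve the same five million page views, yet their audiences look nothing alike:

```python
# Two hypothetical sites with identical page views but very different
# unique-visitor counts, illustrating why uniques alone mislead.
site_a = {"monthly_uniques": 1_000_000, "page_views": 5_000_000}
site_b = {"monthly_uniques": 4_000_000, "page_views": 5_000_000}

def pages_per_visitor(site):
    """Engagement depth: how many pages the average visitor views."""
    return site["page_views"] / site["monthly_uniques"]

print(pages_per_visitor(site_a))  # a smaller but highly engaged audience
print(pages_per_visitor(site_b))  # a larger but more casual audience
```

Ranked by uniques alone, site B looks four times "bigger"; ranked by engagement per visitor, site A wins by the same factor. Any single metric hides half the story.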
All the more reason that triangulation is the best approach at this time.
Indeed gets that kind of traffic??!!
You bet. It’s a monster.
fred, if you login to the google trends (top right) – it will show you the numbers on those graphs.
Thanks Allen. That’s very helpful, appreciate the tip. Fred
Great post.

Full disclosure: I co-founded Compete and am still an adviser to them.

We in the web analytics world have been waiting for Google to allow users to share their Analytics numbers publicly. I’m very sure that is coming soon, and this, along with the GA benchmarking opt-in feature they released a few months ago, is just a small test as they try to get the experience right.

I’m all for it; the more transparency in numbers, the better. Because all data are dirty and biased, the best way to create a level measuring service is ubiquity. If the market agrees on a local analytics standard, like GA, then we can get closer to the “truth.” That would be best for everyone, including Compete, ComScore, Nielsen, etc.

For years I tried to push forward on a local auditing option, similar to what Quantcast is doing now with their quantified program. My biggest regret is not having brought that service to life at Compete. Our competitor Nielsen was way too happy to sue anyone in the local analytics space (except the big guys like Google and Yahoo, of course) for patent infringement: http://tinyurl.com/6l2r5f. Being a smaller direct competitor, we opted not to launch that service and took the conservative route; I know better than to take the conservative route these days.

+1 for Google allowing full opt-in GA data sharing, because these Google Trends numbers are terrible in this first version.
With Google’s search data alone, this should be much more accurate than it is. Is it possible that Google is purposely not making it as accurate as it could be? Maybe for legal/liability reasons, perhaps to monetize a more accurate version down the line or just because they don’t want to share the data at this point in time?
How does Google search data help solve this problem? Some of our portfolio companies (Etsy) get less than 30 percent of their traffic from search. Others get upwards of 80 percent. That’s another kind of bias, right? Fred
Agree with your initial point – you need to understand each bias and triangulate. But Google has the ability to be much more accurate than they appear to be, and their bias should not affect their ability to be directionally correct in most cases.

For most types of sites, aggregate Google Analytics data could give Google a very good estimate of what percentage of total site traffic comes from Google search for each type of site, so that the bias can be at least partially accounted for. Another simple example is the amount of search traffic from the site name itself – it is sometimes shocking to me what percentage of people get to site x by Googling site x – but I bet aggregate data would show that this percentage is very consistent across different “buckets” of sites that Google could automatically build (and then calibrate with manual input and exact data sources like Analytics, AdSense, and even other sources like Gmail, DoubleClick, etc., depending on each TOS). It seems to me that Google has a wealth of those types of data points to leverage, in addition to the sheer volume of data that they have.

I’m not saying Google or anyone else can be 100% accurate, far less so for the outliers, but I think they have enough internal data to algorithmically reduce their bias error to the point where they should be among the most accurate third-party measurement sources. If that isn’t the case, then it is interesting to consider whether they have reasons for not publishing very accurate data at this point.
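The extrapolation this comment describes can be sketched in a few lines. This is a toy model with invented numbers, not anything Google has said it does: if you can observe search-referred visitors directly and you have a category-level estimate of what share of traffic search drives, you can back out total uniques.

```python
# Hypothetical sketch: extrapolating total uniques from the
# search-referred slice, given a category-level search share.
# All numbers below are invented for illustration.

def estimate_total_uniques(search_referred_uniques, search_share):
    """Back out total uniques from the observed search-referred count."""
    if not 0 < search_share <= 1:
        raise ValueError("search_share must be in (0, 1]")
    return search_referred_uniques / search_share

# A jobs-type site where roughly 80% of visitors arrive via search:
print(estimate_total_uniques(800_000, 0.80))   # ~1,000,000 total

# A community-type site where search drives under 30%:
print(estimate_total_uniques(240_000, 0.30))   # ~800,000 total
```

The catch is exactly the bias Fred raises in the thread: the method is only as good as the per-category search-share estimate, and sites that deviate from their bucket's average (the 30-percent Etsys and the 80-percent job boards) will be badly mis-estimated.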
The even longer version for anyone also thinking about this.
Aren’t you comparing apples and oranges? Google Trends shows unique visitors that entered those sites via Google Search. The other data sources show total unique visitors, whether they came from Google Search, some other link referral, or as direct hits. ComScore offers a report that shows metrics by referral source (the Source/Loss Report). If you ran that report where source = Google Search, that would be a better comparison. It’s possible, for example, that Twitter gets a large portion of its traffic from Google relative to Indeed.

Update: Looks like I had this wrong. Google may also be an estimate of total uniques. I’m not clear on how they arrive at those numbers for sites that don’t use Analytics. Toolbar?
If that’s true, Joe, then it explains why Trends for Websites isn’t that useful. But it’s still odd, because Indeed gets a fair amount of its traffic from Google search and Twitter gets less on a percentage basis. Fred
Two thoughts:

1) Google will give exact numbers if you log in (not sure why they need that).

2) Google is giving a daily unique value, while ComScore and Compete are giving monthly uniques. It’s completely possible that Twitter has a lot of people that visit every day, while Etsy and Indeed have more people that visit only once per week.
Hmm. That’s an interesting thought.
None of the web measurement services – trends.google.com/websites, compete.com, alexa.com, etc. (Quantcast is the only exception) – seem to show any data on subdomains. So I can try yahoo.com on these services, but none of them give me a breakdown of the different yahoo.com sites, e.g. www.yahoo.com vs. my.yahoo.com vs. in.yahoo.com. So while I can look at the traffic of blogs.com, none of the web measurement sites will show me data on avc.blogs.com. I am hoping that with Google getting into the web measurement business we get to see more data being shared by these services.
I believe the only service that will give you data on avc.blogs.com is Alexa. At least that’s the only service I’ve used to track this blog.
Looks like Alexa does show avc.blogs.com, but it does not allow me to compare in.yahoo.com vs. my.yahoo.com vs. www.yahoo.com vs. uk.yahoo.com. It is just one single number for all of those properties.
The services are reporting different things: Google Trends reports “Daily Unique Visitors,” while the other two services report “Monthly Unique Visitors,” it seems (if I’m not misinterpreting labels like “People Counts – Monthly”). The huge disparity could be explained (for example) by indeed.com having a large number of non-repeat visitors (day-to-day). The effect would be especially pronounced when comparing to sites like Twitter, where a big fraction of one day’s visitors will visit again the day after.

Update: Whoops, I should actually read the comments before posting …
Yes, others have made that point, and it does explain the difference. However, monthly uniques is the ‘standard’ for web traffic measurement, and I wonder why Google didn’t adopt it with this service.
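The scale difference the commenters describe is easy to illustrate with a toy simulation (every number below is invented): two sites with the same monthly audience but different visit frequencies produce wildly different daily unique counts.

```python
import random

# Toy simulation of why daily and monthly uniques diverge: two sites
# with the SAME monthly audience but different visit frequencies.
random.seed(42)

DAYS = 30
AUDIENCE = 100_000  # monthly unique visitors for both sites

def avg_daily_uniques(visit_probability):
    """Each audience member independently visits on a given day with
    the stated probability; return the average daily unique count."""
    daily = []
    for _ in range(DAYS):
        visitors = sum(1 for _ in range(AUDIENCE)
                       if random.random() < visit_probability)
        daily.append(visitors)
    return sum(daily) / DAYS

# A "daily habit" site (Twitter-like): visitors show up most days.
habit = avg_daily_uniques(0.75)
# An "occasional" site (job-search-like): visitors show up ~once a week.
occasional = avg_daily_uniques(1 / 7)

print(f"habit site:      ~{habit:,.0f} daily uniques")
print(f"occasional site: ~{occasional:,.0f} daily uniques")
```

Both sites have roughly 100k monthly uniques, yet their daily uniques differ by about 5x, which is the kind of gap that makes a daily-uniques chart and a monthly-uniques chart point in different directions for the same set of sites.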
Google Trends gives different information depending on what you search for. It offers information on “Searches” (unique daily (?) queries) and “Websites” (unique daily (?) visitors). See the options for “Searches” and “Websites” right under the query field.

It also gives you different information depending on whether you search for a word or a domain. For example, it will give you different results if you search for wikipedia, wikipedia.org, and wikipedia.com. (BTW, Puerto Rico seems to be a leader in searches for wikipedia.com – any ideas as to why? 🙂 )

Try entering ‘indeed, twitter’ and clicking on “Searches” and “Websites,” and in a separate tab enter ‘indeed.com, twitter.com’; you will see the difference. You may want to add ‘tweeter’ (a completely different service), and you will see a very different comparative picture.
Thanks for this important research tool
Fred, are you comparing graphs from the right region? It looks like your Compete graph is U.S. only, while your Trends for Websites graph is from All Countries.

US: http://trends.google.com/we…
All countries: http://trends.google.com/we…
Not sure about Compete. I couldn’t figure that out.
From http://siteanalytics.compet… : “Compete triangulates multiple data sources, including ISP, Panel & Toolbar to estimate U.S. traffic.”

Given the similarity of the US graphs for Compete and Trends for Websites, it’s likely your ComScore graph is US only as well. That would make the three comparable for U.S. traffic.
The ComScore data is worldwide. Fred
Google is nice 🙂 We wrote some parameters to use it in SeoQuake (a Firefox SEO plugin), so you can see all GTrends data right in the SERP or while you’re surfing (it looks like the Alexa toolbar). You can get it here: http://addons.seoquake.com/… You need to be logged in to your Google account to get the data.
The Google Trends data is ONLY based on Google traffic.
Hi Fred, Google gives daily unique visitors whereas Compete and ComScore give monthly unique visitors, I guess, so you have a scale problem with your graphs: if you take each site independently, you will find that Compete’s and Google’s curves move in the same direction: http://www.u-lik.com/img/av…. So both services seem to be good, and Google just says that >75% of Twitter’s daily visitors are addicts whereas Etsy and Indeed are <25% (meaning more irregular visitors), which seems pretty realistic to me. Is that the case?