Google Doesn’t Trust Us

// February 25th 2011 // Rant + SEO

Yesterday Google rolled out an algorithm change “designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful.”

Google’s share of the search market must have been suffering, right? Wrong. comScore puts Google at 63.0% in January 2009, 65.4% in January 2010 and 65.6% in January 2011. People were not defecting.

Not only that but Google is the leader in search engine customer satisfaction according to the American Customer Satisfaction Index. So why the change?

Google Means Well

I believe Google thinks they’re making things better. I don’t see a Machiavellian scheme behind every Google action. I like these guys. Meet any of the people at Google and you realize they’ve drunk deep from the search quality kool-aid. They are true believers! On top of that, they’re usually amiable and generous with their time.


But … the road to hell is paved with good intentions.

Google Got Bullied

What’s shocking is how Google got pressured into making this change. Vivek Wadhwa, Paul Kedrosky, Jeff Atwood, Michael Arrington, Rich Skrenta and others played Google. A bunch of upper-class, highly-educated technophiles convinced Google that search quality was in jeopardy. Was search quality really an issue or was this a matter of taste?


A reminder: you can please some of the people all of the time and all of the people some of the time, but you can't please all of the people all of the time.

The Echo Chamber

A good marketer knows that they are not the target market. If you're reading this, you are not the typical search user. You might be a power user, but you are in the minority, my friend.

Perhaps there is more under the hood, but from where I sit Google chose qualitative feedback over quantitative feedback. The problem? That qualitative feedback was biased. The Silicon Valley echo chamber flexed its muscle and Google acquiesced.

Red Herring

What is disappointing is that Google decided to tackle the subjective (content quality) instead of the objective (link fraud). Do we truly think that JC Penney, Forbes and Overstock are outliers? The answer is an unquestionable no.

What’s a bigger threat to search quality? The blatant and rampant manipulation of trust and authority via link fraud or the creation of content (of varying quality) to meet query intent?

What Changed

A staggering 11.8% of queries were impacted by this algorithm change. I’m curious about how Google effected this change.

Did they re-weight current signals or create new signals? Google acknowledges that data from the Personal Blocklist Chrome extension was not used. That doesn’t mean other new signals or data weren’t used. But even if Google did introduce other new signals, to impact 11.8% of the queries it seems reasonable to believe that current signals were also re-weighted.

That assumption and hours of SERP review led me to the following conjecture.

  • Trusted TLDs (org, gov, edu) were given more weight
  • Exact Match Keyword Domains were given more weight
  • Forums were given more weight
  • On-Site text was given more weight

The last presents itself in an odd way. Sites that look like they were last touched in 2003 are ranking well. It’s as if Google sought a ‘no style’ version of the web. This also includes a number of long form blogs. Sadly, many of these same sites are bloated with AdSense. Now, AdSense is everywhere so … that’s to be expected. But the position of the ad units on many of these sites is completely against any UX standard.

This is a very simplistic and blunt analysis. I’m sure others will tease out other differences and we’ll never know for sure what changed. But what it tells me is that Google changed quantitative measures to meet a pre-determined qualitative goal.

The Real Story

Google passed judgment on the quality and value of sites in what seems like a very subjective manner. How exactly did these sites and specific pages rank so well in the past? What suddenly changed? Did the pogosticking rate creep up? Did internal satisfaction metrics of the ‘reasonable surfer’ change? I’m not hearing any of that. I’m hearing subjective terms like ‘quality’, ‘value’ and ‘useful’ being thrown around.

Google is setting their own perceived metric of value in conflict with other signals, metrics and feedback. The message? Google doesn’t trust us to know any better. It’s not about what we want. It’s about what Google thinks we should want.


The idea that Google altered current signals to effect a perceived content quality metric should terrify you.

It’s all very well and good when those changes don’t impact you. You guffaw at Mahalo’s demise. But what happens when they come for you? What happens when you’re suddenly the target? How will you feel when your content is called into question?



1 trackback/pingback

  1. Pingback: The Google Farmer Update Was About Sites, Not Content on February 27, 2011

Comments About Google Doesn’t Trust Us

// 9 comments so far.

  1. Miah // February 25th 2011

    I will tell you as someone who searches for technical queries 200-300 times a day that link and content mills have been creeping in on results more than ever. Many of these are technical forum posts copied as a question and answer “tech” site.

    Google's actions are probably related to a critical mass of "businesses" popping up offering this service; the churn over time adds up to over 10% of search results. Now they will have to work harder.

    Shouldn’t you have to earn your rank by posting REAL comments on interesting sites 😉

  2. aj // February 25th 2011

    Miah, you’re right about the rise of content mills and Q&A sites. I’m all for penalizing (harshly) those sites that copy or scrape content. In addition, I think many of the sites you mention would never have ranked nearly as high if Google had addressed link abuse.

    The idea that targeted content alone can rank highly just isn’t true. So, I see this change as attacking the symptom of the problem and not the cause. Unfortunately, like a lot of medicine, it kills a lot of good things as it tries to kill the bad.

    Finally, I’m concerned that Google is becoming a nanny search engine. They’ve decided that certain content isn’t good for us, even if it was popular. I think that’s a dangerous precedent.

  3. Christopher Wulff // February 25th 2011

    They’re not becoming nannies, they’re becoming the librarians we need. Rather than taking me all over the place to find some badly photocopied version of the content I’m looking for they’re taking me directly to the source, to answer the question I had.

    The no-style version of the web is gaining popularity with everybody I know, and should be rewarded. I read everything through the filter of Reeder, Readability, and Instapaper, because the overwhelming majority of sites are full of images and text designed to distract me from the actual content I want to read/watch, etc. I for one am very glad to see them penalizing sites that don’t make their focus the easy consumption of the primary page content.

  4. aj // February 25th 2011

    Thanks for your comment Christopher. I really enjoy healthy debate and different opinions.

    I’d love if Google became librarians, but that’s not what I’m seeing. A library has everything available and lets the user find the content they desire. Libraries are generally neutral. Google acts far more like a bookstore, with that first page of results akin to the ‘best sellers’ rack at the front of the store. Google also papers the sides of those racks with ads, and it’s not for the local school bake sale.

    Again, I’m all for penalizing sites that scrape, copy and steal content. I’m a cycling fan so I like harsh penalties. Get caught copying content, you’re out of the index for a year.

    The readability of content is an interesting topic. I’m actually a big proponent of less being more. That creating readable content is vital to SEO. UX and (good) SEO are very close together in my mind. What I’m seeing ranking high doesn’t quite match that goal.

    I'm seeing things that remind me of Geocities. There's no font hierarchy. Instead it's long blocks of content that I have to think will chase users away. I'm seeing more ads, not less, and I'm seeing them in truly obnoxious places. In one instance a site that jumped one of my clients for a term has no content above the fold but instead has a massive AdSense unit.

    So I think we’re in agreement that sites that allow the reader to better engage with the content should be rewarded. I just don’t think this Farmer Update did that.

    There’s also an interesting discussion to be had on how sites can support themselves. By tuning out and turning off the advertising, we make it harder for those sites to stay in business. Would you pay pennies per page view to content creators so they could continue to publish without intrusive ads?

  5. Slartibartfast // February 25th 2011

    I've been looking around for a search alternative for months. Just because you are the best game in town doesn't mean you are doing the best job.

    Google alerts is completely broken thanks to the content farms. Any step in the right direction is better than nothing.

  6. aj // February 25th 2011

    True. Just because you're the best doesn't mean you can't get better. But I'm unsure if this step was in the right direction. Two wrongs don't make a right.

  7. Christopher Wulff // February 25th 2011

    Thanks for your notes AJ. I appreciate your insights on the specifics you’ve witnessed thus far. I’m not nearly as engaged with the SEO and search specifics so in my space, where I’m often looking for programming or design specifics, I am overrun with scrapers and such that make it almost impossible to find the original resource where questions can be asked and conversation furthered.

    Maybe it’s a quality of Toronto libraries, but they’re, thankfully, not that neutral. The rack at the front almost never has ‘popular’ fiction on it. No Grisham, no Danielle Steel, no Robert Jordan, none of that stuff, let alone a lot of the junk that never makes it into libraries to begin with. They put quality resources up front, and bury the other stuff behind it for you to find if you want to go looking. Librarians make assumptions that I am looking for the most accurate, most reputable, best peer reviewed resource to answer my question, an assumption I’m glad to have them make.

    It sounds like your experience with this change suggests that Google has a long way to go if their goal is to bring the most accurate, most specialized and most reputable resource to the top (and it’s interesting that in some ways I guess in talking about librarians I’m talking like the old Yahoo model of handbuilding search results, hmm… further consideration necessary). But I think this step is one in the right direction and places the focus of search squarely where it should be.

    Love the "steal content, get banned" proposal. We need to return to an understanding of why the hell we created hyperlinks to begin with: connect people to the original source rather than copying it.

    More Geocities style content will surely make me weep, so I hope that that finding is an aberration that will be remedied. As HTML5 and more semantic content comes into place, those headings and hierarchies will be better understood by the robots I expect and those with proper information hierarchies will be rewarded.

    Thankfully, with Readability I can pay the pennies per page view (though people gaming a pay-per-view system with unnecessary links to generate pageviews will piss me off something fierce). It's actually the part that I don't understand about those sites: why haven't more content sites simply blocked them, since I'm no longer seeing the ads that pay for the content and am instead paying Instapaper for the app to block them? Of course, Google's Mobilizer does pretty much the same thing. It's an odd omission in this litigious world that I can, to draw an analogy, cut an article out of your magazine (without paying for it) and sell the opportunity to read it to someone else.

    Well, and I have the adblockers running pretty well too for that matter, so I’m not really paying ad heavy pages anyway. As a note, I don’t block ads on sites where the ads are actually for things I might be interested in. Curated ads for a narrow audience outperform all of the automated systems if you’re trying to get me to buy something.

  8. Lazy Man and Money // February 25th 2011

    “They’ve decided that certain content isn’t good for us, even if it was popular.”

    I didn’t see anything about Google taking popularity into account at all. However, they could. They have a lot of this data from sites running analytics.

    I think the opening argument about Google’s growing dominance misses the mark. Just because something is getting more popular, it doesn’t mean that the product is getting better – or even that it’s a good product. Perhaps the growth of Chrome is leading to the increased traffic.

    People aren’t defecting because there’s really not a great place to defect to. It seems like Bing is the only viable option. (Yahoo search results will be Bing at some point, so I’m discounting that).

    As for the library vs. the bookstore argument, everyone knows that the first page of the Google results is a best seller’s rack. I don’t think that a bad thing. If you do think it’s a bad thing, what would you do to fix it? Should Google just present hits in random order or alphabetical by author? That’s not very helpful to me. I much prefer the Amazon method of being able to sort on customer reviews or best selling.

    I don’t think Google got bullied. The arguments that were made about the content farms creeping up in results can be objectively seen and measured.

    Finally, Google has been working on this for over a year – before anyone was complaining. They used an algorithm to create what seems to be better results. This matches with the data that they are getting from Chrome users. I don’t see how this is different than any other algorithm change they’ve made to improve their search results. I’d be much more nervous if they were using the Chrome data directly as you could vote down your competitors.

  9. DigESource // May 01st 2011

    Policing content is something Google should NOT do. Policing RELEVANCY IS! Let's not forget that Google is using an automated system to tell everyone else NOT to use automated systems. Google is rolling out its value system for us. Here comes the will of God, as Google sees it!

