The emphasis is mine. I think that emphasis is important because the Farmer Update was not a content-level algorithm change. Google did not suddenly get better at analyzing the actual quality of content. Instead, they changed the algorithm so that sites they felt were “low-value add for users” or “sites that are just not very useful” were demoted.
Manual Versus Algorithmic
I’ve been asked a number of times whether this was done manually. The simple answer is no. If it were done manually it would have been a lot more precise. Google did not, for example, take a list of the top 1000 low-quality sites and demote them. That would have been akin to a simple penalty. If they had done this you wouldn’t find sites like Digital Inspiration getting caught in the crossfire.
The complex answer is sort of. Algorithms are built by people. The results generated reflect the types of data included, the weight given to each type of data and how that data is processed. People at Google decide what signals make up the algorithm, what weight each signal should carry and how those signals are processed. Changing any of these elements is a type of human or manual intervention.
Until the Google algorithm becomes self-aware it remains the opinion of those who make up the search quality team.
Dialing For Domains
I absolutely believe Google when they say they didn’t manually target specific sites. Yet, they clearly had an idea of the sites they didn’t want to see at the top of the rankings.
The way to do this would be to analyze those sites and see what signals they had in common. You build a profile from those sites and try to reverse-engineer why they’re ranking so well. Then you look to see what combination of signals could be turned up or down to impact those sites.
Of course, the problem is that by turning the dials to demote these sites you also impact a whole lot of other sites in the process. The sites Google sought to demote do not have a unique fingerprint within the algorithm.
The result is a fair amount of collateral damage.
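The dial-turning described above can be sketched in a few lines. This is purely illustrative: the signal names, weights, and sites below are hypothetical, not Google’s actual ranking factors. The point is simply that when a ranking score is a weighted sum of signals, penalizing the signals the targeted sites share also demotes any legitimate site with a similar signal fingerprint.

```python
# Illustrative sketch only: hypothetical signals and weights,
# not Google's actual ranking factors.

def score(site, weights):
    """Rank score as a weighted sum of a site's signal values."""
    return sum(weights[name] * value for name, value in site.items())

weights = {"content_depth": 1.0, "ad_density": 1.0, "template_pages": 1.0}

sites = {
    # The kind of site the update targets...
    "content_farm": {"content_depth": 0.2, "ad_density": 0.9, "template_pages": 0.9},
    # ...and a legitimate site that happens to share its fingerprint.
    "niche_expert": {"content_depth": 0.9, "ad_density": 0.8, "template_pages": 0.8},
}

before = {name: score(s, weights) for name, s in sites.items()}

# "Turn the dials": penalize the signals the targeted sites have in common.
weights["ad_density"] = -1.0
weights["template_pages"] = -1.0

after = {name: score(s, weights) for name, s in sites.items()}

# The farm drops (2.0 -> -1.6), but so does the legitimate site
# sharing the same profile (2.5 -> -0.7): collateral damage.
```

Because there is no signal that uniquely identifies a content farm, any dial strong enough to sink the targets drags down the bystanders too.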
Addition By Subtraction
The other thing to note about the technique Google used is the assumption that search quality would improve if these sites were demoted. Again, it’s not that other content has been deemed better. This was not an update that sought to identify and promote better content. Instead, Google seems to believe that by demoting sites with these qualities, search quality would improve.
But did it really?
You can take a peek at what the results looked like before by visiting a handful of specific IP addresses instead of using the Google domain to perform your search. So, what about a mom who is looking for information on school bullying?
Here’s what she’ll see now.
Wikipedia is the top result for school bullying. Really? That’s the best we have on the subject? Next up is a 20/20 piece from ABC. It’s not a bad piece, but if I’m a mom researching this serious subject, do I appreciate, among other things, the three links to Charlie Sheen articles?
What was the top result before the Google Farmer Update?
The Bullying Information Center from Education.com was the top result. That’s right, Education.com was one of the larger sites caught up in this change according to the expanded Sistrix data. Is the user (that mom) better served?
This is the tip of the iceberg. In fact, it was the first one I tried, simply by using SEMRush to get a list of top terms for Education.com.
Throwing The Baby Out With The Bath Water
The last problem with the site-level technique is its inability to identify the diamonds in the rough. A content site composed of thousands of independent authors may have widely varying content quality. Applying a site-wide demotion applies the same standard to all of that content, regardless of individual quality.
The Google Farmer Update treats great content the same way as lousy content.
Shouldn’t we want the best content, regardless of source?
The Google Opinion
I’ve been vocal about how I think it’s presumptuous to believe that Google understands our collective definition of quality. Even in the announcement I find their definition of high-quality interesting.
… sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.
First, is this really what took the place of those ‘low-quality’ sites? As I’ve shown that’s not always the case.
But more to the point, do we all want research and in-depth reports? In the age of 140 characters and TL;DR, where time is of the essence and attention is scarce, is the everyman looking for these things?
And what makes an analysis thoughtful, or thoughtful enough? Is my analysis here thoughtful? I hope so, but I’m sure some people won’t agree.
Don’t Hate The Player, Hate The Game
Let me tell you a secret: I personally agree with what Google is trying to accomplish. There are sites I absolutely detest, sites that, if I were king of the Internet, would be removed from all search results. I wish people wouldn’t fall over themselves for the cliche, but effective, ‘list’ blog post. I want people to truly read, word-for-word, instead of scanning, and not to treat a 1,000-word blog post (like this) as a chore.
I wish more people took the time (particularly in this industry) to download a research paper on information retrieval and actually read it. But guess what? That’s not how it works. Maybe that’s how I, and Google, want it to work. But that’s not reality. You know it and I know it.
So while I personally get where Google is coming from, I can’t endorse or support it.
From Library To Bookstore
The reason why I have been such a fan of Google is that they were close to neutral. The data would lead them to the right result. They were the library, allowing users to query and pick out what they wanted from a collection of relevant material. Google was the virtual catalog, an online Dewey Decimal System.
Today they seem more like an independent bookstore, putting David Mitchell, Philip Roth and Margaret Atwood out front while burying Stuart Woods and Danielle Steel in the stacks. Would you like an expensive ad latte with that?
Different Is Not Better
Did search quality get better? I tend to think not. It’s certainly different, but let’s not confuse those two concepts.
By subjectively targeting a class of sites Google inadvertently demoted good sites along with bad sites, good content along with bad content.
We know 6 minus 3 doesn’t equal 10.