The internal metric Google uses to determine search success is time to long click. Understanding this metric is important for search marketers in assessing changes to the search landscape and developing better optimization strategies.
Short Clicks vs Long Clicks
Back in 2009 I wrote about the difference between short clicks and long clicks. A long click occurs when a user performs a search, clicks on a result and remains on that site for a long period of time. In the optimal scenario they do not return to the search results to click on another result or reformulate their query.
A long click is a proxy for user satisfaction and success.
On the other hand, a short click occurs when a user performs a search, clicks on a result and returns to the search results quickly to click on another result or reformulate their query. Short clicks are an indication of dissatisfaction.
Google measures success by how fast a search result produces a long click.
Bounce Rate vs Pogosticking
Before I continue I want to make sure we're not conflating short clicks with bounce rate. While many bounces could be construed as short clicks, that's not always the case. The bounce rate on Stack Overflow is probably very high. Users search for something specific, click through to a Stack Overflow result, get the answer they needed and move on with their lives. This is not a bad thing. That's actually a long click.
You can gain greater clarity on this by configuring an adjusted bounce rate or something even more advanced that takes into account the amount of time the user spent on the page. In the example above you'd likely see that users spent a material amount of time on that one page which would be a positive indicator.
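To make the distinction concrete, here's a minimal sketch of the adjusted bounce rate idea: a single-page visit only counts as a bounce if time on page falls below some threshold. The 30-second threshold and the function's inputs are illustrative assumptions, not any analytics product's API.

```python
# Sketch: classify single-page sessions using a time-on-page threshold.
# The threshold and field names are illustrative assumptions.

ADJUSTED_BOUNCE_THRESHOLD_SECONDS = 30

def classify_session(pages_viewed: int, seconds_on_page: float) -> str:
    """Return 'engaged', 'long single-page visit', or 'bounce'."""
    if pages_viewed > 1:
        return "engaged"  # multi-page visit, not a bounce at all
    if seconds_on_page >= ADJUSTED_BOUNCE_THRESHOLD_SECONDS:
        # The Stack Overflow case: one page, real dwell time, a long click
        return "long single-page visit"
    return "bounce"  # quick exit, a possible short click

print(classify_session(1, 95))  # one page but material time on it
print(classify_session(1, 4))   # a likely short click
```

The point of the threshold is exactly the Stack Overflow scenario above: the first session is a bounce by the raw definition but a positive signal once dwell time is considered.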
The behavior you want to avoid is pogosticking. This occurs when a user clicks through on a result, returns quickly to the search results and clicks on another result. This indicates, to some extent, that the user was not satisfied with the original result.
Two problems present themselves with pogosticking. The first is that it's impossible for sites to measure this metric. That sort of sucks. We can only look at short bounces as a proxy and even then can't be sure that the user pogosticked to another result.
The second is that some verticals will naturally produce pogosticking behavior. Health related queries will show pogosticking behavior since users want to get multiple points of view (or opinions if you will) on that ailment or issue.
This could be overcome by measuring the normal pogosticking behavior for a vertical or query class and then determining which results produce lower and higher than normal pogosticking rates. I'm not sure Google is doing this but it's not out of the question since they already have a robust understanding of query and vertical mapping.
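A minimal sketch of that normalization idea, with made-up data and hypothetical field names: compute the baseline pogosticking rate for a query class, then score each result relative to it.

```python
# Sketch: compare each result's pogosticking rate to the baseline for its
# query class. The data, sites and rates below are hypothetical.

from statistics import mean

# Observed pogostick rate per (query_class, site): the fraction of clicks
# that bounced back to the results page and clicked elsewhere.
observed = {
    ("health", "example-clinic.com"): 0.52,
    ("health", "example-forum.com"): 0.67,
    ("health", "example-reference.com"): 0.38,
    ("navigational", "example-brand.com"): 0.05,
}

def baseline(query_class: str) -> float:
    """Average pogosticking rate across all results in a query class."""
    return mean(r for (qc, _), r in observed.items() if qc == query_class)

def relative_pogosticking(query_class: str, site: str) -> float:
    """Ratio > 1 means worse than normal for this vertical, < 1 means better."""
    return observed[(query_class, site)] / baseline(query_class)
```

Under this scoring, a health result with a 0.38 pogostick rate looks good even though that raw rate would be alarming for a navigational query.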
But I digress.
Part of the way Google works on reducing the time to long click is by improving the speed of search results and the Internet in general. Their own research showed the impact of speed on search results.
All other things being equal, more usage, as measured by number of searches, reflects more satisfied users. Our experiments demonstrate that slowing down the search results page by 100 to 400 milliseconds has a measurable impact on the number of searches per user of -0.2% to -0.6% (averaged over four or six weeks depending on the experiment). That's 0.2% to 0.6% fewer searches for changes under half a second!
Remember that while usage was the metric used, they were trying to measure satisfaction. Making it faster to get to information made people happier and more likely to use search for future information requests. Google's simply reducing the friction of searching.
But it's not just the speed of presenting results but in how quickly Google gets someone to that long click that matters. Search results that don't produce long clicks are bad for business as are those that increase the time selecting a result. And pogosticking blows up the query timeline as users loop back and tack on additional seconds worth of selection and page load time.
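The timeline arithmetic above can be sketched simply: each pogostick loops the user back through another round of selection and page-load time. The segment durations here are made-up illustrative numbers, not Google's figures.

```python
# Sketch: total time to long click as a sum of timeline segments, where
# each pogostick adds another selection + page-load loop.
# All durations are illustrative (seconds), not measured values.

RESULTS_LOAD = 0.4  # rendering the search results page
SELECTION = 8.0     # scanning the results and picking one
PAGE_LOAD = 2.0     # loading the chosen page

def time_to_long_click(pogosticks: int) -> float:
    """Each pogostick loops the user through selection and page load again."""
    return RESULTS_LOAD + (pogosticks + 1) * (SELECTION + PAGE_LOAD)

print(time_to_long_click(0))  # direct long click: 10.4 seconds
print(time_to_long_click(2))  # two failed results first: 30.4 seconds
```

Even with generous assumptions, a couple of pogosticks roughly triples the time to long click, which is why results that invite them are so costly.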
Make no mistake. Google wants to reduce every portion of this timeline they presented at Inside Search in 2011.
One of the ways in which we've seen Google reduce time to long click is through various 'answers' initiatives. Whether it's a OneBox or a Knowledge Graph result the idea is that answers can often reduce the time to long click. It's immediate gratification and in line with Amit Singhal's Star Trek computer ideal.
In some cases a long click is measured by the absence of a click and reformulated query. If I search for weather, don't click and take no further action, that should register as a long click.
You'll also hear Google (and Bing) talk about the fact that ads are answers. Of course ads are what fill the coffers but they also provide another way to get people to a long click. Arguing the opposite (that ads aren't contributing to satisfaction) is a lot like arguing that marketers and advertisers aren't valuable.
Not only that, but Google has features in place to help ensure that good ads (answers) rise to the top. The auction model, coupled with quality score and keyword level bidding, produces relevant ads that lead to long clicks.
The analysis of pixel space on search results is often used to show how Google is marginalizing organic search. Yet, the other way to look at it is that advertisers are getting better at delivering results (with the help of new Google ad extensions). Isn't it, in some ways, man versus machine? The advertiser being able to deliver a better result than the algorithm?
Without doubt Google benefits financially from having more space dedicated to paid results but they still must result in long clicks for Google to optimize long-term use, which leads to long-term revenues and profits.
I would be very surprised if changes to search results (both paid and organic) weren't measured by the impact they had on time to long click.
All of this is interesting but what does the time to long click metric mean for SEO? More than you might suspect.
When I started in the SEO field I read everything I could get my hands on (which is not altogether different from now). At the time there was advice about becoming a hub.
There was a good deal of hand waving about the definition of a hub but the general idea was that you wanted to be at the center of a topic by providing value and resources. People would link to you and the traffic you received would often go on to the resources you provided. About.com is a good example.
Funny thing is, this isn't some well kept secret. Marshall Simmonds spells it out pretty clearly in this 2010 Whiteboard Friday video where he discusses bow tie theory (hubs) and link journalism. (I just watched this again while writing this and, man, this is an awesome video.)
Most people focus on the fact that hubs receive a lot of backlinks. They do because of the value they provide, which is often in the aggregation of and links to other content. In the end, the real value of hubs is that they play an important part in getting people to content and that long click.
Search is a multi-site experience.
This is what search marketers must realize. You will get credit for a long click if you're part of the long click. If you ensure that the user doesn't return to search results, even by sending them to another site, then you're going to be rewarded.
Too often sites won't link out. I regularly run into this as my clients navigate business development deals with partners. It's frustrating. They think linking out is a sign of weakness and reduces their ability to consolidate PageRank.
While PageRank math might support not linking out, that strategy ultimately limits success.
Limiting your outlinks creates a local maximum problem. You'll optimize only up to a certain ceiling based on constrained PageRank math. Again, not a real secret. Cyrus Shepard talked about this in a 2011 Whiteboard Friday video (though I wouldn't stress too much about the anchor text myself.)
Linking out can help you break through that local maximum by delivering more long clicks. Suddenly, your page is a sort of mini-hub. People search, get to your page and then go on to other relevant information.
Google wants to include results that contribute to reducing the time to long click for that query.
I'm not advocating that you vomit up pages with a ton of links. What I'm recommending is that you link to other valuable sources of information when appropriate so that you fully satisfy that user's query. In doing so you'll generate more long clicks and earn more links over time, both of which can have profound and positive impact on your rankings.
Stop thinking about optimizing your page and think about optimizing the search experience instead.
I ran into someone at SMX West who inherited a vast number of low quality sites. These sites used the old technique of being relevant enough to get someone to the page but not delivering enough value to answer their query. The desired result was a click on an ad. Simple arbitrage when you get down to it.
In a test, placing prominent links to relevant content on a sub-set of these pages had a material and positive impact on their ranking. It's certainly not conclusive, but it showed the potential impact of being part of a multi-site long click search result.
As an aside, it's not that those ad clicks were bad. Some of those probably resulted in long clicks. Just not enough of them. The majority either pogosticked to another result or wound up back at the search result after an ad click. And we already know this as search marketers by looking at the performance of search versus display campaigns.
Impact On Domain Diversity
If you believe time to long click is the way in which Google is measuring search success then you start to see some of the changes in a new light. I've been disappointed by the lack of domain diversity on many search results.
Sadly, this type of result hasn't been that rare within the last year. Dr. Pete Meyers has been doing amazing work on this topic.
For a while I just thought this was Google being stupid. But then it dawned on me. The lack of domain diversity may be reducing the time to long click. It might actually be improving the overall satisfaction metrics Google uses to optimize search!
In some ways this makes a bit of sense, if even from a straight up Paradox of Choice perspective. Selecting from 5 different domains versus 10 might reduce cognitive strain. Too many choices overwhelm people, reducing both action and satisfaction. So perhaps Google's just reflecting that in their results with both domain diversity (or lack thereof) and more instances of 7-result pages.
Downsides to Time To Long Click?
But are these long clicks truly a sign of satisfaction? The woman who had been cutting my hair for nearly 10 years retired. So I actually did need to find someone new. I hated the search result but did wind up clicking through and using Yelp to locate someone. So from Google's perspective I was satisfied but in reality ... not so much.
I wonder how long a time frame Google uses in assessing the value of long clicks. I abandoned my haircut search a number of times over the course of a month. In many of those instances I'm sure it looked like I was satisfied with the result. It looked like a long click. Yet, if you looked over a longer period of my search history it would become clear I wasn't. I think this is a really difficult problem to solve. Is it satisfaction or abandonment?
The other danger here is that Google is training people to use another service. Now, I don't particularly like Yelp but what this result tells me is that if I wanted to find something like this again I should just skip Google and go right to Yelp instead.
The same could be said about our own bias toward brands. While users may respond better to brands and the time to long click might be reduced, the long term implication could be that Google is training users to visit those brands directly. Why start my product search on Google when all they're doing is giving me links to Amazon 90% of the time?
Of course, Google could argue that it will remain the hub for information requests because it continues to deliver value. (See what I did there?)
Google is using time to long click to measure the effectiveness of search results. Understanding this puts many search changes and initiatives into perspective and gives sites renewed reason to link out and think of search as a multi-site experience.