
Recovering From The Weaponization of Social Media

November 13 2022 // Rant + Social Media + Technology // 2 Comments

Earlier this year Jon Henshaw gushed over federated social. I didn’t quite get it but I managed to create a Mastodon account, which I summarily abandoned until Elon Musk purchased Twitter.

The purchase pushed me to figure out Mastodon (not as hard as you might think) and led me to realize what I’d been missing and just how corrosive current social networks had become. I’ve concluded that today’s social media is a bad batch of Soma made from Soylent Green.

Social Media

Disappointed Basketball Fan Meme

I’d usually start out with a definition to help ground the discussion. But there isn’t a canonical definition of social media from what I can tell.

There’s this definition of social media from dictionary.com.

websites and other online means of communication that are used by large groups of people to share information and to develop social and professional contacts

Then there’s Wikipedia’s definition.

interactive media technologies that facilitate the creation and sharing of information, ideas, interests, and other forms of expression through virtual communities and networks.

Mind you, they’re not that dissimilar. But the first doesn’t include creation of content and the second doesn’t mention the development of contacts. In the end, I find Wikipedia’s definition to be more descriptive of today’s social media environment.

I see more and more people creating content on Twitter and LinkedIn. Twitter incentivizes this by enforcing a character limit. Yeah, yeah, you can create threads but that’s an outlier. And all of the platforms work very hard to keep the conversation about any content on their platform.

Did we just acquiesce to the idea of digital sharecropping? Do we even think about this anymore? Bueller. Bueller. Bueller.

I also don’t see most of these platforms as helping to develop contacts. They help develop a following or a fan base. You are not developing Hank Green as a contact when you follow him on Twitter. You are really not connecting with him either.

Sure, LinkedIn still pushes folks to develop contacts. But I can’t be the only one who’s had a conversation go as follows:

Them: I see you’re connected to so-and-so, how do you know them?

Me: Who?

Them: so-and-so

Me: *typing name into LinkedIn search and seeing we are, indeed, connected*

Also Me: *I have no recollection of this person*

Both of these definitions make social media seem like a utopian exchange of ideas. The vaunted town square! Does this jibe with what you encounter every day? Or are you instead served up influencer content with a side helping of targeted ads based on your behavior?

Algorithms and Influencers

I'm Special

I still believe that Dunbar had it right, and that the number of social relationships one can maintain is not that high. Technology might be able to extend the number past 150. But it certainly doesn’t reach 1.4 million.

Follower and following numbers are meaningful only to the algorithm, allowing it to understand who might provoke a reaction and, thereby, create monetary value. This is why we have influencers. Because social media isn’t really about being social anymore. It’s about personalities.

Sure, they needed content to get there, but more and more of these platforms want you to create that content on the platform itself – whether that’s Twitter, Instagram or TikTok. From there the only way to profit from that system is to amass a large enough following.

Algorithms create a competitive incentive to produce content so users can appear in topics that produce monetizable engagement. It’s a lot like going to the casino. Most people lose. A few (influencers) win. But the house (platform) always benefits.

This still wouldn’t be a problem if the content that produced engagement wasn’t, for lack of a better word, toxic.

Twitter

This is Fine Meme but with Twitter Bird instead of Dog

Let me say up front that Twitter was very good to me when I was building my brand. But it was a different time and if I were to build a brand today I think I’d lean on LinkedIn more so than Twitter.

I started to use Twitter a lot less three to four years ago. Even during the pandemic my usage stayed about the same. Part of this was because of the way I used Twitter in the first place. I used it for business because I always found Twitter to be more akin to a megaphone. It didn’t really encourage conversation. But it sure helped to get your brand out to more people!

So I cultivated a presence that was about sharing the best of what I saw in my industry. I may not have made a lot of friends during that time. I frequently ignored requests to Tweet posts from colleagues. I was a content snob. But I think that made what I did share that much more valuable.

I found less and less to share over time. Granted, I had less time to read, but what I did read was, to put it nicely, not inspiring. So my Tweets ground down to a trickle. But that doesn’t mean I wasn’t lurking.

I didn’t lurk in my industry. I find SEO squabbles to be unimaginative and dull. Instead I’d look in at the trends, particularly during these tumultuous times. Man is that bad for your health.

I’d see Scott Baio trending and I’d click to see what the fuss was all about. Of course he’d said something stupid. And I’d see a few zingers confirming my viewpoint. But then I’d see others defending him and, in today’s parlance, I was triggered.

Someone is wrong on the Internet

Comic from XKCD

This statement is essentially the raison d’être of Twitter, which creates a feedback loop of tribalism.

Twitter became a place to be performative. What witticism or burn could rack up the most Retweets? Twitter became social for sport.

All social media companies are piggybacking on the human desire to connect. Star Trek: Discovery explores this theme endlessly and poorly. (I mean, honestly, you’d be in the hospital with alcohol poisoning if you drank every time they mentioned ‘connection’.) But being on Mastodon, I realize that I forgot how satisfying it is to forge those new connections.

Facebook is no angel either. You can clearly go down the same rabbit hole there as you can on Twitter. But for me Facebook is rather tame because I use it largely to keep up with family and old classmates and to see heartwarming The Dodo videos. But again, that’s only because I’ve been aggressive in not engaging elsewhere in that ecosystem.

I may post an update about the current state of things or comment on a political thread now and then. But I liberally block people. In fact, it’s gotten to the point where I don’t engage in those posts anymore but simply go and block the people arguing in them instead.

There are other, more subtle dangers surrounding Facebook. I believe that maintaining these connections from the past often holds us back from moving forward. I am not the same person I was in high school. I’m not the same person I was in college. Not the same person I was even a year ago. Yet Facebook encourages us to continue to interact with the cohort of people we knew in these eras.

I’m not saying you might not have lifelong friends that you keep up with. I’m an introvert so I likely have fewer of them than some of you. But how many different careers have you had? I’ve been in advertising, fundraising and marketing. Should I still be trying to engage and keep up with colleagues in each of these careers?

I wonder whether Facebook has stunted the growth of people by having them continuously looking backward instead of forging ahead.

Mastodon

This is Fine Mastodon

I won’t go into the details of federated social and how Mastodon is structured. Instead, I want to talk about what it feels like and what it inspires. Using Mastodon is like going back to the early days of social.

I was a big fan of FriendFeed, a social platform that most of you probably never used or even heard of before now. It was similar to Mastodon in that it had a diverse set of people from all walks of life. And from what I recall it didn’t really have an algorithm to present content. Instead it was up to you to follow the people who would help bring good stuff into your feed.

I liked FriendFeed so much that I ventured out to meet the team on the Peninsula during one of their events. That was a huge step for this introvert. I felt super-awkward at that event but it was interesting to meet and chat (or witness them chatting) with Paul Buchheit and Bret Taylor.

But here’s the thing. FriendFeed went the way of the Dodo bird and it’s not lost on me that Bret Taylor wound up at Facebook and, ultimately, on Twitter’s board.

The only other social platform that felt similar was Google+. Once again, that platform worked primarily based on the quality of those you followed. I didn’t need to follow everyone in the SEO community. I just needed to follow those that would share the important posts they found.

What both of those platforms had in common, in my view, was that users were in charge of configuring their own ‘algorithm’. Who you followed shaped your feed. So following became less about being nice and doling out a pellet of ego to others and more about simply enhancing your own feed.

Who you followed was a very selfish endeavor. It was about me, and what I saw because of them.

Configuration and Ego

Configuration can be confusing

Two things spring from this different dynamic. First, adding configuration to any platform is going to make it more difficult to use. I’ll call it like I see it: Mastodon is more difficult. It’s easier to let an algorithm learn what you like and return relevant content based on your explicit and implicit tastes.

The second is that without an algorithm surfacing content, the number of people you follow will likely, and should, shrink. Mastodon is more about the content people share rather than the people who share it.

Lately I’ve seen the idea floated that users are the product on social media platforms. And that holds when they’re all competing to be the one featured by the algorithm. But I see great content on Mastodon by following a user who has, at the time of writing, 114 followers. It is meaningless how many people follow them. I follow them because they put interesting content into my feed. If they stop doing so I’ll probably unfollow them.

Will enough people be willing to configure their feed without an ego based algorithm fueling competition? The only platform that comes close is Reddit. In many ways you can think of subreddits as an analog to a Mastodon server/instance with the home feed aggregating those instances.

Reddit is still working to monetize their ecosystem. Oddly, there’s a cottage industry out there that takes Reddit threads and turns them into articles that can be monetized. It’s annoying for Reddit at the corporate level but potentially a solid signal for the community.

I say this because it means that the discussions on Reddit are happening not because it will produce monetizable engagement but despite it.

The bigger problem with Mastodon is the lack of ego. Don’t get me wrong, some of it will always exist in one form or another. We’re only human. But it is a content first universe with no real rewards for popularity.

In fact, to get geeky, popularity could produce some problems given the way Mastodon and federated social work. A person with a large following could tax smaller servers since both sides need to retrieve that content. For that reason, there’s already an admonishment not to upload video.

But I digress. We’ve become inured to social media that rewards tribal content from influencers. We celebrate when we pass a milestone for number of followers or subscribers. I’m not saying I’m immune to those things. It feels good to be … validated. But I’m always on guard to be sure I don’t produce or share content to reach those metrics.

But it might be harder for others to let go of those vanity metrics. I’m not saying it’s going to be easy. I’m saying it’s going to be worth it.

Serendipity

One Thing Leads to Another by The Fixx

If you can make the move, I think you’ll find that Mastodon is more positive, more diverse and produces more real interaction. I won’t say it’s the best place to have a full-blown conversation yet. It might never be.

Instead it’s like having an interesting talk with a stranger you meet on an airplane. You might not keep in touch but you walk away feeling positive and amazed by the world we live in and how different but alike we all are at the same time.

Instead of being force fed content about Kanye, Kate Middleton or Kyrsten Sinema I find information about CRISPR advances, a fascinating story about the Battle of Midway and an amazing painting of an octopus on the underside of a hotel table.

These things make me happy and also help me make disparate connections with other things in my life and work. Serendipity is a feature and not a bug.

TL;DR

Social Media doesn’t have to be dictated by algorithms that reward influencers who produce divisive and monetizable content. But alternatives like Mastodon that are more about the content people share than the people who share it will require more work by users both technically and personally.

The Problem With Image Search Traffic

November 14 2019 // Analytics + Rant + SEO // 11 Comments

Where To Track Image Search Traffic

Google makes it easy for marketers to make bad decisions by hiding the performance of image search traffic.

Marketers have grown accustomed to not seeing image search traffic broken out in analytics packages. And Google persists in telling marketers to use Google Search Console to track image search traffic.

The problem? Google Search Console doesn’t tell marketers how image search traffic performs.

Here’s why Google’s decision to hide image search traffic performance is hurting websites.

Image Search History

Google Analytics doesn’t track image search as a separate source of traffic. This never made any sense to me.

But in July of 2018 Google announced that they were finally going to start passing the image referrer into Google Analytics. I was, in all honesty, elated that we’d finally have image search split out.

So I waited. And waited. And waited. And waited. And waited. And then, very quietly, Google updated that post.

Google Decides To Give Us Bad Data

WTF! “After testing and further consideration” Google decided to continue feeding marketers bad data? I cursed like a sailor. Multiple times.

Even worse? They pointed marketers to the Search Console Performance Report. Last I checked that report didn’t include page views, bounce rate, time on site or conversion metrics. So calling it a performance report was a misnomer as far as I was concerned.

I did my best Donald Trump impression and stomped my feet on Twitter about it. Nothing came of it. No one seemed to care. Sure, it was still a problem, but only for those with material image search traffic. I knew what to look for and … I was busy.

So what changed? Two things happened that made me write this piece.

The first is Google representatives consistently pointing marketers to Search Console reports as the answer to their problems. This triggers me every time. Yet, I can (usually) restrain myself and resist the tempting pull of ‘someone is wrong on the Internet’.

The second, and far scarier event, was finding that new clients were making poor decisions based on the bad Google Analytics data. Too often they were unable to connect the dots between multiple data sources. The fate of projects, priorities and resources was at stake.

Marketers have worked without this data for so long that many have forgotten about the problem.

Let me remind you.

Image Search Tracking

Google Analytics Y U No Track Image Search

Out of frustration I figured out a way to track image search in Google Analytics. That was in 2013. Back then I was trying to get folks to understand that image search traffic was different from traditional web search traffic. And I could prove it with those Google Analytics advanced filters.

Image Search by Browser

Unfortunately, soon after that post in 2013 we began to lose visibility as more and more browsers failed to capture the image search referrer.

Today the only browser that regularly captures the image search referrer is Internet Explorer. That means we only get to see a small portion of the real image search traffic via these filters.

Clearly that introduces a fair amount of bias into the mix. Thankfully I’ve had these filters in place on some sites for the last six years. Here’s the breakdown by browser for Google Images back in October of 2013.

Image Search by Browser October 2013

There’s a nice distribution of browsers. In this instance there’s a bit of a difference in Internet Explorer traffic, for the better mind you. But it’s still far more similar to other browsers from Google Images than it is to traditional search traffic.

Now here’s the breakdown by browser for Google Images from October of 2019 (from the same site).

Image Search by Browser October 2019

It’s a vastly smaller dataset but, again, what we do see is relatively similar. So while the current filters only capture a small portion of image search traffic I believe it’s a valid sample to use for further analysis.

Image Search Performance

Once you have those filters in place you instantly see the difference. Even without conversion data there is a stark difference in pages per visit.

Image Search Performance Comparison

That’s a look at October 2019 data from a different site. Why am I using a different site? It has more data.

Think I’m hiding something? Fine. Here’s the same data from the first site I referenced above.

image-search-pages-per-session-difference-again

The behavior of image search traffic is very different from that of web search traffic.

Think about how you use image search! Is it anything like how you use web search? The intent of image search users differs from that of web search users.

Why does Google think we should treat these different intents the same?

Image Search Conversion

Things get more interesting (in a Stephen King kind of way) when you start looking at conversion.

eCommerce Image Search Conversion Rate

This is a large set of data from an eCommerce client that shows that image search traffic does not convert well. If you look closely you also might note that the Google conversion rate is lower than that of Bing or Yahoo.

For those squinting, the conversion for Google is 1.38% while Bing and Yahoo are at 1.98% and 1.94% respectively. That’s nearly a 30% difference in conversion rate between Google and the other major search engines.

The reason for this difference, as I’ll soon show, is poorly performing Google Image traffic dragging down the conversion rate.

Here’s another eCommerce site with a unique conversion model (which I can’t reveal).

Image Search Conversion Rates

In this instance, Google Images performs 64% worse (.17%) than Google (.47%). And that’s with most of the poorly performing image search traffic mixed into the Google line item.

Over the last 28 days Google Search Console tells me that 33.5% of Google traffic is via image search. The distribution above shows that 5.8% comes from image search. So the remaining 27.7% of the Google traffic above is actually image search.

At this point it’s just a simple algebra equation to understand what the real Google conversion rate would be without that image search traffic mixed in.

Image Search Conversion Math

Confused Math Lady

Don’t be scared away by the math here. It’s really not that hard.

First I like to say it as a sentence. If total traffic of 88,559,184 has a conversion rate of 0.47%, but 27.7% of the total traffic (24,530,894) is image search with a conversion rate of 0.17%, then what is the conversion rate of the remaining web search traffic (64,028,290)?

Then it becomes easier to write the equation.

24,530,894 * 0.17 + 64,028,290 * X = 88,559,184 * 0.47

At that point you solve for X.

4,170,252 + 64,028,290X = 41,622,816

64,028,290X = 41,622,816 – 4,170,252

64,028,290X = 37,452,565

X = 37,452,565/64,028,290

X = 0.58

That means the true difference in conversion performance is .17% versus .58% or nearly 71% worse.
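The same back-of-the-envelope algebra is easy to script so you can rerun it with your own numbers. A minimal sketch in Python, using the figures from the worked example above:

```python
# Back out the web-only conversion rate from a blended organic number.
# Sessions and rates are the example figures from the post; swap in your own.
total_sessions = 88_559_184
blended_rate = 0.47            # overall Google conversion rate (%)
image_sessions = 24_530_894    # 27.7% of total, per Search Console
image_rate = 0.17              # image search conversion rate (%)

web_sessions = total_sessions - image_sessions
# Total conversions = image conversions + web conversions, so solve for X.
web_rate = (total_sessions * blended_rate
            - image_sessions * image_rate) / web_sessions
print(round(web_rate, 2))      # ~0.58
```

Change the four inputs at the top and the script gives you the true web search conversion rate for any site.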

Organic Search Conversion Deflation

Including image search traffic into organic search decreases the overall conversion rate. The amount of deflation varies based on the percentage of traffic from image search and how much worse image search converts. Your mileage may vary.

Here’s another example of how this might play out. Here’s the conversion rate trend for an eCommerce client.

conversion-rate-trend

They’ve been concerned about the continuing decline in conversion rate, despite material growth (60%+) in traffic. The drop in conversion rate between July 2018 and October of 2019 is 38%.

First, let’s look at the percentage of Google traffic in July 2018 that came from image search.

Image Search Share of Traffic July 2018

I don’t have a whole month but the ratio should hold about right. In July 2018 the share of Google traffic from image search was 30.2%.

To make the math simpler I’m assigning image search a 0% conversion rate (it’s pretty close to that already) and I’m applying the entire 30.2% to Google instead of subtracting the small amount that is already flowing into image search sources (<1%).

Adjusted Conversion Rate July 2018

When you do the math Google suddenly has a 2.19% conversion rate, which puts it in line with Bing and Yahoo. Funny how that works huh? Actually it’s not funny at all.
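If you’re comfortable treating the image search conversion rate as zero, the adjustment collapses to a one-liner: divide the blended rate by the web share of traffic. A quick sketch (the small gap versus the 2.19% above comes from rounding in the inputs):

```python
# Adjusted rate when image search converts at ~0%: all conversions come
# from web traffic, which is (1 - image_share) of the total.
blended_rate = 1.51    # reported Google conversion rate (%)
image_share = 0.302    # share of Google traffic from image search
adjusted_rate = blended_rate / (1 - image_share)
print(round(adjusted_rate, 2))  # ~2.16
```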

Seriously folks, I want you to fully digest this finding. Before I removed the Google Image traffic the conversion rate of the three search engines is:

Google: 1.51%

Bing: 2.21%

Yahoo: 2.23%

But when I remove Google Image search traffic the conversion rate of the three search engines is:

Google: 2.19%

Bing: 2.21%

Yahoo: 2.23%

When image search traffic is removed the conversion data makes sense. 

You know what else happens? Paid Search doesn’t look nearly as dominant as a conversion channel.

Paid Search Conversion July 2018

So instead of organic search being nearly half as effective (1.55% vs 2.97%) it’s approximately 75% as effective (2.19% vs 2.97%).

But look at what happens when we analyze October of 2019. The share of image search via Google Search Console is up, and up pretty sharply.

Image Search Share of Traffic October 2019

Now, 44.8% of the Google traffic to this site is from image search. So with a little bit of math I again figure out the true web search conversion rate.

Adjusted Conversion Rate October 2019

Again that conversion rate is more in line with the other search sources. (Though, note to self, investigate Bing conversion drop.)

Paid search conversion also dropped to 2.25% in October of 2019. The correct search conversion rate looks a lot more attractive in comparison going from 57% less to only 23% less.

Let me restate that.

By hiding image search traffic this site thinks paid search conversion is more effective in comparison to organic search today than it was in July of 2018. The reality is the opposite. In comparison to paid search, organic search conversion improved slightly.

Mix Shift Issues

Sir Mix-A-Lot

If we go back to that trend at the beginning of the prior section, the drop in conversion from July 2018 to October 2019 is no longer 38% but is approximately 21% instead. That’s still a material drop but it’s not 38%!

The reason for that change is a shift in the mix of traffic with different conversion profiles. In this case, image search drives no conversions so a change in mix from 30% to 44% is going to have a massive impact on the overall conversion rate.
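A small sketch makes the mix-shift effect concrete. Holding performance constant and changing only the mix (the rates here are illustrative, not the client’s actual numbers):

```python
def blended(web_rate, image_rate, image_share):
    """Overall conversion rate (%) for a given traffic mix."""
    return web_rate * (1 - image_share) + image_rate * image_share

# Same underlying performance, different mix.
july = blended(2.2, 0.0, 0.30)     # 30% image share
october = blended(2.2, 0.0, 0.44)  # 44% image share
print(july, october)               # 1.54 vs 1.232
print((july - october) / july)     # a ~20% apparent drop from mix alone
```

Nothing about how either traffic type converts changed here, yet the blended rate fell by roughly a fifth.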

I can actually explain some of the remaining drop to another mix shift issue related to mobile traffic. Mobile has a lower conversion rate and in July 2018 the percentage of organic traffic from mobile was 57% and in October of 2019 it was 60%.

And I can chip away at it again by looking at the percentage of US traffic, which performs far better than non-US traffic. In July 2018, US traffic comprised 53% of Google search traffic. In October 2019, US traffic comprised 48% of Google search traffic.

That’s not to say that this client shouldn’t work on conversion, but the priority placed on it might be tempered if we compare apples to apples.

And that’s what this is really about. Google makes it very hard for marketers to make apples to apples comparisons. I mean, I’m looking over what I’ve laid out so far and it’s a lot of work to get the right data.

Alternate Image Search Tracking

Walternate from Fringe

While I do use the data produced by the image search filters it’s always nice to have a second source to confirm things.

Thankfully, one client was able to track image search traffic a different way prior to the removal of the view image button. What did they find? The image search conversion rate was 0.24% while the web search conversion rate was 2.0%.

Yup. Image search performed 88% worse than web search.

This matters for this particular client. Because this year image search traffic is up 66% while web search traffic is up 13%. How do you think that translates into orders? They’re up 14%.

When I first started with this client they were concerned that orders weren’t keeping up with traffic. Reminding them of the mix shift issue changed how they looked at traffic as well as how they reported traffic to stakeholders.

Institutional knowledge about traffic idiosyncrasies is hard to maintain when the reports you look at every day tell you something different.

Bad Data = Bad Decisions

No Regerts Tattoo

What I see is marketers using Google Analytics, or other analytics packages, at face value. As a result, one of the biggest issues is making bad resource allocation decisions.

Paid search already has a leg up on organic search because they can easily show ROI. You spend X and you get back Y. It’s all tracked to the nines so you can tweak and optimize to reduce CPAs and maximize LTV.

Organic search? Sure we drive a ton of traffic. Probably a lot more than paid search. But it’s hard to predict growth based on additional resources. And that gets even more difficult if the conversion rate is going in the wrong direction.

So management might decide it’s time to work on conversion. (I swear I can hear many heads nodding ruefully in agreement.) Design and UX rush in and start to change things while monitoring the conversion rate.

But what are they monitoring exactly? The odds that image search traffic responds to changes the same way as web search traffic are extremely low. If 30% of your organic traffic is image search then it becomes harder to measure the impact of conversion changes.

Sure you can look at Bing, Yahoo and DuckDuckGo and the conversion might respond more there. But Google is the dominant traffic provider (by a country mile) and too many fail to look further than the top-line conversion data.

A/B Testing?

Villanelle Wants You To Be Quiet

Oh, and here’s a brainteaser for you. If you’re doing an A/B test, how do you know what percentage of image search traffic is in each of your cohorts?

Yeah, you don’t know.

Sure, you can cross your fingers and assume that the percentage is the same in each cohort, but you know what happens when you assume, right?

Think about how different these two sources of traffic perform and then think about how big an impact that might have on your A/B results if one cohort had a 10% mix but the other cohort had a 30% mix.

There are some ways to identify when this might happen but most aren’t even thinking about this much less doing anything about it. Many of those fact-based decisions are based on what amounts to a lie.
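To see how big that distortion can get, here’s a toy calculation with two cohorts that behave identically but carry a different image search mix (all numbers illustrative):

```python
# Both cohorts have the SAME underlying web and image conversion rates.
web_rate, image_rate = 2.0, 0.2   # conversion rates in %

# Only the image search mix differs between the cohorts.
a = web_rate * 0.90 + image_rate * 0.10   # 10% image mix
b = web_rate * 0.70 + image_rate * 0.30   # 30% image mix
print(round(a, 2), round(b, 2))  # 1.82 vs 1.46
```

With no treatment effect at all, cohort A appears to convert about 20% better than cohort B, purely because of the traffic mix.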

Revenue Optimization

This isn’t just about eCommerce sites either. If you’re an advertising based site you’re looking for page views, right?

Image Search Traffic Publishers View

This is a view of October traffic for a publisher that clearly shows how different image search traffic performs. Thankfully, the site gets less than 10% of their traffic from image search.

Image Search Share for Publisher

Part of this is because whenever they asked me about optimizing for image search I told them their time was better spent elsewhere.

Pinterest for Publishers

Far better to invest in getting more traffic from a source, like Pinterest, that better matches intent and therefore supports the advertising business.

Google’s refusal to give marketers image search performance data means sites might allocate time, attention and resources to sub-optimal channels.

Pinterest

Elephant with Pinterest Logo

The elephant in the room is Pinterest. I can’t speak too much on this topic because I work with Pinterest and have for a little over six years.

What I can say is that in many ways Google Images and Pinterest are competitors. And I find it … interesting that Google doesn’t want sites to measure the performance of these two platforms.

Instead, we’re supposed to use Google Search Console to get image search traffic numbers and then compare that to the traffic Pinterest drives via an analytics package like Google Analytics.

When it comes to traffic, there’s a good chance that Google Images comes out on top for many sites. But that’s not the right way to evaluate these two sources of traffic. How do those two sources of traffic perform? How do they each help the business?

Why Google? Why?

Rick Sanchez

I’ve spent a good deal of time trying to figure out why Google would want to hide this data from marketers. I try hard to adhere to Hanlon’s Razor.

“Never attribute to malice that which can be adequately explained by stupidity.”

But it’s hard for me to think Google is this stupid or incompetent. Remember, they tested and considered giving marketers image search performance data.

Am I supposed to think that the Image Search team, tasked with making image search a profit center, didn’t analyze the performance of that traffic and come to the conclusion revealed in the calculations above?

I’m open to other explanations. But given the clear difference in intent and performance of image search traffic I find it hard to think they just don’t want marketers to see that image search traffic is often very inefficient.

I could go further along in this line of thinking and go full conspiracy theory, positing that making organic search look inefficient means more resources and budget are allocated to paid search.

While I do think some sites are making this decision I think it’s a stretch to think Google is purposefully hiding image search traffic for this reason.

Is Image Search Useless?

Please Close Gate

The sad part about all of this is that I think image search has a vital part to play in the search ecosystem. I believe it most often represents top of funnel queries. Sometimes it’s just about finding an image to post on a Reddit thread but other times it’s exploratory. And either way I don’t mind the brand exposure.

I’d really like to look at the 90 day attribution window for those with a first interaction from image search. Do they come back through another channel later and convert? That might change the priority for image search optimization.

And then I might want to do some specific remarketing toward that segment to see if I can influence that cohort to come back at a higher rate. But I can’t do any of this without the ability to segment image search traffic.

Homework


If you’ve made it this far I’d really like you to do this math for your site. Here’s a crib sheet for how to perform this analysis.

Take a month of organic search data from Google Analytics.

Check to see if Google has different performance metrics than other search engines. That’s a strong clue the mix of traffic could be causing an issue.

Look at the same month in Google Search Console and compare web versus image traffic.

Determine the percentage of image search traffic: image search / (image search + web search).

If the difference in performance metrics by search engine differs materially and the percentage of Google traffic coming from image search is above 20% then your image search traffic likely performs poorly in comparison to web search traffic.

Do the math.

Here’s where it gets tricky. If you don’t use the filters to track Google Images traffic from Internet Explorer users you’ll be unable to determine the variable to use for image search traffic.

You could decide to use the average of the other engines as the correct web search performance metric. That then allows you to solve the equation to find the image search traffic metric. But that’s a bit deterministic.
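The crib sheet boils down to solving a weighted average: the blended Google metric is image share times the image metric plus web share times the web metric. Here’s a minimal sketch of that algebra in Python. Every number below is a hypothetical placeholder; substitute your own Google Analytics and Search Console figures, and note the web-only rate is estimated from other engines as described above.

```python
# Sketch of the crib-sheet math. All numbers are hypothetical placeholders;
# substitute your own Google Analytics / Search Console data.

image_clicks = 3000              # image search clicks (Search Console)
web_clicks = 7000                # web search clicks (Search Console)

blended_conversion_rate = 0.020  # blended Google organic rate (GA)
web_conversion_rate = 0.028      # assume other engines approximate web-only

# Percentage of Google traffic coming from image search.
image_share = image_clicks / (image_clicks + web_clicks)

# Blended rate is a weighted average of the two traffic types:
#   blended = share * image_rate + (1 - share) * web_rate
# Solve for the implied image search conversion rate.
image_conversion_rate = (
    blended_conversion_rate - (1 - image_share) * web_conversion_rate
) / image_share

print(f"Image search share: {image_share:.0%}")
print(f"Implied image search conversion rate: {image_conversion_rate:.2%}")
```

With these placeholder numbers, image search is 30% of traffic and the implied image conversion rate works out to roughly 0.13%, an order of magnitude below web search, which is exactly the kind of gap the blended number hides.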

Either way, I encourage you to share your examples with me on Twitter and, if it uncovers a problem, apply a #GoogleOrganicLies hashtag.

TL;DR

The decision to hide image search performance may cause sites to allocate resources incorrectly and even make bad decisions about product and design. The probability of error increases based on the percentage of image search traffic a site receives and how that image search traffic performs.

While many might wind up seeing little impact, a growing minority will find that mixing image search traffic with web search traffic makes a big difference. I encourage you to do the math and find out whether you’ve got a problem. (This feels oddly like a ‘get tested’ health message.)

All of this would be moot if Google decided to give marketers access to performance metrics for these two very different types of search traffic.

Roundup Posts

February 26 2015 // Marketing + Rant // 37 Comments

I’m increasingly conflicted about roundup posts. You know, the kind where 23 experts answer one burning question and their answers are all put together in one long blog post. Instant content! I don’t produce roundup posts, rarely read them and infrequently contribute to them.

Roundup Dynamics

Silence of the Lambs Quid Pro Quo

The dynamics of a roundup post are pretty clear. The person aggregating the answers gets what is essentially free content for their site. Yes, I know you had to email people and potentially format the responses but the level of effort isn’t particularly high.

In exchange, the person providing the answers gets more exposure and gains some authority by being labeled an expert. Even better if your name is associated with other luminaries in the field. It’s an interesting and insidious form of social proof.

Flattery Will Get You Everywhere

Leo DiCaprio You're Awesome

It feels good to be asked to participate in roundup posts. At least at first. You’ve been selected as an expert. Talk about an ego boost!

The beauty of it is that there will always be people who want that recognition. So even if some tire of participating there is a deep reservoir of ego out there ready to be tapped. No matter what I think or write I’m certain we’ll continue to see roundup posts.

I still prefer individual opinion and thought pieces. I like when people step out on the ledge and take a stand one way or the other. Even if I disagree with you, I recognize the effort invested and bravery displayed.

Saturation Marketing Works

Times Square Advertising

I’m a marketer with an advertising background. I know saturation marketing works. So participating in roundup posts seems like a smart strategy. People see your name frequently and you’re always being portrayed in a positive light.

No matter where people turn they’re running into your name and face and you’re being hailed as an expert. Whoo-hoo! What’s wrong with that?

What’s The Frequency Kenneth?

How good is the content in these roundup posts? How much effort are these experts expending? I’m sure some spend a good deal of time on their contribution, if for no other reason than the desire to have the most insightful, provocative or humorous entry. I can’t be alone in thinking this way.

But at some point, as the number of requests rises (and they will since success begets success), you may realize that it’s just about the contribution. Showing up is 90% of the game. It’s not that the responses are bad, but they’re more like off-the-cuff answers than well thought out responses.

Remember Sammy Jankis

Memento Tattoo

Of course, I’m always thinking about how these contributions are being remembered. In a large roundup post is my name and contribution going to be remembered? I somehow doubt it. At least not the specifics.

So the only thing I really gain is installing (yes I do think of the brain like software) the idea of expertise and authority in a larger group of people. Because if you see my name enough times you’ll make those connections.

That’s powerful. No doubt about it.

Why So Serious?

Heath Ledger Joker

I ask myself why I bristle at roundup posts. Why am I increasingly reticent to contribute given my understanding of the marketing value? Am I somehow sabotaging my own success?

All too often I feel like roundup posts don’t deliver enough value to users. The content is uneven and often repetitive from expert to expert, exacerbating scanning behavior. It’s content that makes me go ‘meh’.

I might be dead wrong and could be committing the cardinal sin of marketing by relying on myself as the target market. Yet I don’t think I’m alone. I’ve spoken to others who skip these posts or, worse, have a dim view of those contributing.

Bud Light or Ruination IPA

Beer vs Beer

The top selling beer in the US last year was Bud Light. For many, achieving Bud Light status is the pinnacle of success. The thing is … I don’t want to be Bud Light. Or more to the point, I don’t provide services that match the Bud Light audience.

Let’s see if I can express this next part without sounding like a douchebag.

I don’t run a large agency. I’m not in the volume business. Many of my clients are dubious of the public discourse taking place on digital marketing. They rely on their professional networks to connect them to someone who can make sense of it all and sort fact from fiction. Because, and here’s the hard truth, they don’t really believe all those people are experts.

My clients are those who crave a deliciously bitter Ruination IPA. And the way to find and appeal to those people is different. Budweiser spent gobs on Super Bowl advertising. Stone Brewing? Not so much.

So, I’m left thinking about the true meaning of authority and expertise. It’s subjective. Obviously a lot of people dig Bud Light. That’s cool. But that’s not my audience. I’m seeking authority from a different audience.

Roundup Posts

Roundup Posts

I’ll still participate in roundup posts from time to time, though I may have just shot myself in the foot with this piece. I’m inclined to contribute to posts that cover a topic I might not normally write about or to a site that has a different audience.

My goal is to ensure I maintain some visibility, without going overboard, while securing authority with new audiences that match my business goals. Your business goals might be different, so contributing to lots and lots of roundup posts might be right up your alley.

TL;DR

There’s nothing inherently wrong with roundup posts as a part of your content marketing strategy. But you should understand whether this tactic reaches your target market and aligns with your business goals.

Google Removes Related Searches

April 19 2013 // Rant + SEO // 45 Comments

This morning I went to use one of my go-to techniques for keyword research and found it was … missing.

Related Searches Gone

Related Searches Option Gone

It was bad enough that the new Search tools interface was this awkward double-click menu but I understood that decision. Because most mainstream users don’t ever refine their results.

But to remove related searches from that menu altogether? In less than a year related searches went from being a search tip to being shuffled off to Buffalo?

WTF!

Out of Insight

Clooney is Pissed

Google needs to understand that there are SEOs, or digital marketing professionals if that makes it easier, who are helping to make search results better. We’re helping sites understand the syntax and intent of their users and creating relevant and valuable experiences to match and satisfy those queries.

I wasn’t happy but wasn’t that upset when Google introduced (not provided). But as the amount of (not provided) traffic increases I see no reason why Google shouldn’t implement my (not provided) drill down suggestion. Seriously, get on that.

But then Google merged Google Trends with Google Insights for Search and in the process removed its most useful feature. That’s right, the breakdown showing what percentage of traffic was attributed to each category, which let SEOs better understand the intent of a query.

Now Google’s taking away the interface for related searches? Yeah, you’ve gone too far now. Hulk mad.

Stop Ignoring Influencers

You Wouldn't Like Me When I'm Angry

Just like the decision to terminate Google Reader, Google doesn’t seem to understand that they need to address influencers. And believe it or not Google, SEOs are influencers. We’re demystifying search so that sites don’t fall for get-rank-quick schemes. And you need us to do that because you’re dreadful at SEO. Sites aren’t finding much of your educational content. They’re not. Really.

In the last year Google’s made it more and more difficult for SEOs to do good work. And you know who ultimately suffers? Google. Because the content coming out won’t match the right syntax and intent. It’ll get tougher for Google, over time, to find the ‘right’ content and users will feel the slow decline in search quality. You know, garbage in, garbage out.

Any good marketer understands that they have to serve more than one customer segment. Don’t like to think of SEOs as influencers? Fine. Call us power users and put us back on your radar and stop removing value from the search ecosystem.

No Such Thing As A Good Scraper

March 14 2012 // Rant + SEO // 24 Comments

I have 155 pending comments right now. The overwhelming majority of them are pingbacks from benign scrapers. Some may see this as a boon but I view these scrapers as arterial plaque that could ultimately give the Internet a heart attack.

Here’s my personal diagnosis.

The Illness

My definition of a benign scraper is a site that scrapes content but provides attribution. I’ve gotten a ton of these recently because of links I received in high profile sites within the search community. Those sites are the target of these scrapers so my link gets carried along as part of the deal.

Benign Scraper Pingbacks

The attitude by most is that the practice won’t damage the scraped site and may actually provide a benefit through the additional links. Heck, Jon Cooper at Point Blank SEO even came up with a clever way to track the scrape rate of a site as a way to determine which sites might be the best candidates for guest posts.

Signs and Symptoms

But what do these scraper sites look like? Some of these scrapers might have original content mixed in with the scraped content but in reviewing my pingbacks this seems like the exception and not the rule. Most of these benign scrapers are just pulling in content from a number of feeds and stuffing it onto the page hoping that users show up and click on ads and that the content owners don’t take exception.

Benign Scraper Attribution Example

“Hey, I gave you a link, so we’re cool, right bro?”

No bro, we’re not cool.

This stuff is garbage. It’s content pollution. It is the arterial plaque of the Internet.

The Doctor

Google is trying to keep up and often removes this dreck from the index.

Benign Scraper Deindexed

But for every one that Google removes there’s another that persists.

Benign Scraper Indexed

How long until the build up of this arterial plaque gives the Internet a heart attack? One day we’ll wake up and the garbage will be piled high like a horrifying episode of Hoarders.

Support Groups?

The industry attitude toward these scrapers is essentially a tacit endorsement. It brings to mind the quote attributed to Edmund Burke.

All that is necessary for the triumph of evil is that good men do nothing.

We turn a blind eye and whistle past the graveyard happily trusting that Google will sort it all out. They’ll make sure that the original content is returned instead of the scraped content. That’s a lot of faith to put in Google, particularly as they struggle to keep up with the increasing pace of digital content.

Are we really this desperate for links?

Desperate for Links Example

Yet, we whine about how SEO is viewed by those outside of the industry. And we’ll whine again when Google gets a search result wrong and shows a scraper above the original content. Indignant blog posts will be written.

Treatment

Even if we wanted to, we have few tools at our disposal to tell Google about these sites. The tools we do have are onerous and inefficient.

It doesn’t have to be that way.

Why not build a Chrome extension that lets me flag and report scraper sites? Or a WordPress Plugin that lets me mark and report a site as a scraper directly within the comment interface. Or how about a section in Google Webmaster Tools where I can review links?

Sure, there are reporting issues and biases but those are solvable problems. Thing is, many doctors have a God complex. Google may not think we’re able to contribute to the diagnosis. That would be a mistake.

Cure?

Disaster Girl Dares You To Ignore Scrapers

Maybe we don’t want to be cured. Perhaps we’re all willing to let this junk persist, willing to smile as your mom finds one of these sites when she’s looking for that article you wrote. Willing to believe that your brand is totally safe when it appears on these sites. But the rest of the world isn’t nearly as savvy as you think.

I know many of these links work, but they shouldn’t. The fact that they do worries me. Because, over time, people might not be able to tell the difference and that’s not the Internet I want.

Today these scrapers are benign but tomorrow they could turn malignant.

Delicious Turns Sour

December 19 2011 // Rant + Technology + Web Design // 8 Comments

In April, the Internet breathed a sigh of relief when Delicious was sold to AVOS instead of being shut down by Yahoo. In spite of Yahoo’s years of neglect, Delicious maintained a powerful place in the Internet ecosystem and remained a popular service.

Users were eager to see Delicious improve under new management. Unfortunately the direction and actions taken by Delicious over the last 8 months make me pine for the days when it was the toy thrown in the corner by Yahoo!

Where Did Delicious Go Wrong?

Delicious Dilapidated Icon

I know new management means well and have likely poured a lot of time and effort into this enterprise. But I see problems in strategy, tactics and execution that have completely undermined user trust and loyalty.

Bookmarklets

The one mission-critical feature that fuels the entire enterprise has fallen into disrepair. Seriously? This is unacceptable. The bookmarklets that allow users to bookmark and tag links were broken for long stretches of time and continue to be rickety and unreliable. This lack of support is akin to disrespect of Delicious users.

Stacks

Here’s how they work. Select some related links, plug them into a stack and watch the magic happen. You can customize your stack by choosing images to feature, and by adding a title, description and comment for each link. Then publish the stack to share it with the world. If you come across another stack you like, follow it to easily find it again and catch any updates.

Instead of the nearly frictionless interaction we’ve grown accustomed to, we’re now asked to perform additional and duplicative work. I’ve already created ‘stacks’ by bookmarking links with appropriate tags. Want to see a stack of links about SEO? Look at my bookmarks tagged SEO. It doesn’t get much simpler than that.

Not only have they introduced complexity into a simple process, they’ve perverted the reason for bookmarking links. The beauty of Delicious was that you were ‘curating’ without trying. You simply saved links by tags and then one day you figured out that you had a deep reservoir of knowledge on a number of topics.

Stacks does the opposite and invites you to think about curation. I’d argue this creates substantial bias, invites spam and is more aligned with the dreck produced by Squidoo.

Here’s another sign that you’ve introduced unneeded complexity into a product.

Delicious Describes Stacks

In just one sentence they reference stacks, links, playlists and topics. They haven’t even mentioned tags! Am I creating stacks or playlists? If I’m a complete novice do I understand what ‘stack links’ even means?

Even if I do understand this, why do I want to do extra work that Delicious should be doing for me?

Design

Design over Substance

The visual makeover doesn’t add anything to the platform. Do pretty pictures and flashy interactions really help me discover content? Were Delicious users saying they would use the service more if only it looked prettier? I can’t believe that’s true. Delicious had the same UI for years and yet continued to be a popular service.

Delicious is a utilitarian product. It’s about saving, retrieving and finding information.

Sure, Flipboard is really cool but just because a current design pattern is in vogue doesn’t mean it should be applied to every site.

UX

There are a number of UX issues that bother me but I’ll highlight the three that have produced the most ire. The drop down is poorly aligned causing unnecessary frustration.

Delicious Dropdown Alignment

More than a few times I’ve gone to click on one of the drop down links only to have it disappear before I could finish the interaction.

The iconography is non-intuitive and doesn’t even have appropriate hover text to describe the action.

Delicious Gray Icons

Delicious Icons are Confusing

Does the + sign mean bookmark that link? What’s the arrow? Is that a pencil?

Now, I actually get the iconography. But that’s the problem! I’m an Internet-savvy user, yet the new design seems targeted at a more mainstream user. Imagine if Pinterest didn’t have the word ‘repin’ next to its double thumbtack icon.

Finally, the current bookmarklet supports the tag complete function. You begin typing in a tag and you can simply select from a list of prior tags. This is a great timesaver. It even creates a handy space at the end so you can start your next tag. Or does it?

Delicious Tag Problems

WTF!? Why is my tag all muddled together?

Delicious improved tagging by allowing spaces in tags. That means all tags have to be separated by commas. I get that. It’s not the worst idea either. But the tag complete feature should support this new structure. Instead it looks like it functions correctly, inserting a space after the tag but no comma. Am I supposed to use the tag complete feature and then backspace and add a comma?

It’s not the best idea to make your users feel stupid.

Uptime

Delicious Unavailable Page

The service has been unstable lately, as bad as Twitter was at the height of its fail whale problem. I’ve seen that empty loft way too much.

What Should Delicious Do Instead?

It’s easy to bitch but what could Delicious have done instead? Here’s what I think they should have (and still could) do.

Filtering

An easy first step to improve Delicious would be to provide a better way to filter bookmarks. The only real way to do so right now is by adding additional tags. It would have been easy to introduce time (date) and popularity (number of times bookmarked) facets.

They could have gone an extra step and offered the ability to group bookmarks by source. This would let me see the number of bookmarks I have by site by tag. How many times have I bookmarked a Search Engine Land article about SEO? Not only would this be interesting, it maps to how we think and remember. You’ll hear people say something like: “It was that piece on management I read on Harvard Business Review.”
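A sketch of what that source grouping might look like, assuming bookmarks are simple records of a URL plus a set of tags. The bookmark data and function name here are invented purely for illustration.

```python
# Hypothetical sketch: facet bookmarks by source domain within a tag.
from collections import Counter
from urllib.parse import urlparse

bookmarks = [
    {"url": "http://searchengineland.com/a", "tags": {"seo"}},
    {"url": "http://searchengineland.com/b", "tags": {"seo"}},
    {"url": "http://example.com/c", "tags": {"seo", "design"}},
]

def bookmarks_by_source(bookmarks, tag):
    """Count bookmarks carrying `tag`, grouped by the site they came from."""
    return Counter(
        urlparse(b["url"]).netloc
        for b in bookmarks
        if tag in b["tags"]
    )

print(bookmarks_by_source(bookmarks, "seo"))
# Counter({'searchengineland.com': 2, 'example.com': 1})
```

That answers the “how many Search Engine Land articles have I tagged SEO?” question directly, which is the shape of memory the feature would serve.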

There are a tremendous number of ways that the new team could have simply enhanced the current functionality to deliver added value to users.

Recommendations

Recommendation LOLcat

Delicious could create recommendations based on current bookmark behavior and tag interest. The data is there. It just needs to be unlocked.

It would be relatively straightforward to create a ‘people who bookmarked this also bookmarked’ feature. Even better if it only displayed those I haven’t already bookmarked. That’s content discovery.
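That co-bookmark feature could be as simple as counting co-occurrences across users’ saves. A minimal sketch, assuming a plain mapping of user to saved URLs (all names and data invented):

```python
# Minimal "people who bookmarked this also bookmarked" sketch.
from collections import Counter

saves = {
    "alice": {"url1", "url2", "url3"},
    "bob":   {"url1", "url3"},
    "carol": {"url2", "url4"},
}

def also_bookmarked(saves, url, exclude=frozenset()):
    """Rank URLs co-saved with `url`, skipping ones the viewer already has."""
    counts = Counter()
    for urls in saves.values():
        if url in urls:
            counts.update(urls - {url} - set(exclude))
    return [u for u, _ in counts.most_common()]

print(also_bookmarked(saves, "url1"))            # ['url3', 'url2']
print(also_bookmarked(saves, "url1", {"url3"}))  # ['url2']
```

The `exclude` set is the “only show me what I haven’t already bookmarked” piece, which is where the discovery value lives.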

This could be extended to natural browse by tag behavior. A list of popular bookmarks with that tag but not in my bookmarks would be pretty handy.

Delicious could also alert you when it saw a new bookmark from a popular tag within your bookmarks. This would give me a quick way to see what was ‘hot’ for topics I cared about.

Recommendations would put Delicious in competition with services like Summify, KnowAboutIt, XYDO and Percolate. It’s a crowded space but Delicious is sitting on a huge advantage with the massive amount of data at their disposal.

Automated Stacks

Instead of introducing unnecessary friction Delicious could create stacks algorithmically using tags. This could be personal (your own curated topics) or across the entire platform. Again, why Delicious is asking me to do something that they can and should do is a mystery to me.

Also, the argument that people could select from multiple tags to create more robust stacks doesn’t hold much water. Delicious knows which tags appear together most often and on what bookmarks. Automated stacks could pull from multiple tags.

The algorithm that creates these stacks would also constantly evolve. They would be dynamic and not prone to decay. New bookmarks would be added and bookmarks that weren’t useful (based on age, lack of clicks or additional bookmarks) would be dropped.

Delicious already solved the difficult human element of curation. It just never applied appropriate algorithms to harness that incredible asset.

Social Graph Data

Delicious could help order bookmarks and augment recommendations by adding social graph data. The easiest thing to do would be to determine the number of Likes, Tweets and +1s each bookmark received. This might simply mirror bookmark popularity though. So you would next look at who saved the bookmarks and map their social profiles to determine authority and influence. Now you could order bookmarks that were saved by thought leaders in any vertical.

A step further, Delicious could look at the comments on a bookmarked piece of content. This could be used as a signal in itself based on the number of comments, could be mined to determine sentiment or could provide another vector for social data.

Trunk.ly was closing in on this since they already aggregated links via social profiles. Give them your Twitter account and they collect and save what you Tweet. This frictionless mechanism had some drawbacks but it showed a lot of promise. Unfortunately Trunk.ly was recently purchased by Delicious. Maybe some of the promise will show up on Delicious but the philosophy behind stacks seems to be in direct conflict with how Trunk.ly functioned.

Analytics

Delicious could have provided analytics to individuals as to the number of times their bookmarks were viewed, clicked or re-bookmarked. The latter two metrics could also be used to construct an internal influence metric. If I bookmark something because I saw your bookmark, that’s essentially on par with a retweet.

For businesses, Delicious could aggregate all the bookmarks for that domain (or domains), providing statistics on the most bookmarked pieces as well as when they are viewed and clicked. A notification service when your content is bookmarked would also be low-hanging fruit.

Search

Delicious already has search and many use it extensively to find hidden gems from both the past and present. But search could be made far better. In the end Delicious could have made a play for being the largest and best curated search engine. I might be biased because of my interest in search but this just seems like a no-brainer.

Revenue

Building a PPC platform seems like a good fit if you decide to make search a primary feature of the site. It could even work (to a lesser extent) if you don’t feature search. Advertisers could pay per keyword search or tag search. I doubt this would disrupt user behavior since users are used to this design pattern thanks to Google.

Delicious could even implement something similar to StumbleUpon, allowing advertisers to buy ‘bookmark recommendations’. This type of targeted exposure would be highly valuable (to users and advertisers) and the number of bookmarks could provide long-term traffic and benefits. Success might be measured in a new bookmarks per impression metric.

TL;DR

The new Delicious is a step backward, abandoning simplicity and neglecting mechanisms that build replenishing value. Instead management has introduced complexity and friction while concentrating on cosmetics. The end result is far worse than the neglect Delicious suffered at the hands of Yahoo.

The Knuckleball Problem

December 08 2011 // Marketing + Rant + Web Design // 4 Comments

The knuckleball is a very effective pitch if you can throw it well. But not many do. Why am I talking about arcane baseball pitches? Because the Internet has a knuckleball problem.

Knuckleball

Image from The Complete Pitcher

The Knuckleball Problem

I define the knuckleball problem as something that can be highly effective but is also extremely difficult. The problem arises when people forget about the latter (difficulty) and focus solely on the former (potential positive outcome).

Individuals, teams and organizations embark on a knuckleball project with naive enthusiasm. They’re then baffled when it isn’t a rousing success. In baseball terms that means instead of freezing the hitter, chalking up strikeouts and producing wins you’re tossing the ball in the dirt, issuing walks and running up your ERA.

If a pitcher can’t throw the knuckleball effectively, they don’t throw the knuckleball. But in business, the refrain I hear is ‘X isn’t the problem, it’s how X was implemented‘.

This might be true, but the hidden meaning behind this turn of phrase is the idea that you should always attempt to throw a knuckleball. In reality you should probably figure out what two or three pitches you can throw to achieve success.

Difficulty and Success

The vast majority of pitchers do not throw the knuckleball because it’s tough to throw and produces a very low success rate. Most people ‘implement’ or ‘execute’ the pitch incorrectly. Instead pitchers find a mix of pitches that are less difficult and work to perfect them.

Yet online, a tremendous number of people try to throw knuckleballs. They’re trying something with a high level of difficulty instead of finding less difficult (perhaps less sexy or trendy) solutions. And there is a phalanx of consultants and bloggers who seem to encourage and cheer this self-destructive behavior.

Knuckleballs

In general I think mega menus suck. Of course there are exceptions but they are few and far between. The mega menu is a knuckleball. Sure you can attempt it, but the odds are you’re going to screw it up. And there are plenty of other ways you can implement navigation that will be as or even more successful.

When something has such a high level of difficulty you can’t just point to implementation and execution as the problem. When a UX pattern is widely misapplied is it really that good of a UX pattern?

Personas also seem to be all the rage right now. Done the right way personas can sometimes deliver insight and guidance to a marketing team. But all too often the personas are not rooted in real customer experiences and devolve into stereotypes that are then used as weapons in cross-functional meetings. “I’m sorry, but I just don’t think this feature speaks to Concerned Carl.”

Of course implementation and execution matter. But when you consistently see people implementing and executing something incorrectly you have to wonder whether you should be recommending it in the first place.

Pitching coaches aren’t pushing the knuckleball on their pitching staffs.

Can You Throw a Knuckleball?

Cat Eats Toy Baseball Players

The problem is most people think they can throw the online equivalent of the knuckleball. And unlike the baseball diamond the feedback mechanism online is far from direct.

Personas are created and used to inform your marketing strategy. There’s some initial enthusiasm and a few minor changes, but over time people tire of hearing about these fictional customers and the whole thing peters out, along with the high consulting fees, which are also conveniently forgotten.

The hard truth is most people can’t throw the knuckleball. And that’s okay. You can still be a Cy Young Award winner. Tim Lincecum does not throw a knuckleball.

How (and When) To Throw The Knuckleball

This doesn’t mean you shouldn’t be taking risks or attempt to throw a knuckleball once in a while. Not at all.

However, you shouldn’t attempt the knuckler simply because it is difficult or ‘more elegant’ or the hottest new fad. You can take plenty of risks throwing the slider or curve or change up, all pitches which have a higher chance of success. In business terms the risk to reward ratio is far more attractive.

If you’re going to start a knuckleball project you need to be clear about whether you have a team that can pull it off. Do you really have a team of A players or do you have a few utility guys on the team?

Once you clear that bit of soul searching you need to be honest about measuring success. A certain amount of intellectual honesty is necessary so that you can turn to the team and say, you tossed that one in the dirt. Finally, you need a manager who’s willing to walk to the mound and tell the pitcher to stop futzing with the knuckleball and start throwing some heat.

TL;DR

The Internet has a knuckleball problem. Too many are attempting the difficult without understanding the high probability of failure while ignoring the less difficult that could lead to success.

Not Provided Keyword Not A Problem

November 21 2011 // Analytics + Rant + SEO // 16 Comments

Do I think Google’s policy around encrypting searches (except for paid clicks) for logged-in users is fair? No.

Fair Is Where You Get Cotton Candy

But whining about it seems unproductive, particularly since the impact of (not provided) isn’t catastrophic. That’s right, the sky is not falling. Here’s why.

(Not Provided) Keyword

By now I’m sure you’ve seen the Google Analytics line graph that shows the rise of (not provided) traffic.

Not Provided Keyword Google Analytics Graph

Sure enough, 17% of all organic Google traffic on this blog is now (not provided). That’s high in comparison to what I see among my client base but makes sense given the audience of this blog.

Like many others, I find (not provided) is also my top keyword by a wide margin. I think seeing this scares people but it makes perfect sense. What other keyword is going to show up under every URL?

Instead of staring at that big aggregate number you have to look at the impact (not provided) is having on a URL by URL basis.

Landing Page by Keywords

To look at the impact of (not provided) for a specific URL you need to view your Google organic traffic by Landing Page. Then drill down on a specific URL and use Keyword as your secondary dimension. Here’s a sample landing page by keywords report for my bounce rate vs exit rate post.

Landing Page by Keyword Report with Not Provided

In this example, a full 39% of the traffic is (not provided). But a look at the remaining 61% makes it pretty clear what keywords bring traffic to this page. In fact, there are 68 total keywords in this time frame.

Keyword Clustering Example

Clustering these long-tail keywords can provide you with the added insight necessary to be confident in your optimization strategy.
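As a rough sketch of what that clustering might look like (the keywords, visit counts and grouping terms below are hypothetical, not the actual report data), you can group long-tail queries by the core phrase they contain:

```python
from collections import defaultdict

# Hypothetical rows from a Landing Page by Keyword report: (keyword, visits)
keywords = [
    ("bounce rate vs exit rate", 24),
    ("bounce rate versus exit rate", 9),
    ("exit rate definition", 7),
    ("what is bounce rate", 5),
]

def cluster(rows, stems):
    """Assign each keyword's visits to the first core phrase it contains."""
    clusters = defaultdict(int)
    for keyword, visits in rows:
        for stem in stems:
            if stem in keyword:
                clusters[stem] += visits
                break
        else:
            # Keywords matching no core phrase fall into a catch-all bucket
            clusters["other"] += visits
    return dict(clusters)

print(cluster(keywords, ["bounce rate", "exit rate"]))
# {'bounce rate': 38, 'exit rate': 7}
```

The grouping rule here is deliberately crude (first substring match wins); the point is simply to collapse dozens of long-tail variants into a handful of clusters you can reason about.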

(Not Provided) Keyword Distribution

The distribution of keywords outside of (not provided) gives us insight into the keyword composition of (not provided). In other words, the keywords we do see tell us about the keywords we don’t.

Do we really think that the keywords that make up (not provided) are going to be that different from the ones we do see? It’s highly improbable that a query like ‘moonraker steel teeth’ is driving traffic under (not provided) in my example above.

If you want to take things a step further you can apply the distribution of the clustered keywords against the pool of (not provided) traffic. First you reduce the denominator by subtracting the (not provided) traffic from the total. In this instance that’s 208 – 88 which is 120.

Even without any clustering you can take the first keyword (bounce rate vs. exit rate) and determine that it comprises 20% of the remaining traffic (24/120). Applying that 20% to the (not provided) traffic (88), you can conclude that approximately 18 of those (not provided) visits come from that specific keyword.

Is this perfectly accurate? No. Is it good enough? Yes. Keyword clustering will further reduce the variance you might see by specific keyword.
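The arithmetic above can be sketched in a few lines, using the numbers from this post's example:

```python
total_visits = 208   # total Google organic visits to the URL
not_provided = 88    # visits reported as (not provided)

# Reduce the denominator to only the visits with a visible keyword
visible = total_visits - not_provided  # 120

# Visits for one visible keyword: "bounce rate vs. exit rate"
keyword_visits = 24

share = keyword_visits / visible        # 0.20, i.e. 20% of visible traffic
estimate = round(share * not_provided)  # apply that share to the hidden pool

print(f"Estimated (not provided) visits for this keyword: {estimate}")
# Estimated (not provided) visits for this keyword: 18
```

The same calculation works on clustered keywords, which smooths out the keyword-by-keyword variance mentioned above.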

Performance of (Not Provided) Keywords

The assumption I'm making here is that the keyword behavior of those logged in to Google doesn't differ dramatically from those who are not. There may be some difference, but I don't see it being large enough to be material.

If you have an established URL with a history of getting a steady stream of traffic you can go back and compare the performance before and after (not provided) was introduced. I’ve done this a number of times (across client installations) and continue to find little to no difference when using the distribution method above.

Even without this analysis, it comes down to whether you believe that query intent changes based on whether a person is logged in or not. Given that many users probably don't even know they're logged in, I'll take no for 800, Alex.

What’s even more interesting is that this is information we didn’t have previously. If by chance all of your conversions only happen from those logged in, how would you have made that determination prior to (not provided) being introduced? Yeah … you couldn’t.

While Google has made the keyword private they’ve actually broadcast usage information.

(Not Provided) Solutions

Keep Calm and SEO On

Don’t get me wrong. I’m not happy about the missing data, nor the double standard between paid and organic clicks. Google has a decent privacy model through their Ads Preferences Manager. They could adopt the same process here and allow users to opt-out instead of the blanket opt-in currently in place.

Barring that, I’d like to know how many keywords are included in the (not provided) traffic in a given time period. Even better would be a drill-down feature with traffic against a set of anonymized keywords.

Google Analytics Not Provided Keyword Drill Down

However, I’m not counting on these things coming to fruition so it’s my job to figure out how to do keyword research and optimization given the new normal. As I’ve shown, you can continue to use Google Analytics, particularly if you cluster keywords appropriately.

Of course you should be using other tools to determine user syntax, identify keyword modifiers and define query intent. When keyword performance is truly in doubt you can even resort to running a quick AdWords campaign. While this might irk you and elicit tin foil hat theories you should probably be doing a bit of this anyway.

TL;DR

Google’s (not provided) policy might not be fair but is far from the end of the world. Whining about (not provided) isn’t going to change anything. Figuring out how to overcome this obstacle is your job and how you’ll distance yourself from the competition.

Worst SEO Title Ever

September 20 2011 // Rant + SEO // 16 Comments

Do as I say, not as I do. That seems to be Google’s philosophy when it comes to blog optimization.

Worst SEO Title Ever

Worst SEO Title Ever

What has finally pushed me over the edge into rant mode? It’s today’s Google+ announcement.

Bad Google+ Blog Post Title

A bunch of numbers for your title. Really? Instead maybe you’d, you know, want to mention the introduction of search or that Google+ was now open to everyone. Those are actually really interesting and noteworthy items.

This isn’t a John Barth novel. The meta information around the number of improvements isn’t really relevant. Really, it’s not.

What query intent are you trying to match here? And yes, that matters.

Snippet Optimization

Google also continues to fail on snippet optimization. Yes, we know that the meta description isn’t a ranking factor. But the description is more important today since it’s used in the transmission of information to other platforms. So what does the snippet for this post look like?

Bad Google+ Snippet

At a glance can you tell what this is about? I certainly can’t. The default image here is useless, the title is nonsense and the description simply tells me that it’s available in other languages. Google can count to 100, seemingly in different languages. Congratulations.
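For contrast, a snippet-friendly head for that announcement might look something like this. To be clear, the titles, descriptions and image URL below are invented for illustration; this is not Google's actual markup:

```html
<head>
  <!-- A descriptive title that leads with the newsworthy items -->
  <title>Google+: Now Open to Everyone, With Search</title>

  <!-- Not a ranking factor, but often used as the visible snippet -->
  <meta name="description"
        content="Google+ exits field trial, opens to all users and adds search across posts and profiles.">

  <!-- Open Graph tags control how the post renders when shared on other platforms -->
  <meta property="og:title" content="Google+: Now Open to Everyone, With Search">
  <meta property="og:description"
        content="Google+ exits field trial, opens to all users and adds search.">
  <meta property="og:image" content="https://example.com/google-plus-search.png">
</head>
```

None of this is exotic; it's the minimum of optimization the rest of this post is asking for.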

Best Practices and Role Models

Does everyone have to follow best practices? No. All of this is optional. But Google is in a position where they should be setting an example. Google might want to take the Charles Barkley approach, but like it or not, you are a role model.

Or perhaps this is a deliberate thumb in the eye to the SEO community? We know that Google is willing to change titles when they think they’re not quite right. So maybe they just don’t think any of this is necessary? But I doubt that’s the case. Remember the adage about malice.

So please Google, take the time to perform the minimum of optimization on your vast collection of blogs (or give me and my team a call and we'll get you square). It's good for you and it's good for the search community.

[Update] Well, it looks like the Google Mobile Blog wants to fight for the Worst SEO Title Ever crown with their own numbers post.

 

PageRank Ponzi

September 09 2011 // Rant + SEO // 12 Comments

Why are you still submitting your site and articles to directories? Sure, there was a time when directories were valuable. But that time has passed. So stop feeding their business and build your own instead.

Totally Flabbergasted LOLcat

PageRank Ponzi

Directories are essentially a form of PageRank Ponzi. They use your content to build their business – to build their trust and authority – and, in exchange, lease a small fraction of that trust and authority (e.g. PageRank) back to you.

You either give away or actually pay to provide them with content. They gladly take your assets and use them to do what you should be doing. Even if you get a small benefit from this exchange, you're getting the short end of the stick.

Directory Heyday

There was a time when directories were useful and valuable. From the mid-to-late 90s to around 2003, directories were used by many to find sites and content. This was before tabbed browsing and broadband connections made it easy to get from one site to another. This was before search became the dominant way to navigate the web. This was before social platforms allowed you to tap your social graph and crowdsource information.

One only needs to look at the search volume for the term ‘web directory’ to see that this is an outdated method of online discovery.

Search Trend for Web Directory Searches

Distribution

In the directory heyday it may have been difficult to get your site, article or blog post distributed. The web was not nearly as connected or fluid.

But today we have blogging platforms, a robust social graph and numerous social media outlets that give you an opportunity to capitalize on your own intellectual property instead of giving it away to others for peanuts.

We Are The Directory

Whether you call it curation or crowdsourcing there are other repositories that mimic and exceed the traditional directory. You might search Delicious. In fact, more people should. Or you might try out Trunk.ly.

We’re doing the work of directories every day.

Caffeine

In June of 2010, Google launched Caffeine and increased their ability to crawl and index the web. This was one of the last pieces of the puzzle in making directories obsolete.

Previously, directories might have been able to quickly surface new sites or content that hadn’t yet been found by Google. But that’s just not the case today. Google finds new content even in the dark and dusty corners of the Internet where Geocities pages lurk and survive.

Google Directory

So what does Google think about directories today?

Google Directory No Longer Available Message

Google shut down their directory. Read that again and think about what it means for the future and value of directories. And don’t get me started on the utter collapse of DMOZ. (No, I’m not even going to link there.)

As an aside, Google may want to consider a folder level URL removal so directory results (which return a 404) don’t clutter up SERPs.

Directory Spam

Most web directories are hastily thrown together arbitrage sites that serve as outposts for spam. Here's an excerpt from an email sent to me by an 'SEO Consultant'.

Directory Spam

This is not SEO, at least not the SEO I practice. Some may reject this carpet-bombing approach but subscribe to the idea that a handful of paid directories are worthwhile.

I say save your money.

Paid Link or Paid Listing?

Jack McCoy from Law & Order

Frankly, I’m still a bit irked that Google doesn’t view a paid listing as a paid link. The argument for paid directories is that they provide a certain level of curation that makes them valuable. You’re paying for someone to curate that directory – not for the link. This seems a very thin argument at best, and a bunch of claptrap at worst. Most, if not all, directories are pretty much a free-for-all as long as what you’re submitting isn’t complete spam or off topic. The level of curation is marginal, and I’m being nice.

Not only that, but it comes down to intent. For some reason I hear Jack McCoy yelling ‘intent follows the bullet’. It’s not a perfect analogy, but the general idea is that intent matters. Today, the intent for a directory listing is, quite simply, to secure a back link. So, what exactly is the difference between a paid link and a paid listing? There is none as far as I can tell.

Link Value

REM Out of Time Cover Art

How valuable is that directory link anyway? I'm telling you that the value of these links declines every day. People aren't using these sites. Newer technologies have replaced directories in the information ecosystem. The closure of the Google Directory should be a wake-up call to anyone still clinging to this practice.

TL;DR

Traditional directories are an obsolete method of information discovery. Even if they provide some small benefit today, you're paying a hefty price to support someone else's dying business model. Stop the PageRank Ponzi and invest in the future and yourself instead.
