2010 Internet, SEO and Technology Predictions

January 03 2010 // Advertising + Marketing + SEO + Social Media + Technology // 5 Comments

As we begin 2010, it’s time for me to go on the record with some predictions. A review of my 2009 predictions shows a few hits, a couple of half-credits and a few more misses. Then again, many of my predictions were pretty bold.

2010 Technology Predictions

This year is no different.

The Link Bubble Pops

At some point in 2010, the link bubble will pop. Google will be forced to address rising link abuse and neutralize billions of links. This will be the largest change in the Google algorithm in many years, disrupting individual SEO strategies as well as larger link-based models such as Demand Media.

Twitter Finds a Revenue Model

As 2010 wears on, Twitter will find and announce a revenue model. I don’t know what it will be, and I’m not sure it will work, but I can’t see Twitter waving their hands for yet another year. Time to walk the walk, Twitter.

Google Search Interface Changes

We’ve already seen the search mode test that should help users navigate and refine search results. However, I suspect this is just the beginning and not the end. The rapid rate of iteration by the Google team makes me believe we could see something as radical as LazyFeed’s new UI or the New York Times Skimmer.

Behavioral Targeting Accelerates

Government and privacy groups continue to rage against behavioral targeting (BT), seeing it as some Orwellian advertising machine hell-bent on destroying the world. Yet behavioral targeting works, and savvy marketers will win against these largely ineffectual groups and general consumer apathy. Ask people if they want targeted ads and they say no; show them targeted ads and they click.

Google Launches gBooks

The settlement between Google, the Authors Guild and the Association of American Publishers will (finally) be granted final approval and then the fireworks will really start. That’s right, the settlement brouhaha was the warm-up act. Look for Google to launch an iTunes-like store (aka gBooks) that will be the latest salvo in the least talked about war on the Internet: Google vs. Amazon.

RSS Reader Usage Surges

What, isn’t RSS dead? Well, Marshall Kirkpatrick doesn’t seem to think so and Louis Gray doesn’t either. I’ll side with Marshall and Louis on this one. While I still believe marketing is the biggest problem surrounding RSS readers, advancements like LazyFeed and Fever make me think the product could also advance. I’m still waiting for Google to provide their reader as a white-label solution for eTailers fed up with email overhead.

Transparent Traffic Measurement Arrives

Publishers and advertisers are tired of ballpark figures or trends that are merely directionally accurate. Between Google Analytics and Quantcast, people now expect a certain level of specificity. Even comScore is transitioning to beacon-based measurement. Panel-based traffic measurement will recede, replaced by transparent beacon-based measurement … and there was much rejoicing.

Video Turns a Profit

Online video adoption rates have soared and more and more premium content is readily available. Early adopters bemoan the influx of advertising units, trying to convince themselves and others that people won’t put up with it. But they will. Like it or not, the vast majority of people are used to this form of advertising and this is the year it pays off.

Chrome Grabs 15% of Browser Market

Depending on whom you believe, Chrome has already surpassed Safari. And this was before Chrome was available for Mac. That alone isn’t going to get Chrome to 15%. But you recall the Google ‘What’s a Browser?’ video, right? Google will disrupt browser inertia through a combination of user disorientation and brand equity. Look for increased advertising and bundling of Chrome in 2010.

Real Time Search Jumps the Shark

2009 was, in many ways, the year of real time search. It was the brand new shiny toy for the Internati. Nearly everyone I meet thinks real time search is transformational. But is it really?

A Jonathan Mendez post titled Misguided Notions: A Study of Value Creation in Real-Time Search challenges this assumption. A recent QuadsZilla post also exposes a real time search vulnerability. The limited query set and influx of spam will reduce real time search to an interesting, though still valuable, add-on. The Internati? They’ll find something else shiny.

The Link Bubble

December 28 2009 // SEO // 6 Comments

The real estate bubble popped. Will the link bubble be next?

Link Bubble

The real estate bubble was the product of greed, low interest rates, loose lending policies and derivatives. Nearly anyone could get a house, and people bought into the idea that real estate would always be a good investment. The result of this irrational exuberance? Homes were valued at far more than they were worth.

The Link Bubble

Are links that different than real estate?

Links have traditionally been a reliable sign of trust and authority because they were given out judiciously, a lot like mortgages. For a long time link policies were tight. You needed references and documentation before you earned that link.

In addition, links weren’t looked upon as an investment tool. The concept that links influenced SEO hadn’t taken hold. The motivation behind links was relatively pure and that meant Google and others could rely on them as an accurate signal of quality.

Links or Content?

Many have recently bemoaned the death of hand crafted content and the rise of content farms as a threat to search quality. But is content really the problem?

Content has little innate value from a search perspective. Yes, search engines glean the content topic based on the text. It’s like knowing the street address of a house. You know where it is and, probably, a bit about the neighborhood. But it doesn’t tell you about the size, style or quality of the home.

Long tail searches are akin to searching for a house by street address. So, content without links may sometimes produce results. But the vast majority of searches will require more information. That’s where links come in.

McDonald’s Content

Let’s switch analogies for a moment. Some have called Demand Media the McDonald’s of content. There’s a bit of brilliance in that comparison, but not in the way most think.

Both McDonald’s and Demand Media crank out product that many would argue is mediocre. Offline, McDonald’s buys the best real estate and uses low prices, brand equity and marketing to ensure diners select them over competitors.

Online, Google holds the prime real estate. But that real estate can’t be outright purchased. And in the absence of price, we’re left with brand equity and marketing. Online, brand equity translates into trust and authority. And links are the marketing that help build and maintain that brand equity.

Demand Media has brands (their words) that give it automatic trust and authority. Publish something on eHow and it automatically inherits the domain’s trust and authority, built on over 11 million backlinks.

Writers for Demand Media are provided revenue share opportunities on their articles. Here’s one of the tips they give to writers to boost traffic to their articles.

2) Link to your article from other websites.

Link from your own website or blog, from a message board or forum, from your social networking profile on MySpace or Facebook and more. The more high quality links to your article there are on the web, the more highly a search engine will rank it.

Demand Media combines the installed brand equity of multiple sites (which happen to be cross-linked) with an incentive to contributors to generate additional links. The content doesn’t have to be great when links secure premium online real estate.

There might be something better down the road, but McDonald’s is always right there at the corner.

Link Inflation

The last few years have produced major changes surrounding links. Linkbuilding is now a common term and strategy. A number of notable SEO firms tout links as the way to achieve success.

Linkbuilding firms sprang up. Linkbuilding software in various shades of gray was launched. Paid links of various flavors flourished. Social bookmarking and networking accelerated link inflation. And new business models like Demand Media emerged to take advantage of the link economy, creating a collection of sites and implementing incentives that result in something resembling derivatives.

Link policies went from tight to loose and people got greedy. Anyone can get links these days. So what’s the natural result of this link activity?

Link Bubble Pops

Link Recession

The value of links is inflated and at some point the system will correct. The algorithm will change to address the abuse of links. Unlike the Federal Reserve, Google probably isn’t looking for a soft landing, nor are they going to extend a bailout.

Some links will continue to matter. Links that are in the right neighborhood. The ones with tree lined streets, good schools and low crime. But will links from cookie cutter planned communities still be valuable? Strong links will mean more because they’ll hold their value, while many more links will be neutralized.

I’m no Nouriel Roubini, but I do believe that a major link correction is coming in 2010. Google must address the link bubble to make search results better.

Yahoo Strong-Arms comScore

December 22 2009 // Advertising + Technology // Comments Off on Yahoo Strong-Arms comScore

The other day I received an interesting email from Yahoo!

Yahoo and comScore beacon

It’s pretty easy to read between the lines here. In fact, little line reading is necessary. The new comScore beacon is providing more accurate results. Yahoo is not currently participating in the beacon program. Yahoo wasn’t keen on the “apples-to-oranges” comparison that “could create confusion for advertisers” because it would likely negatively impact their display business.

Don’t Forget Yahoo!

Bashing Yahoo! seems to be the cool thing to do these days, and they’ve certainly driven themselves into a ditch. But Yahoo! still holds a powerful position as a portal, content and email provider. This email seems like a not-so-gentle reminder that Yahoo! is still a 900-pound gorilla in some circles.

Beacons and Panel Data

The other takeaway here is the fact that beacons are fast becoming the best way to measure traffic. I see comScore’s introduction of beacon technology as a direct reaction to Quantcast.

The rise of Google Analytics allows more and more companies to know exactly how much traffic they receive. The result of this knowledge is a growing dissatisfaction with panel-based measurements that aren’t just inaccurate but are sometimes flat out wrong.
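To make the distinction concrete, here’s a minimal sketch of how a beacon works in principle. The endpoint, parameter names and counting logic are purely illustrative and say nothing about Quantcast’s or comScore’s actual implementations: the page embeds a tiny image hosted by the measurement service, so every real page load produces one request that can be counted directly rather than extrapolated from a panel.

```python
# Minimal beacon sketch (illustrative only, not any vendor's implementation):
# a page embeds a tiny image whose URL points at this endpoint, so every
# page view triggers one request we can count directly.
import base64
from collections import Counter
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# 1x1 transparent GIF, the classic "tracking pixel" payload.
PIXEL = base64.b64decode("R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")

page_views = Counter()  # exact counts per page, no panel extrapolation

class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The page passes its own URL as a query parameter, e.g. /beacon?page=/pricing
        query = parse_qs(urlparse(self.path).query)
        page = query.get("page", ["unknown"])[0]
        page_views[page] += 1

        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    # <img src="http://localhost:8000/beacon?page=/pricing"> on a page
    # would register exactly one view here for every load.
    HTTPServer(("localhost", 8000), BeaconHandler).serve_forever()
```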

No More Hand Waving

Whether it was the Web 1.0 darling Alexa or recent upstart Compete, panel-based services continue to fail. The difference this time around is that we have beacons (like Google Analytics and Quantcast) that let us know when they fail and by how much.

So while Yahoo! has secured a six-month reprieve, the future lies in accurate and transparent traffic measurement.

Google Brand Search Results

December 15 2009 // SEO // Comments Off on Google Brand Search Results

In late October Google launched a new type of search result for brand queries. As noted on Google Blogoscoped, the brand search result takes up an enormous amount of real estate and is composed of one regular listing and two indented listings.

When are brand search results shown?

The brand result is only triggered by certain queries. Oddly, it’s not just a brand name. Instead, it’s a brand name coupled with another keyword or keywords.

So, a search for ‘Peet’s’ will not trigger a brand result. But a search for ‘Peet’s Major Dickason’ will.

Google Brand Result

Is it really about brands?

Upon the launch of this new result it seemed like it was solely a function of what was in the domain. A search for ‘wooden bar stools’ triggered the new result.

Google Brand Result

Here, the domain of bar-stools-barstools.com triggered the new result when it matched the ‘brand + modifier’ query. A simple domain match against the keyword query, right?

Yet, when you search on this same term today you don’t get the new brand result.

Google wooden bar stools query

So what started as a simple domain match has evolved into something more refined. This isn’t capricious in nature. Google is always looking for ways to improve search quality. In this scenario they have weighted the actual brand or company site as more authoritative for a fairly large set of ‘brand + modifier’ queries.

What are the criteria for Google brand search results?

How Google is doing this refinement is unclear. Some have surmised Google is using some sort of brand database or that it’s related to their DNS service.

I tend to believe there was some sort of initial criteria (broad in nature) for this new brand result with a built-in refinement mechanism. The refinement would be accomplished through analysis of user behavior on these results (e.g., relative CTR and click length), SearchWiki data or human editing.

User behavior might be difficult to mine, since the large amount of real estate brand results take up likely makes them click magnets. And click length might describe the value and quality of the site, but not whether it merits the brand result presentation.

SearchWiki data and human editing are essentially the same concept, with the former being accomplished by a decentralized group of users (the proverbial cloud) and the latter being done by a centralized group of Google employees. Human editing doesn’t seem that far-fetched to me, particularly if Google used user behavior or SearchWiki data to identify domains for review. The result would be an easy and efficient review queue.

Why is Google interested in brands?

No matter how Google is doing it, the fact that they are doing it is a signal that brands and companies provide a new proxy for trust and authority. The first step in this direction was the Vince change, which unseated many highly optimized sites from root terms and replaced them with relevant brands or companies.

The current algorithm continues to struggle with trust and authority, with over-optimization reducing the value of on-page factors and the rise of link pollution quickly eroding the value of off-page factors.

Google brand search results show a continuing interest in moving beyond current signals to improve search quality and deliver better results. So while the new brand search result went largely unreported, it could be a harbinger of a larger algorithmic shift in 2010.

Twitter Makes Lists … Competitive

November 01 2009 // Social Media // 8 Comments

Twitter finally got around to launching lists and immediately created a whole new competitive mania that may render them useless.

Listed

Twitter Listed Metric

By simply showing Listed as a major metric, Twitter encourages comparisons. Listed will be the new Followers. We’ll see Followers-to-Listed ratios cranked out by the companies who traffic in these sorts of metadata measures of authority and influence.

Instead of using lists to help users manage the stream of data, they’ve turned them into a competition.

Following Twitter Lists

The ability to follow lists also creates competition. Which SEO list is best? Whose Ruby Rock Stars list should you follow? Lists allow users to segment, but how many instances are you really going to follow? Does it help me to follow 25 instances of an SEO list? Probably not.

Suddenly a person’s lists are going to have an attached Followers metric. You could argue that the number of Followers helps define comparative quality, but that hasn’t worked for users has it? So why would it work for lists?

My motivation for creating a list isn’t to attract followers, it’s to help me turn data into information.

Twitter Lists Do Not Equal Authority or Influence

The impetus for creating a list is for the user to manage their data flow. (Or it should be!) Using a data segmentation taxonomy as a proxy to show authority just doesn’t compute. The motivation is not to grant authority to those on a list, but to simply shape data.

A list is a grouping, but it assigns no weight to any individual within that group. We all know people who are ubiquitous but might not be well regarded. Nevertheless, you’d likely put them on a topical list.

Volume is essentially what Listed measures. People with varied interests will be added to many lists. People who have played the Followers game will be added to many lists. Quantity wins, not quality.

Furthermore, once people understand that lists are the new Followers, you’ll have people asking to be added to lists, trading list additions, and creating new accounts for the sole purpose of getting on the ‘right’ lists. It’s an obvious gaming nightmare.

Twitter Lists Don’t Define You

There’s an offhand defense of lists I’m hearing many employ: “lists show how others think of you.”

Who cares! Guess what, I stopped caring about that my sophomore year in high school. But isn’t that the DNA of Twitter? A navel-gazing popularity contest that somehow is supposed to validate value and contribution.

Thanks but no thanks.

The Real Problem with Lists

The idea behind lists seems to be user discovery.

Twitter Lists

It’s supposed to help you find “interesting accounts.” But lists (of any kind) don’t effectively do this because people are multi-faceted.

I have a varying level of expertise and contribution in many fields. My inclusion on Danny Sullivan’s Search Marketing list is nice, but will people following that list get value from my bicycling and book related tweets?

A list gives you a complete timeline for a group of people to whom someone has assigned a certain user-defined attribute. That doesn’t mean you’re actually going to get content matching that attribute. This mismatch makes it difficult to find ‘interesting accounts’.

Lists are simply a blunt instrument in the transformation of data into information.

Comcast Upgrade Disrespects Customers

October 31 2009 // Rant + Technology + Web Design // 1 Comment

Saturday is bill paying day. One of those bills was Comcast. I’m signed up for automatic payments but I generally check to make sure everything is okay. I’m a bit paranoid that way and it usually only takes a few minutes with a cup of coffee steaming next to me to confirm that all is well.

Comcast was last on the list since I review my bills in reverse chronological order and the Comcast bill notification arrived in my inbox on Friday.

Comcast Fail

I clicked through on the bill and entered my user name.

comcast fail

I tried three times, paying special attention to ensure I didn’t fat finger something. Each time, same thing.

So I contacted customer support using their Live Chat feature. I was quickly connected with Vanessa, who after a brief back and forth provided this explanation.

Vanessa > I wish to inform you that we did an upgrade with our system and we merged the 2 accounts which is the comcast.net and .com

Vanessa > And due to this upgrade since you do not have internet service I am afraid that you need to register it again online, AJ.

That makes as much sense as a fish riding a bicycle! To Vanessa’s credit she was apologetic (even though it wasn’t her fault) and very helpful. Thank you Vanessa.

Comcast System Upgrade?

I’m not an engineer or a coder, but I know enough to know that a database merge can be done far more elegantly. Boiled down, isn’t this a simple left outer join?
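For the non-engineers, a left outer join simply means “keep every record from the first table and attach the matching data from the second where it exists, without dropping anyone.” Here’s a toy sketch of the merge as I imagine it, with account numbers and fields invented purely for illustration:

```python
# Hypothetical sketch of the merge: keep every existing comcast.com account
# (the "left" side) and attach the matching comcast.net profile where one
# exists, instead of forcing anyone to re-register.
# Account numbers and fields are invented for illustration.
com_accounts = {
    "8155-001": {"name": "AJ", "autopay": True},
    "8155-002": {"name": "Pat", "autopay": False},
}
net_accounts = {
    "8155-001": {"email": "aj@comcast.net"},
    # 8155-002 has no .net profile, and that's fine
}

merged = {}
for account_id, com_profile in com_accounts.items():  # left outer join on account_id
    merged[account_id] = {**com_profile, **net_accounts.get(account_id, {})}

print(merged["8155-002"])  # {'name': 'Pat', 'autopay': False} -- still registered, just no .net data
```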

Even if there is more complexity (and there usually is) wouldn’t it be wise to deal with those issues instead of inconveniencing your customers?

Comcast Error Messaging

Even if Comcast chose to go ahead as planned, they could have avoided in-bound customer service issues by applying proper error messaging.

A simple statement about a system upgrade requiring users to re-register would have made the situation clear. Inconvenient, but at least Comcast would have provided an immediate answer to the problem they created.

But here’s the rub. You need your account number to register!

That account number is not on the billing email and since I use paperless billing I have no material with my Comcast account number. No matter what, I’d still have to contact customer support to retrieve my account number.

Comcast Customer Service

Despite the buzz Comcast has generated around their use of Twitter they still don’t seem to understand customer service.

If they did, they’d have created use cases from a customer perspective and realized that this upgrade would be detrimental for users and would cause added customer support costs.

AT&T U-verse is looking better all the time.

Twitter is the Underpants Gnomes of the Internet

October 29 2009 // Rant + Social Media // 3 Comments

The other day I read Steven Hodson’s Shooting At Bubbles post regarding Twitter 2.0. And it finally dawned on me!

Twitter is the Underpants Gnomes of the Internet

If you’re not familiar with the Underpants Gnomes, they were featured in a South Park episode in which the Gnomes devised an … interesting business plan.

underpants gnomes

Replace underpants with users (or VC cash) and you’ve got Twitter. Oh, sure they’ve alluded to some sort of business plan, but even as recently as a few weeks ago Evan Williams wasn’t willing to divulge a real revenue model despite John Battelle’s prodding.

I remember the Web 1.0 days of grow fast, grab market share and monetize later. Only a few survived this kowabunga style of business.

But who knows, maybe Twitter can turn underpants into profit.

Google Search Innovation Accelerates

October 27 2009 // SEO // Comments Off on Google Search Innovation Accelerates

It’s not your imagination. Google’s rate of search innovation has accelerated. Google has marched through shopping Onebox enhancements (9/14), expanded forum listings (9/14), anchor based jump to and site links (9/25), Search Options expansion (10/1), Quick View PDFs (10/7), Twitter integration (10/21) and Social Search (10/26).

You could even include ancillary search innovations like Sidewiki (9/23) and fetch as Googlebot (10/12).

Gooogle Search

It’s been so fast that Google launched a series of weekly blog posts (This Week in Search) to keep up with the changes.

Pent Up Search Innovation?

For the past few years Google has stated in numerous venues that they didn’t particularly want to grow their share of the search market. They explained that extending their market share lead could incur the wrath of government oversight or create user backlash.

The jaded in the audience rolled their eyes and even the purists probably furrowed their brow. Sure, it makes a bit of sense but … really?

Yet, the timing of these recent search innovations is intriguing.

Bing Allows Google To Innovate

Microsoft’s latest dedication to search could be the reason for Google’s recent spate of search enhancements. Before Bing went live Google rolled out rich snippets, making Google last to the rich listings party. The preemptive move secured a certain search parity in the space.

That box checked off, Google sat back and watched as Bing innovated, advertised and finally partnered with Yahoo. Bing’s share began to inch up toward 10%, and that seemed to be enough for the search behemoth to release the hounds, er, engineers.

One gets the impression that projects that had once been stashed in a drawer were suddenly taken out, dusted off and quickly deployed.

What else does Google have in store for us?

Does Keyword Density Matter?

October 26 2009 // SEO // 3 Comments

Search Engine Optimization (SEO) is filled with passionate debate. One such debate is the validity of keyword density as an SEO strategy.

keyword density

Keyword Density is Dead

The well-respected SEOmoz says keyword density is a myth.

A complete myth as an algorithmic component, keyword density nonetheless pervades even very sharp SEO minds. While it’s true that more usage of a keyword term/phrase can potentially improve targeting/ranking, there’s no doubt that keyword density has never been the formula by which this relevance was measured.

Yet, looking at their 2009 Search Engine Ranking Factors report you’ll note that keyword density does appear as a factor. In addition, you’ll see factors such as using the keyword in the first 50-100 words of text and keyword repetition in the text.

So, what’s going on?

Keyword Targeting Matters

keyword targeting

Keyword targeting helps a search engine understand what a page is about.

There is a fringe element who would have you believe that unique content is enough. They assume that the search engine will understand the content as written, no keyword targeting required.

But how does a search engine understand content? A search engine doesn’t read the text on the page the way you or I do. They’ll always fail a reading comprehension test. No, search engines have to assess and understand content in a mathematical way.

Keyword Targeting Strategies

While search engines use a mathematical approach to understanding text, they try to emulate human reading behavior. The use of the keyword in the title tag is important, just as the title of a book is important. Imagine the confusion of an organic chemistry textbook titled ‘Sunshine Boogie’.

The use of the keyword in the H1 tag is essentially a chapter heading. Headers are usually the biggest text on the page and size matters since it conveys the relative importance of text on the page. To a lesser extent, any emphasized text (bold, italics) may also provide a clue to search engines as to comparative relevance.

Using the keyword as the first word in the title as well as the H1 and in the first 50-100 words of text points to a bias toward natural left to right reading. What’s on the left – first or early on – is most important.

And then there’s the repetitive use of the keyword in the text. If you’re reading a chapter on the history of the Crusades you’re likely to see the word ‘crusades’ quite a bit. A blog post on Twitter is going to have the word ‘Twitter’ repeated numerous times. The repetition helps to define the topic of that content.
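None of these signals require a search engine to actually ‘read’; each one can be checked mechanically. Here’s a rough sketch of that kind of check. To be clear, it mirrors the checklist above, not any engine’s actual scoring, and it assumes a single-word keyword.

```python
# Rough sketch of the on-page targeting signals described above: keyword in
# the title, at the start of the title, in the H1, early in the body, and
# repeated in the text. This mirrors the checklist, not any engine's scoring.
import re

def targeting_signals(title: str, h1: str, body: str, keyword: str) -> dict:
    words = re.findall(r"[a-z0-9']+", body.lower())
    kw = keyword.lower()  # assumes a single-word keyword for simplicity
    return {
        "in_title": kw in title.lower(),
        "title_starts_with_keyword": title.lower().startswith(kw),
        "in_h1": kw in h1.lower(),
        "in_first_100_words": kw in " ".join(words[:100]),
        "repetitions_in_body": sum(1 for w in words if w == kw),
    }

print(targeting_signals(
    title="Crusades: A Short History",
    h1="The Crusades",
    body="The Crusades were a series of religious wars. The first of the Crusades began in 1096.",
    keyword="crusades",
))
```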

Keyword Targeting Measurement

Search engines may use a number of different methodologies to extract meaning and determine keyword targeting. Proximity, distribution and co-location have been around for ages. I understand linearization, tokenization, filtration and stemming, and I grok lexicographic analysis. Yet I’m not sure to what extent search engines really use any of these methodologies.

I’m particularly skeptical of linearization, since Google has indicated it can distinguish and weight text and links by where they reside on the page.

But enough about the search engines; how does the everyday person measure keyword targeting?

You’ve written a new piece of content – an article or blog post. You know what it’s supposed to be about, but did you stray too far from your original keyword target? You know you’re supposed to use the keyword a number of times but did you use the keyword enough?

Long Live Keyword Density

Like it or not, the easiest way to determine if you used the keyword enough times is … keyword density. When I use keyword density, I’m looking at the percentage and frequency of that keyword compared to other non-stop words. Usually I’m not even looking at numbers but instead use Wordle for visual keyword density.
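If you prefer a number to a word cloud, the calculation itself is trivial. A minimal sketch, using a tiny hand-rolled stopword list rather than any particular tool’s:

```python
# Minimal keyword density sketch: occurrences of the keyword as a percentage
# of non-stopwords, which is what the Wordle-style eyeball check approximates.
# The stopword list here is a tiny stand-in, not any tool's official list.
import re

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it", "for", "on", "that"}

def keyword_density(text: str, keyword: str) -> float:
    words = [w for w in re.findall(r"[a-z0-9']+", text.lower()) if w not in STOPWORDS]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

post = "Twitter lists turn data into information. Twitter lists are a grouping, nothing more."
print(f"{keyword_density(post, 'twitter'):.1f}%")  # prints 16.7% of non-stopwords
```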

Remember, you’re not using keyword density in a vacuum!

Getting the keyword density from 4% to 5% probably won’t do squat. And keyword density isn’t going to matter if you simply stuff the keywords into an incoherent, keyword bloated final paragraph.

But if you have 5 or 6 keywords with the same general density, or if your target keyword has a density of under 1%, you’ve got some problems. The search engine will be confused. It might figure it out (with or without additional SEO help), but why take the chance when you have full control over this part of the optimization?

If a number of keywords have the same density, which can happen when you have a very long piece of content, you can choose to target the ‘right’ keyword. Or you may come to the conclusion that the content covers too many topics and should be split (or paginated) into smaller, more focused articles.

If the keyword density is low, it’s a reminder to review your writing to look for dreaded pronouns, to make sure you’re using the keyword in the first and last paragraphs and to remove any leaps in logic.

Keyword density isn’t about density; it’s about enforcing proper keyword targeting.

Keyword Density and Readability

Research shows that people scan web content. They very rarely read word by word. People are in a hurry and our broadband tabbed browser environment allows us to go even faster than before.

Steve Krug’s Don’t Make Me Think! has this to say about scanning content.

The net effect is a lot like Gary Larson’s classic Far Side cartoon about the difference between what we say to dogs and what they hear. In the cartoon, the dog (named Ginger) appears to be listening intently as her owner gives her a serious talking-to about staying out of the garbage. But from the dog’s point of view, all he’s saying is “blah blah GINGER blah blah blah blah GINGER blah blah blah.”

gary larson on keyword density

So, when someone comes to a piece of content, you need them to instantly understand what it’s about. Sure, a big H1 with the keyword goes a long way but if they’re scanning the text, you’ll want that keyword to show up numerous times. Don’t make them search for Ginger.

Keyword repetition helps the user make a conscious determination that the content is about that keyword. Unconsciously, you’re bouncing a keyword-rich visual image off the user’s retina. In a five-second test environment, that type of visual osmosis might count for something.

Is it a stretch to think that keyword density might help users read content? No.

The real myth is that keyword density degrades content, when in fact it often does the opposite. Well written keyword dense text is generally easier to read.

Why Keyword Density Matters

Keyword density matters because it is an easy measure for keyword repetition and helps users access and engage in your content. It is not the biggest ranking factor (by a country mile), but keyword density is completely in your control and is one of the building blocks upon which other techniques are balanced.

Keyword density isn’t about hitting a certain percentage, it’s about ensuring that your content is highly focused and easy to read.

Are you Canadian?

October 13 2009 // Advertising + Marketing + Web Design // 6 Comments

Two weeks ago my wife dug out some Fall felt stickers from the closet for our daughter. Delighted, my daughter stuck one of them on my shirt before I walked out the door to work.

It was a red leaf.

Red Leaf

Being a bit sentimental, I left the red leaf on my shirt for the day. That decision led to a renewed appreciation for the power of iconography.

Are You Canadian?

That was one of the first things a co-worker asked me that day. The question seemed rather random and out of left field. In reaction to my puzzled expression, he pointed to the red leaf.

Later that day another co-worker asked if I liked maple syrup. And yet another started up a conversation about the upcoming hockey season. (Go Flyers!)

The Power of Iconography

I suppose it does look like the Canadian maple leaf. And the sticker was out of place and likely attracted attention. But all that aside, I wasn’t wearing a Canadian flag sticker. It was a simple, small red leaf.

The meaning that this red leaf conveyed was impressive. A single red leaf created an instant association with Canada and then, like a needle skipping on a vinyl record, to maple syrup and hockey. A stream of data, of experience, of knowledge, was trapped inside that red leaf.

How does that happen?

Semiotics

Semiotics, or the study of signs, helps explain how a simple red leaf can have such a profound impact. The field of semiotics is both dense and ambiguous, filled with academic rhetoric and debate. Even a beginner’s guide to semiotics clocks in at over 5,500 words.

The main elements of semiotics are syntax, semantics and pragmatics, described as follows in an icon design article.

  • Syntax: the internal grammar of parts that enable a properly formed sign to be parsable by someone or some system—think of the computer throwing a “syntax error”
  • Semantics: the intended meaning of the sign by the maker(s) of it
  • Pragmatics: how the sign is received, perceived, and acted upon by some person or interpreter by the confluence of syntax and semantics; the resulting effect

Pragmatics is where it really gets interesting in my opinion. Pragmatics deals with the impact of context and experience as it is applied to the perception of signs.

Now, back to the red leaf.

The syntax is fine. You know it’s a leaf. However, the intended meaning (semantics) changed through the prism of pragmatics.

The bag of colored leaves was intended to be a sign of Fall. Yet, the one red leaf taken out of context is instead perceived to be a sign of Canada, which opened up a whole new flood of associations based on personal experience and perspective.

Icons and Marketing

The Internet is a vast landscape of icons. My red leaf experience reminds me that iconography can be a very effective marketing tool. It is done badly for the most part, but when done well it can have a tremendous impact. (For more on semiotics and advertising I recommend the retro semiotics hypercard essay from Thomas Streeter at the University of Vermont.)

The challenge is figuring out how to create icons that unlock that stream of data. Creating icons that tap into shared experiences and personal histories can deliver a tone to your website that you simply can’t convey otherwise.

The Icon Test

Sometimes you’re capitalizing on ancient archetypes, and sometimes on recent, repeated shared experiences. Don’t believe me? Let’s try a little icon experiment. I’m going to show you an icon of sorts.

Tell me what you instantly think about after seeing it.

Twitter Bird Icon

Post the first three words that came into your mind in your comment.
