
How Not To Use Twitter

December 07 2008 // Humor + Rant + Social Media + Technology // Comments Off on How Not To Use Twitter

I’m still trying to find how to get the most out of Twitter. (I get far more from FriendFeed.) But here’s an easy example of how not to use Twitter.

The level of noise on Twitter seems high. Ditto the number of people who feel it's an obligation to follow back. Or perhaps this is just what increased usage gets you?

Wikipedia Flirts With Disaster

December 05 2008 // Technology // Comments Off on Wikipedia Flirts With Disaster

Wikipedia has secured an $890,000 grant from the Stanton Foundation to make its editing process more user-friendly. At first glance this seems innocuous, but it has the potential to ruin Wikipedia.

“Wikipedia attracts writers who have a moderate-to-high level of technical understanding, but it excludes lots of smart, knowledgeable people who are less tech-centric,” Sue Gardner, the Wikimedia Foundation’s executive director, said in a statement Wednesday.

Ms. Gardner is right. But it also excludes the ubiquitous trolls and troglodytes who inhabit the comment space on social networks and forums. Ever peek at the comments on YouTube? Yeah, not the type you want editing material on Wikipedia.

Here’s today’s featured article on Wikipedia.

Now imagine that the monkeys have overwhelmed the safeguards through new user-friendly editing.

Sure it’s an exaggeration but the fact remains that Wikipedia is inviting more people into the editing process. They don’t get to choose just the smart people.

The current editing environment actually provides a filter via the technical and code obstacles. It’s a screening process that, while somewhat biased, helps keep the monkey masses out. Might it not be easier to provide tutorials for the ‘smart’ people who are motivated to contribute to Wikipedia?

User-friendly editing is a can of worms, a Pandora's box. Wikipedia should be careful what it wishes for.

Forgive Me StumbleUpon For I Have Sinned

December 04 2008 // Advertising + Humor + Marketing + Technology // 1 Comment

The other day I updated my StumbleUpon toolbar (well, I was essentially forced to) and immediately couldn't Stumble posts from this blog or my Used Books Blog. Each time I tried, my Stumble just would not go through, stalling at a blank white box where the review and tagging takes place. I tried numerous times in a couple of different browsers. Nothing worked.

I assumed that something had gone awry with the new toolbar. I even posted a message on FriendFeed calling eBay lame. But you know the old saying about assuming, right?

I sent feedback to StumbleUpon about my problem and got a prompt reply as follows.

Hello,

Thanks for writing in.

After reviewing your account history, it appears that you've repeatedly submitted content from one or more sites in particular.

Our site software detects behavior like this to prevent the unauthorized use of StumbleUpon to promote a specific Web site, product or service.

This limit will likely remain in place until you use the StumbleUpon Toolbar more frequently to rate, review and discover Web sites that can be shared with other members.

If you're interested in using StumbleUpon to advertise a Web site, please look into our Advertising program:
http://www.stumbleupon.com/ads

If you have any other questions, please review our Terms of Service and Community Rules:

http://www.stumbleupon.com/terms/
http://www.stumbleupon.com/rules.html

Thanks for your feedback,

Oops. I admit, I’ve only been Stumbling my own sites lately. However, I think they’re pretty good so I don’t see anything too wrong with that. Also, I’m pretty transparent. I don’t have a Stumble army nor do I have multiple profiles so I can distribute my Stumbles across accounts and dodge the software.

But I get it and I’m not really complaining. It’s not what StumbleUpon is really supposed to be about. And I respect them for protecting the product and StumbleUpon business model. It also got me to Stumble again and I discovered some interesting sites and images. So … thanks for that.

My one nit is that I had to contact support to get this information. Instead of presenting the white box of frustration, I suggest that StumbleUpon simply insert the text I received into that area. Not only would I have immediately understood what was going on and not thrown invective into the htmlosphere, but StumbleUpon would have saved a bit of money on customer support.

Forgive me StumbleUpon for I have sinned. My penance? Stumble.

Does Google Have Pac-Man Fever?

December 02 2008 // Humor + SEM + SEO + Technology // 2 Comments

Google’s share of US searches continues to rise according to a recent comScore press release. In October 2008 Google led with 63.1% of all searches conducted. The resulting pie chart shows that Google is closing in on a Pac-Man like position in the search market.

It hasn’t been this way for that long though. Following is the historic comScore data I’ve cobbled together showing Google’s share of the search market.

October 2004: 34.8%
October 2005: 39.0%
October 2006: 45.4%
October 2007: 58.5%
October 2008: 63.1%

So, in four years the search market went from a dog fight to a laugher. If Google continues on this path the pie chart will take on true Pac-Man dimensions.

Now, I’m not sure who’s Inky, Blinky, Pinky or Clyde but Google certainly has the other search players on the run.

None of them seems to have the right medicine to reduce the Google fever that has swept the country. Acetaminophen (AOL), ibuprofen (Yahoo!), naproxen (Ask) and aspirin (MSN) have all failed to bring the temperature down. And upstart homeopathic remedies (Powerset, Cuil etc.) haven’t made a dent either.

Would mixing some of these medicines together help? Some Yahoo! and MSN with a dash of Powerset? Not likely. And in some cases mixing medicines can prove lethal.

Could Zoetrope Be A SERP Tracking Tool?

November 25 2008 // SEO + Technology // Comments Off on Could Zoetrope Be A SERP Tracking Tool?

Zoetrope is a new web tool being jointly developed by Adobe and researchers at the University of Washington. The general idea is to allow users to view the Internet over time. Think of it as the Wayback Machine on steroids. Sarah Perez does a good job writing about Zoetrope in ReadWriteWeb.

Yet, it’s the following video that does the best job of explaining Zoetrope.

Could Zoetrope be used as a SERP tracking tool?

If Zoetrope could capture specific search engine results pages (SERPs), then it could be a very powerful SERP tracking tool.

Let’s say I have 10 high value keywords for which I want to be highly ranked on Google. Using Zoetrope I could conceivably capture the daily SERP for each keyword and link it to the appropriate destination page. This would allow me to see if changes to my page had any impact on SERP ranking for that keyword.
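The kind of keyword-to-rank time series such a tool would accumulate might look like the following. This is a minimal sketch only; Zoetrope has no public API, and the keyword, dates and ranks here are hypothetical.

```python
from datetime import date

# Hypothetical daily rank log: keyword -> list of (date, rank) snapshots.
# A SERP tracking tool would append one snapshot per keyword per day.
rank_history = {
    "used books": [(date(2008, 11, 20), 7),
                   (date(2008, 11, 21), 7),
                   (date(2008, 11, 22), 5)],
}

def rank_change(history):
    """Net rank movement between the first and last snapshot
    (positive = moved up the SERP)."""
    snapshots = sorted(history)  # tuples sort by date first
    return snapshots[0][1] - snapshots[-1][1]

for keyword, history in rank_history.items():
    print(keyword, "moved", rank_change(history), "positions")
```

Comparing the rank movement against the dates of on-page changes is what would let you judge whether a tweak helped.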

I could even create ‘lenses’ for my competitors and review how they’ve tweaked their site to help influence SERP rank. Zoetrope could provide an unparalleled level of detail and analysis for a savvy SEO practitioner.

Yet there's a big, actually huge, 'if' in the statement above. I doubt that Zoetrope does, or even could, capture every SERP. But that doesn't mean there's not a way to do this. In fact, I think it provides a pretty interesting business opportunity.

Want Zoetrope to help you track SERP rank for a group of keywords? Simply sign up for a subscription and they would begin to capture the information. Toss in a 30-day free trial to get the ball rolling and I think you'd have a number of people clamoring for and using Zoetrope for SERP tracking.

I Love Firefox Metrics

November 25 2008 // Marketing + Technology // Comments Off on I Love Firefox Metrics

I’m a direct marketer at heart which means I like numbers and I absolutely love testing. The other day I found the Mozilla Blog of Metrics via a FriendFeed post by AJ Batac. The blog is subtitled ‘When in doubt, sample it out …’ and gives me yet another reason to like, perhaps even love, Firefox and Mozilla.

Mozilla is moving forward with multivariate testing but wanted to crawl before they walked. So, they conducted a simple A/B test on the call to action on their download button.

Now, I probably could have told you that 'Download Now – Free' would outperform 'Try Firefox 3'. However, they admit that the test was more about validating the tool and process than about the actual result.
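For the curious, judging a winner in an A/B test like this usually comes down to a two-proportion z-test on the click-through counts. Mozilla didn't publish its raw numbers, so the counts below are purely illustrative.

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 'Download Now - Free' converts 11.0% vs 10.0%.
z = two_proportion_z(1100, 10000, 1000, 10000)
print(round(z, 2))  # |z| > 1.96 => significant at the 95% level
```

With these made-up numbers the difference clears the 95% bar, which is exactly the kind of fact-based call a testing tool lets you make instead of debating the button copy in a meeting.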

What’s exciting (to me at least) is that Mozilla is going to run these tests and (hopefully) follow a fact-based decision making process. Do any of us want to sit through another meeting where the merits of design or copy are debated endlessly? Too many organizations continue to design based on instinct, opinion and best practices instead of testing their way to success.

You and your team are not the target market, and best practices may not be best for your particular product, site or audience. What worked at your last place of business may not work again. In fact, what worked last year may not work now, which is why organizations should always re-test assumptions via challengers.

If we’ve piqued your interest, please note that we’ll soon have some exciting findings related to a currently ongoing multivariate test at the main Firefox product page (www.mozilla.com/firefox).

Yes, you’ve piqued my interest! I’ll be eager to see the results of multivariate testing and hope that Mozilla can help usher more organizations into a test-and-learn, fact-based decision making environment.

Like Web 2.0? Thank A Marketer Today

November 21 2008 // Marketing + Technology // Comments Off on Like Web 2.0? Thank A Marketer Today

What exactly is Web 2.0? There are plenty of ways to explain it and define it, but what is it really? It’s a marketing slogan meant to attract investors, media and users back to the Internet after the dot com crash.

The inability to adequately define Web 2.0 is a clue that it has roots in marketing. Marketers (and I'm one of them) are good at creating things that resonate without making complete sense. For instance, the phrase 'virtually spotless' for a dishwasher detergent sounds good but upon inspection really means 'has some spots'.

Max(x) Barry aptly skewered this situation in his book Syrup.

Pick a random chemical in your product and heavily promote its presence. When your customers see "Now with Benzoethylhydrates!" they will assume that this is a good thing.

Even Tim Berners-Lee, Web pioneer, sees through the Web 2.0 smoke and mirrors. Ars Technica does a good job of digesting Tim Berners-Lee’s podcast text and presents the most relevant quote about his view of Web 2.0: “nobody even knows what it means”.

But that’s the brilliance of Web 2.0, it can mean whatever you really want it to mean. Want Web 2.0 to include the semantic web? Sure. Want it to be about microformats? You betcha. Want it to be about user generated content? No problemo. Want it to represent a way to use software and technology to connect people to people? Of course. Mashups? Yup. Tagging? Check. Social Media? Okay.

The mythology is that Tim O’Reilly and Dale Dougherty came up with the term at a brainstorming session at a 2003 conference. I don’t think they came up with the term, but O’Reilly created the buzz around it, promoted it and subsequently fought over it. Let’s face it, O’Reilly isn’t a slouch in the marketing department!

Don't believe me? Well, what about Dermot McCormack's 2002 book titled Web 2.0: The Resurgence of the Internet & E-Commerce?

So instead of trying to figure out who coined the term or what it actually means, just be happy that it ushered a new influx of ideas and investment into the Internet.

Like Web 2.0? Thank your nearest marketer today.

SearchWiki Turns You Into Free Mechanical Turk

November 21 2008 // SEO + Technology // 1 Comment

Yesterday Google launched SearchWiki, a search feature that lets users customize search results by moving, deleting, adding and commenting on search results. The search algorithm will now have access to a set of aggregated human data. As I’ve written about before, the Google search algorithm will benefit from having a human feedback mechanism.

Google SearchWiki turns users into a free Mechanical Turk.

Not familiar with Amazon’s Mechanical Turk? Here’s a relevant excerpt from Wikipedia:

The Amazon Mechanical Turk (MTurk) is one of the suite of Amazon Web Services, a crowdsourcing marketplace that enables computer programs to co-ordinate the use of human intelligence to perform tasks which computers are unable to do.

Now, clearly the search algorithm is able to produce search results. However, the algorithm still isn’t very smart. SearchWiki lets the algorithm learn as humans move, delete and add results for specific search queries.

While Google clearly states that your specific changes will not be seen by others, it seems impossible to think that Google won’t use that information to influence search results over time. Not convinced? SEO by the Sea reviews a recent Google patent that points to Google’s continuing goal of leveraging more data and behavior into search results.

Google has been experimenting with something like this for some time, and it finally seems ready for prime time. This also means the flirtation with Digg is likely done for good, unless abuse by SEO practitioners overwhelms the signal.

How do SEO gurus react to SearchWiki?

If you employ a short-term, chase-the-algorithm type of SEO, you're seeing both a threat and an opportunity. The threat is that the human feedback mechanism could help to curb over-optimization and subtle gaming. However, it also provides an opportunity to create a SearchWiki army that could coordinate changes to specific search results. The worst case scenario is that people are paid to delete, add or move certain results for specific searches.

I’m not saying SearchWiki is a bad thing. But make no mistake, Google is using it as a free way to make their core product better.

Search Pogosticking and SEO

November 14 2008 // SEO + Technology // 18 Comments

Search pogosticking is defined as going back and forth from a search engine results page (SERP) to individual search result destination sites. The behavior may indicate poor search results since the user hasn’t been satisfied by one or more of the SERP results.

Google and Yahoo are clearly using, or thinking about using, this metric as part of their search algorithm. It makes a lot of sense and could provide an important user-defined input for the algorithm as well as guard against potential over-optimization. As an aside, access to this human feedback mechanism may be one of the reasons why Google hasn’t been eager to pay a premium for Digg.

How would search pogosticking influence SEO?

A user is presented with search results based on a specific query. The engine captures which result the user clicks on and whether they return to that SERP and click on subsequent results and/or refine the query. (It could even conceivably measure the time between each click as a proxy for satisfaction with that result. This would reduce the chances of penalizing results that did deliver value.) The information can be aggregated for each query and compared to average pogosticking behavior by SERP rank.

So, let’s say that for a specific query the engine sees that the current SERP has an abnormally high pogostick rate for the top ranked result. This information could then be fed back into the algorithm as a negative signal, thereby reducing its SERP rank in a future algorithm update. Obviously, it would have to be statistically above the average pogostick rate for that position to be flagged. However, I’m certain both Google and Yahoo have smart folks who can calculate when it reaches significance.
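The flagging logic described above can be sketched in a few lines. Everything here is hypothetical: the baseline pogostick rates per position, the observed counts, and the 1.96 significance threshold are illustrative stand-ins for whatever the engines actually compute internally.

```python
from math import sqrt

# Assumed average pogostick rate for each SERP position, across many queries.
baseline_rate = {1: 0.30, 2: 0.35, 3: 0.40}

def pogostick_z(pogos, clicks, baseline):
    """Z-score of the observed pogostick rate against the positional baseline."""
    rate = pogos / clicks
    se = sqrt(baseline * (1 - baseline) / clicks)
    return (rate - baseline) / se

def flag_results(observed, threshold=1.96):
    """Return positions whose pogostick rate is significantly above average."""
    flagged = []
    for pos, (pogos, clicks) in observed.items():
        if pogostick_z(pogos, clicks, baseline_rate[pos]) > threshold:
            flagged.append(pos)
    return flagged

# One query's aggregated clicks: position -> (pogosticks, total clicks).
observed = {1: (450, 1000), 2: (360, 1000), 3: (400, 1000)}
print(flag_results(observed))  # -> [1]: the top result pogosticks far above its baseline
```

In this toy example only the top result gets flagged as a negative signal; positions two and three sit within normal range for their slots.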

Does search pogosticking sound far fetched?

It shouldn’t. Do a Google search and look at the page source. You’ll see something like the following next to each search result.

onmousedown="return rwt(this,'','','res','1'

The ‘res’ likely stands for result and the next parameter is the actual rank for that result on that page. More information on this rewrite and tracking behavior can be found at searchlores and blogmal.
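As a rough illustration, the rank parameter can be pulled out of a snippet like the one above. Note that the actual rwt() argument order is undocumented and has changed over time, so the pattern here is an assumption based on the snippet shown.

```python
import re

# Snippet copied from a Google SERP's page source (truncated, as above).
snippet = """onmousedown="return rwt(this,'','','res','1'"""

# Grab the digits that follow the 'res' parameter -- presumably the rank.
match = re.search(r"'res','(\d+)", snippet)
rank = int(match.group(1)) if match else None
print(rank)  # -> 1
```

Scraping ranks this way from your own browsing is fragile at best; the point is only that the rank is right there in the markup.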

Even without this technical knowledge, it should be obvious they're doing something like this if they're providing users with customized search results. I'm not sure the ability to provide custom or personalized results is the true aim, but it makes the collection of this information more palatable for many users.

SEO by the Sea recently posted about this topic including a relevant pogosticking patent application by Yahoo. I recommend you read his post as well.

How can I track pogosticking?

You can't know for sure whether a user is pogosticking. Bounce rate can sometimes be a good indication, but there are instances where query intent is satisfied and a high bounce rate is the expected result. Many Q&A sites fit this description.

The best course of action is to review pages with high bounce rates and make certain you're matching query intent and delivering value to those users. Post-click SEO is going to become a larger part of the equation, blurring the line between SEO and traditional design and UI.

Are Banner Ads Dying?

November 12 2008 // Advertising + Marketing + Technology // 1 Comment

Are banner ads dying?

I remember people shouting about this during the Web 1.0 heyday, and plenty of folks have since cried wolf on the topic. Each time, banner ads come back from the proverbial dead, walking the Internet with zombie-like efficiency. Is this time any different?

Maybe.

More and more research indicates that young users are not responding to banner advertising. But can we blame them? We’re entering an era in which a whole generation has grown up with the Internet. At an increasingly young age children begin to surf the Internet. Yet, the sites they visit often have far less clutter and advertising than the traditional site.

Are we creating a generation of banner intolerant Internet users?

Those of us, myself included, who came to the Internet as young adults have been exposed to a higher degree of ad clutter from the start. We’re used to it. Not only that, our context for approaching the Internet was shaped through television. Television conditioned us to expect advertising to be part of the equation.

Yet, again, this is not the case for a new generation that has grown up with TiVo and other DVR products. These services have undermined normal TV watching patterns and preconceptions. Every day more of us are conditioned to simply hit the 30 second skip button when presented with a commercial. (As an aside, thank you to Fringe for telling me how many seconds the break is going to be. That’s very handy!)

These habits are particularly important since new research indicates that the heaviest TV viewers are also the heaviest Internet users.

Are social networks part of the problem?

Yes.

Social networking is one of the largest (if not the largest) online activities for the young. Sites like Piczo, MySpace and Facebook, among others, are like the middle school and high school of the Internet. These sites are, at their core, utilitarian in nature. They're about communicating. They're about making 'friends'.

Some of them do use banner ads, but more and more evidence shows a growing banner blindness on these sites. I suspect the high number of visits (for a very specific purpose) exacerbates banner blindness on social networks.

If you use CPM rates as a proxy for effectiveness then it becomes obvious that social networks are in distress.

Can we blame Twitter?

Yes.

Twitter and even my current addiction, FriendFeed, contribute to the problem by not using banner ads. They are but one more site, one more application, one more widget that is providing a valued service for … nothing. Clearly advertising isn’t the only way to monetize these sites, but by not implementing any monetization strategy they send a signal to users that they can get something for nothing.

The ‘something for nothing’ mentality makes us less tolerant of advertising. I can surf the Internet with Firefox, supercharge my blog with WordPress plugins and use Google Analytics to track metrics for a slew of client sites all … for not one penny.

This isn’t a long term problem. A type of Internet Darwinism will take place where those without a true business (aka revenue) model will either fail or try to implement some sort of real revenue strategy. In most instances, that will be advertising or subscriptions.

Will it be advertising or subscriptions?

Just before the collapse of Web 1.0, I worked at Bluelight.com, the online version of Kmart. One of its major initiatives was an ad-supported free ISP. We had millions of users! The thing is, it didn't really work. They wound up converting it to a subscription-based service, which was ultimately acquired by United Online.

Today, we’re conditioning a generation to ignore banner and display advertising. The cat is out of the bag. The genie is out of the bottle. So even if we want to return to it as a revenue stream it is becoming an ever weaker medium. And we only have ourselves to blame.

So perhaps subscription-based sites or networks are the wave of the future. Would you pay for Twitter?

Will banner ads die?

Of course not. Banners will never completely die, but a few things will have to happen for them to rise again to feast on the glorious eyeballs of Internet users.

They’ll need to be more engaging and use rich media in more appropriate ways. More importantly, the industry will need to measure banner ads not by CTR or traditional click-based ROI but by brand measurement metrics. There are a few companies who provide this service, though my favorite is Vizu Ad Catalyst. (Disclosure: I worked at Vizu for a short time but stand by the fact they are leaders in this new field.)

The current economic climate has forced many analysts to adjust their online advertising industry forecasts, some of them twice. Yet, I’m unsure any of them are accounting for this new generation of Internet users who are blind to banners at best and intolerant of them at worst.
