
Google Analytics Userscripts

February 17 2011 // Analytics // 1 Comment

If you spend a lot of time in Google Analytics you may quickly find yourself frustrated with the user experience. Here are 3 userscripts that make using Google Analytics way more efficient.

What are Userscripts?

Userscripts are small pieces of JavaScript code that tweak or provide additional functionality to your web experience. You install userscripts as a simple add-on in Chrome, Firefox (requires Greasemonkey) or Internet Explorer (requires IE7Pro).

In a nutshell, userscripts make things better. A lot better.

Cleaner Profile Switching

This userscript lets you switch from one Google Analytics profile to another and see the same report. It also gives you the option of opening that new profile in a separate tab.

Cleaner Profile Switching Userscript

This is a huge time saver if you’ve got multiple profiles (which you should) since you won’t have to build the report from scratch each time.

Get it: Cleaner Profile Switching

Absolute Conversion

This userscript calculates and displays the number of conversions next to the conversion rate.

Absolute Conversion Userscript

So instead of navigating to the Goals menu or doing some math in your head, you can quickly see your conversion numbers. Please note that while this is a handy userscript, it breaks when Google Analytics samples data.

Get it: Absolute Conversion Userscript

Accordion Menu

This userscript makes all of the top level Google Analytics menus expandable without waiting for the browser to reload.

If you use Google Analytics often, you probably get tired of clicking on main section report titles, only to wait for the report to load so you can click on sub-reports. Think about it: how many times have you clicked on “Traffic Sources” with the full intention of clicking on “All Traffic Sources” as soon as possible? Or “Content” just to get to “Top Content”?

This userscript is a massive time saver.

Get it: Accordion Menu Userscript

Using Userscripts

I should warn you that userscripts can sometimes be janky and cause problems. In fact, this post was originally going to feature four userscripts until I noticed a problem with one of them. Don’t let this keep you from trying them out. Userscripts are super easy to uninstall and many of the creators are eager to get feedback on how to improve them.

Give these Google Analytics userscripts a try and let me know if you have any others you swear by.

SEO Status Codes

January 20 2011 // Analytics + SEO // 5 Comments

One of the more technical aspects of SEO is to understand, monitor and manage status codes.

Soup Nazi 400 Bad Request

What are Status Codes?

Status Codes are an essential part of HTTP, the request-response protocol that powers the web. Each time someone visits a page (including Googlebot) the client asks the server for that content. The status code is the server’s numeric response to that request and provides guidance on how to proceed. You might be familiar with status codes such as 404 and 301.

SEO Status Codes

I recommend bookmarking the status code definitions documented by the W3C. However, I want to provide a quick reference guide specifically for SEO.

200

OK or Success. This is the response code you want to see most often. At a minimum, I want Googlebot to see a 200 response code in 90% or more instances during a crawl.

301

Moved permanently. This is the right way to redirect, telling search engines to index that content in a new location.

302

Moved temporarily. This is the wrong way to redirect (except in very rare cases). You’re essentially putting the content into limbo: it’s no longer at the original location, but search engines won’t index the temporary location either.

304

Not modified. This can be used for crawl efficiency, telling search engines that the content has not changed. You’re basically telling Googlebot not to bother and to move on to other content. The advent of Caffeine may have made this unnecessary but I think it’s still worthwhile.

404

Not found. This happens when the server can’t find any content at that specific location. Too many 404s are bad; in my experience they act as a negative algorithmic signal. Google simply doesn’t trust that sending a user to that site will be a positive experience.

I don’t have a hard and fast number for when 404s become problematic. I believe it’s probably based on a percentage of total requests to that site. As such, it’s just good practice to reduce the number of 404s.

That does not mean zero! I don’t recommend putting a 301 in place when it should return a 404. A request for a non-existent URL like domain.com/foo should return a 404. Ditto for returning a 200 when it should be a 404. (Yes, I’ve seen this lately.) I’d be surprised if having no 404s at all wasn’t also some sort of red flag.

410

Gone. If you know that content no longer exists, just say so. Don’t encourage Googlebot to come back again and again and again via a 404 which doesn’t tell it why that page no longer exists.

500

Internal Server Error. This generally means the server hit an error and the client never received an appropriate response. 500 errors basically tell the search engine that the site isn’t available. Too many 500 errors call into question the reliability of that site. Google doesn’t want to send users to a site that ultimately times out and doesn’t load.

How to Track Status Codes

There are a number of ways you can track status codes. For spot checking purposes, I recommend installing one of two Firefox add-ons: HttpFox or Live HTTP Headers. These add-ons let you look at the communication between the user agent and the server. For example, here’s what happens when I type ‘www.searchengineland.com’ directly into my browser bar.

HttpFox Example

Using HttpFox I see that it performs a 301 redirect to the non-www version and then resolves successfully. Google Webmaster Tools also provides you with nice insight through the Crawl Errors reporting interface.
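
If you’d rather script a spot check than use a browser add-on, the same thing is easy to do in a few lines of Python. This is just a sketch using the requests library, with the same example URL from above.

```python
import requests

# Spot check: request a URL, follow redirects and print each hop's status code.
response = requests.get("http://www.searchengineland.com/", allow_redirects=True, timeout=10)

# response.history holds every redirect hop; response itself is the final destination.
for hop in response.history:
    print(hop.status_code, hop.url)
print(response.status_code, response.url)
```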

But if you really want to use status codes to your benefit you’ll need to count and track them every day via log file analysis. I recommend creating a daily output that provides the count of status codes encountered by Googlebot and Bingbot.
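
What that daily output looks like depends entirely on your log format and pipeline, but here is a rough sketch for a combined-format access log. The file path and regex are assumptions; adjust them for your own servers.

```python
import re
from collections import Counter

# Rough sketch: tally the status codes served to Googlebot and Bingbot in one day's log.
# Assumes a standard combined-format access log; adjust LOG_FILE and the regex as needed.
LOG_FILE = "access.log"
LINE_RE = re.compile(r'"[A-Z]+ \S+ HTTP/[\d.]+" (?P<status>\d{3}) .* "(?P<agent>[^"]*)"$')

counts = {"Googlebot": Counter(), "bingbot": Counter()}

with open(LOG_FILE) as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        for bot, tally in counts.items():
            if bot in match.group("agent"):
                tally[match.group("status")] += 1

for bot, tally in counts.items():
    print(bot, dict(tally))
```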

Status Code Reports

Using those daily numbers you can construct insightful and actionable dashboard graphs.

Sample Status Code Reports

While this may take some doing, the investment is worthwhile. You can quickly identify and resolve 404s and 500s. Many will also find it helpful to have this data (concrete numbers!) when prioritizing issues within a larger organization.

You’ll also gain insight into how long it takes search engines to ‘digest’ a 301 and much more. Status code management can be a valuable part of an advanced SEO program.

Optimize Your Sitemap Index

January 11 2011 // Analytics + SEO // 13 Comments

Information is power. It’s no different in the world of SEO. So here’s an interesting way to get more information on indexation by optimizing your sitemap index file.

What is a Sitemap Index?

A sitemap index file is simply a group of individual sitemaps, using an XML format similar to a regular sitemap file.

You can provide multiple Sitemap files, but each Sitemap file that you provide must have no more than 50,000 URLs and must be no larger than 10MB (10,485,760 bytes). [...] If you want to list more than 50,000 URLs, you must create multiple Sitemap files.

If you do provide multiple Sitemaps, you should then list each Sitemap file in a Sitemap index file.

Most sites begin using a sitemap index file out of necessity when they bump up against the 50,000 URL limit for a sitemap. Don’t tune out if you don’t have that many URLs. You can still use a sitemap index to your benefit.

Googling a Sitemap Index

I’m going to search for a sitemap index to use as an example. To do so I’m going to use the inurl: and site: operators in conjunction.

Google a Sitemap Index

Best Buy was top of mind since I recently bought a TV there and I have a Reward Zone credit I need to use. The sitemap index wasn’t difficult to find in this case. However, they don’t have to be named as such. So if you’re doing some competitive research you may need to poke around a bit to find the sitemap index and then validate that it’s the correct one.

Opening a Sitemap Index

You can then click on the result and see the individual sitemaps.

Inspect Sitemap Index

Here’s what the sitemap index looks like. A listing of each individual sitemap. In this case there are 15 of them, all sequentially numbered.
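
For reference, a sitemap index is nothing more than a thin XML wrapper around a list of sitemap URLs. Here’s a minimal sketch in Python that writes one; the URLs are placeholders, not Best Buy’s actual files.

```python
# Minimal sketch: write a sitemap index file for a list of sitemap URLs (placeholders).
sitemaps = [
    "http://www.example.com/sitemap-001.xml.gz",
    "http://www.example.com/sitemap-002.xml.gz",
]

entries = "\n".join(
    "  <sitemap>\n    <loc>{}</loc>\n  </sitemap>".format(url) for url in sitemaps
)

index = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</sitemapindex>\n"
)

with open("sitemap_index.xml", "w") as f:
    f.write(index)
```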

Looking at a Sitemap

The sitemaps are compressed using gzip so you’ll need to extract them to look at an individual sitemap. Copy the URL into your browser bar and the rest should take care of itself. Fire up your favorite text editor and you’re looking at the individual URLs that comprise that sitemap.

Best Buy Sitemap Example

So within one of these sitemaps I quickly find URLs for a TV, a Digital Camera and a Video Game. They are all product pages but there doesn’t seem to be any grouping by category. This is standard, but it’s not what I’d call optimized.

Sitemap Index Metrics

Within Google Webmaster Tools you’ll be able to see the number of URLs submitted and the number indexed for each sitemap.

Here’s an example (not Best Buy) of sitemap index reporting in Google Webmaster Tools.

Sitemap Index Metric Sample

So in the case of the Best Buy sitemap index, they’d be able to drill down and know the indexation rate for each of their 15 sitemaps.

What if you created those sitemaps with a goal in mind?

Sitemap Index Optimization

Instead of using some sequential process and mixing products from multiple categories in an individual sitemap, what if you created a sitemap specifically for each product type?

  • sitemap.tv.xml
  • sitemap.digital-cameras.xml
  • sitemap.video-games.xml

In the case of video games you might need multiple sitemaps if the URL count exceeds 50,000. No problem.

  • sitemap.video-games-1.xml
  • sitemap.video-games-2.xml

Now, you’d likely have more than 15 sitemaps at this point but the level of detail you suddenly get on indexation is dramatic. You could instantly find that TVs were indexed at a 95% rate while video games were indexed at a 56% rate. This is information you can use and act on.

It doesn’t have to be one dimensional either; you can pack a lot of information into individual sitemaps. For instance, maybe Best Buy would like to know the indexation rate by product type and page type. By that I mean the indexation rate of category pages (lists of products) versus product pages (individual product pages).

To do so is relatively straightforward: just split each product type into separate page type sitemaps, as sketched after the list below.

  • sitemap.tv.category.xml
  • sitemap.tv.product.xml
  • sitemap.digital-camera.category.xml
  • sitemap.digital-camera.product.xml
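
Here’s a sketch of how that split might be automated. The classify function is entirely hypothetical; in practice the product type and page type would come from your catalog or URL structure.

```python
# Sketch: partition URLs into per-product-type, per-page-type sitemaps,
# splitting any group that exceeds the 50,000 URL limit.
MAX_URLS = 50000

def classify(url):
    # Hypothetical mapping, e.g. "/tv/samsung-55-led" -> ("tv", "product").
    parts = url.strip("/").split("/")
    return parts[0], ("category" if len(parts) == 1 else "product")

def build_sitemap_groups(urls):
    groups = {}
    for url in urls:
        groups.setdefault(classify(url), []).append(url)

    sitemap_files = {}
    for (product_type, page_type), group in groups.items():
        chunks = [group[i:i + MAX_URLS] for i in range(0, len(group), MAX_URLS)]
        for n, chunk in enumerate(chunks, start=1):
            suffix = "-{}".format(n) if len(chunks) > 1 else ""
            name = "sitemap.{}.{}{}.xml".format(product_type, page_type, suffix)
            sitemap_files[name] = chunk
    return sitemap_files

# e.g. build_sitemap_groups(["/tv/samsung-55-led", "/tv/"]) yields
# {"sitemap.tv.product.xml": [...], "sitemap.tv.category.xml": [...]}
```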

And so on and so forth. Grab the results from Webmaster Tools, drop them into Excel, and in no time you’ll be able to slice and dice the indexation rates to answer questions like these: What’s the indexation rate for category pages versus product pages? What’s the indexation rate by product type?
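
You don’t even need Excel for the basic arithmetic. Here’s a rough sketch with made-up submitted/indexed counts (not real Webmaster Tools data):

```python
from collections import defaultdict

# Made-up numbers: sitemap name -> (URLs submitted, URLs indexed).
sitemap_stats = {
    "sitemap.tv.category.xml": (1200, 1140),
    "sitemap.tv.product.xml": (48000, 41000),
    "sitemap.digital-camera.category.xml": (800, 700),
    "sitemap.digital-camera.product.xml": (30000, 21000),
}

by_product_type = defaultdict(lambda: [0, 0])
by_page_type = defaultdict(lambda: [0, 0])

for name, (submitted, indexed) in sitemap_stats.items():
    _, product_type, page_type, _ = name.split(".")
    for bucket in (by_product_type[product_type], by_page_type[page_type]):
        bucket[0] += submitted
        bucket[1] += indexed

for label, buckets in (("product type", by_product_type), ("page type", by_page_type)):
    for key, (submitted, indexed) in buckets.items():
        print("{} {}: {:.0%} indexed".format(label, key, indexed / submitted))
```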

You can get pretty granular if you want, though each sitemap index can only hold 50,000 sitemaps. Then again, you’re not limited to just one sitemap index either!

In addition, you don’t need 50,000 URLs to use a sitemap index. Each sitemap could contain a small number of URLs, so don’t pass on this type of optimization thinking it’s just for big sites.

Connecting the Dots

Knowing the indexation rate for each ‘type’ of content gives you an interesting view into what Google thinks of specific pages and content. The two other pieces of the puzzle are what happens before (crawl) and after (traffic). Both of these can be solved.

Crawl tracking can be done by mining weblogs for Googlebot (and Bingbot) using the same sitemap criteria. That way, not only do you know how much the bots are crawling each day, you know where they’re crawling. As you make SEO changes, you can then see how they impact the crawl and follow it through to indexation.

The last step is mapping it to traffic. This can be done by creating Google Analytics Advanced Segments that match the sitemaps using regular expressions. (RegEx is your friend.) With that in place, you can track changes in the crawl to changes in indexation to changes in traffic. Nirvana!
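
One pattern per sitemap ‘type’ can do double duty here: the same regular expressions that define your Google Analytics Advanced Segments can bucket Googlebot hits in your logs. A sketch, assuming a hypothetical URL structure where the first path segment is the product type:

```python
import re

# Hypothetical URL patterns, one per sitemap "type"; reuse them in Google Analytics
# Advanced Segments and in your crawl log analysis.
PATTERNS = {
    "tv": re.compile(r"^/tv/"),
    "digital-cameras": re.compile(r"^/digital-cameras/"),
    "video-games": re.compile(r"^/video-games/"),
}

def bucket(path):
    for name, pattern in PATTERNS.items():
        if pattern.search(path):
            return name
    return "other"

# e.g. bucket("/tv/samsung-55-led") -> "tv"
```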

Go to the Moon

Doing this is often not an easy exercise and may, in fact, require a hard look at site architecture and URL naming conventions. That might not be a bad thing in some cases. And I have implemented this enough times to see the tremendous value it can bring to an organization.

I know I covered a lot of ground so please let me know if you have any questions.

2011 Predictions

December 31 2010 // Analytics + Marketing + SEO + Social Media + Technology + Web Design // 3 Comments

Okay, I actually don’t have any precognitive ability but I might as well have some fun while predicting events in 2011. Let’s look into the crystal ball.

2011 Search Internet Technology Predictions

Facebook becomes a search engine

The Open Graph is just another type of index. Instead of crawling the web like Google, Facebook lets users do it for them. Facebook is creating a massive graph of data and at some point they’ll go all Klingon on Google and uncloak with several Birds of Prey surrounding search. Game on.

Google buys Foursquare

Unless you’ve been under a rock for the last 6 months it’s clear that Google wants to own local. They’re dedicating a ton of resources to Places and decided that getting citations from others was nice but generating your own reviews would be better. With location based services just catching on with the mainstream, Google will overpay for Foursquare and bring check-ins to the masses.

UX becomes more experiential

Technology (CSS3, Compass, HTML5, jQuery, Flash, AJAX and various noSQL databases to name a few) transforms how users experience the web. Sites that allow users to seamlessly understand applications through interactions will be enormously successful.

Google introduces more SEO tools

Google Webmaster Tools continues to launch tools that will help people understand their search engine optimization efforts. Just like they did with Analytics, Google will work hard in 2011 to commoditize SEO tools.

Identity becomes important

As the traditional link graph becomes increasingly obsolete, Google seeks to leverage social mentions and links. But to do so (in any major way) without opening a whole new front of spam, they’ll work on defining reputation. This will inevitably lead them to identity and the possible acquisition of Rapleaf.

Internet congestion increases

Internet congestion will increase as more and more data is pushed through the pipe. Apps and browser add-ons that attempt to determine the current congestion will become popular and the Internati will embrace this as their version of Greening the web. (Look for a Robert Scoble PSA soon.)

Micropayments battle paywalls

As the appetite for news and digital content continues to swell, a start-up will pitch publications on a micropayment solution (pay per pageview perhaps) as an alternative to subscription paywalls. The start-up may be new or may be one with a large installed user base that hasn’t solved revenue. Or maybe someone like Tynt? I’m crossing my fingers that it’s whoever winds up with Delicious.

Gaming jumps the shark

This is probably more of a hope than a real prediction. I’d love to see people dedicate more time to something (anything!) other than the ‘push-button-receive-pellet’ games. I’m hopeful that people do finally burn out, that the part of the cortex that responds to this type of gratification finally becomes inured to this activity.

Curation is king

The old saw is content is king. But in 2011 curation will be king. Whether it’s something like Fever, my6sense or Blekko, the idea of transforming noise into signal (via algorithm and/or human editing) will be in high demand, as will different ways to present that signal such as Flipboard and Paper.li.

Retargeting wins

What people do will outweigh what people say as retargeting is both more effective for advertisers and more relevant for consumers. Privacy advocates will howl and ally themselves with the government. This action will backfire as the idea of government oversight is more distasteful than that of corporations.

Github becomes self aware

Seriously, have you looked at what is going on at Github? There’s a lot of amazing work being done. So much so that Github will assemble itself Voltron style and become a benevolently self-aware organism that will be our digital sentry protecting us from Skynet.

Google Split Testing Tool

December 23 2010 // Analytics + SEO // Comment

In November Matt Cutts asked ‘What would you do if you were CEO of Google?‘ He was essentially asking readers for a wish list of big ideas. I submitted a few but actually forgot what would be at the top of my list.

Google Christmas

Google A/B Testing

Google does bucket testing all the time. Bucket testing is just another (funnier) word for split testing or A/B testing.

A/B testing, split testing or bucket testing is a method of marketing testing by which a baseline control sample is compared to a variety of single-variable test samples in order to improve response rates. A classic direct mail tactic, this method has been recently adopted within the interactive space to test tactics such as banner ads, emails and landing pages.

Google provides this functionality through paid search via AdWords. Any reputable PPC marketer knows that copy testing is critical to the success of a paid search campaign.

SERP Split Testing Tool

Why not have split testing for SEO? I want to be able to test different versions of my Title and Meta Description for natural search. Does a call to action in my meta description increase click-through rate (CTR)? Does having my site or brand in my Title really make a difference?

As search marketers we know the value of copy testing. And Google should want this as well. Wouldn’t a higher CTR (without an increase in pogosticking) be an indication of a better user experience? Over time, wouldn’t iterative copy testing result in higher quality SERPs?

Google could even ride shotgun and learn more about user behavior. If you need a new buzz word to get it off the ground, try crowd sourced bucket testing on for size.

This new testing tool could live within Google Webmaster Tools, and Google should be able to limit the number of outside variables by ensuring the test is only served on one data cluster. For extra credit Google could even calculate the statistical significance of the results. Maybe you partner with (or purchase) someone like Optimizely to make it happen.

If this tool is on your Christmas list, please Tweet this post.

SEO Metrics Dashboard

December 20 2010 // Analytics + SEO // 6 Comments

There are plenty of SEO metrics staring you right in the face, as the folks at SEOmoz recently pointed out.

SEO Metrics Dashboard

I’ll quickly review the SEO metrics I’ve tracked and used for years. Combined they make a decent SEO metrics dashboard.

SEO Visits

Okay, turn in your SEO credentials if you’re not tracking this. Google Analytics makes it easy with its built-in Non-paid Search Traffic default advanced segment.

Non-paid Search Traffic Segment

However, be careful to measure by the week when using this advanced segment. A longer time frame can often lead to sampling. You do not want to see this. It’s the Google Analytics version of the Whammy.

Sampled Data Whammy

Alternatively, you can avoid the default advanced segment and instead navigate to All Traffic -> Search Engines (Non-Paid) or drill down under All Traffic Sources to Medium -> Organic. Beware, you still might run into the sampling whammy if you’re looking at longer time frames.

SEO Landing Pages

In Google Analytics, use the drop-down menu to determine how many landing pages drove SEO traffic each week.

SEO Metrics

I’m less concerned with the actual pages than with simply knowing the raw number of pages that brought SEO traffic to the site in a given week.

SEO Keywords

Similarly, using the Google Analytics drop down menu, you can determine how many keywords drove SEO traffic by week.

SEO Metrics

Again, the actual keywords are less important to me (at this point) than the weekly volume.

Indexed Pages

Each week I also capture the number of indexed pages. I used to do this using the site: operator but have been using Google Webmaster Tools for quite a while since it seems more accurate and stable.

If you go the Webmaster Tools route, make certain that you have your sitemap(s) submitted correctly since duplicate sitemaps can often lead to inflated indexation numbers.

Calculated Fields

With those four pieces of data I create five calculated metrics.

  • Visits/Keywords
  • Visits/Landing Pages
  • Keywords/Landing Pages
  • Visits/Indexed Pages
  • Landing Pages/Indexed Pages
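
The arithmetic itself is trivial; here’s a minimal sketch with placeholder weekly numbers so you can see how the five ratios fall out of the four raw inputs.

```python
# Placeholder weekly numbers; swap in your own exports.
visits = 12500
keywords = 3900
landing_pages = 1450
indexed_pages = 8200

metrics = {
    "Visits/Keywords": visits / keywords,
    "Visits/Landing Pages": visits / landing_pages,
    "Keywords/Landing Pages": keywords / landing_pages,
    "Visits/Indexed Pages": visits / indexed_pages,
    "Landing Pages/Indexed Pages": landing_pages / indexed_pages,
}

for name, value in metrics.items():
    print("{}: {:.2f}".format(name, value))
```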

These calculated metrics are where I find the most benefit. While I do track them separately, analysis can only be performed by looking at how these metrics interact with each other. Let me say it again, do not look at these metrics in isolation.

SEO Metrics

Inevitably I get asked, is such-and-such a number a good Visits/Landing Pages number? The thing is there are no good or bad numbers (within reason). The idea is to measure (and improve) the performance of these metrics over time and to use them to diagnose changes in SEO traffic.

Visits/Keywords

This metric can often provide insight into how well you’re ranking. When it goes up, your overall rank may be rising. However, it could also be influenced by seasonal search volume. For example, if you were analyzing a site that provided tax advice, I’d guess that the Visits/Keywords metric would go up during April due to the increased volume for tax terms.

Remember, these metrics are high level indicators. They’re a warning system. When one of the indicators changes, you investigate to determine the reason the metric changed. Did you get more visits or did you receive the same traffic from fewer keywords? Find out and then act accordingly.

Visits/Landing Pages

The Visits/Landing Pages metric usually tells me how effective an average page is at attracting SEO traffic. Again, look under the covers before you make any hasty decisions. An increase in this metric could be the product of fewer landing pages. That could be a bad sign, not a good one.

In particular, look at how Visits/Keywords and Visits/Landing Pages interact.

Keywords/Landing Pages

I use this metric to track keyword clustering. This is particularly nice if you’re launching a new set of content. Once published and indexed you often see the Keywords/Landing Pages metric go down. New pages may not attract a lot of traffic immediately and the ones that do often only bring in traffic from a select keyword.

However, as these pages mature they begin to bring in more traffic; first from just a select group of keywords and then (if things are going well) you’ll find they begin to bring in traffic from a larger group of keywords. That is keyword clustering and it’s one of the ways I forecast SEO traffic.

Visits/Indexed Pages

I like to track this metric as a general SEO health metric. It tells me about SEO efficiency. Again, there is no real right or wrong number here. A site with fewer pages, but ranking well for a high volume term may have a very high Visits/Indexed Pages metric. A site with a lot of pages (which is where I do most of my work) may be working the long-tail and will have a lower Visits/Indexed Pages number.

The idea is to track and monitor the metric over time. If you’re launching a whole new category for an eCommerce site, those pages may get indexed quickly but not generate the requisite visits right off the bat. Whether the Visits/Indexed Pages metric bounces back as those new pages mature is what I focus on.

Landing Pages/Indexed Pages

This metric gives you an idea of what percentage of your indexed pages are driving traffic each week. This is another efficiency metric. Sometimes this leads me to investigate which pages are working and which aren’t. Is there a crawl issue? Is there an architecture issue? It can often lead to larger discussions about what a site is focused on and where it should dedicate resources.

Measure Percentage Change

Once you plug in all of these numbers and generate the calculated metrics you might look at the numbers and think they’re not moving much. Indeed, from a raw number perspective they sometimes don’t move that much. That’s why you must look at them by percentage change.

SEO Metrics by Percentage Change

For instance, for a large site moving the Visits/Keyword metric from 3.2 to 3.9 may not look like a lot. But it’s actually a 22% increase! And when your SEO traffic changes you can immediately look at the percentage change numbers to see what metric moved the most.

To easily measure the percentage change I recommend creating another tab in your spreadsheet and making that your percentage change view. So you wind up having a raw number tab and a percentage change tab.
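
The percentage change tab is just the standard week-over-week formula applied to each metric; the 3.2 to 3.9 example above works out like this.

```python
def pct_change(previous, current):
    # Standard week-over-week percentage change.
    return (current - previous) / previous * 100

print(round(pct_change(3.2, 3.9)))  # 22 (percent)
```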

SEO Metrics Analysis

I’m going to do a quick analysis looking back at some of this historical data. In particular I’m going to look at the SEO traffic increase between 3/23/08 and 3/30/08.

SEO Metric Analysis

That’s a healthy jump in SEO traffic. Let there be much rejoicing! To quickly find out what exactly drove that increase I’ll switch to the percentage change view of these metrics.

SEO Metrics Analysis

In this view you quickly see that the 33% increase in SEO traffic was driven almost exclusively by a 28% increase in Keywords. This was an instance where keyword clustering took effect and pages began receiving traffic for more (related) query terms. Look closely and you’ll notice that this increase occurred despite a decrease of 2% in number of Landing Pages.

Of course the next step would be to determine if certain pages or keyword modifiers were most responsible for this increase. Find the pattern and you have a shot at repeating it.

Graph Your SEO Metrics

If you’re more visual in nature create a third tab and generate a graph for each metric. Put them all on the same page so you can see them together. This comprehensive trend view can often bring issues to the surface quickly. Plus … it just looks cool.

Add a Filter

If you’re feeling up to it you can create the same dashboard based on a filter. The most common filter would be conversion. To do so you build an Advanced Segment in Google Analytics that looks for any SEO traffic with a conversion. Apply that segment, pull the Visits, Landing Pages and Keywords numbers again, and then generate new calculated metrics.

At that point you’re looking at these metrics through a performance filter.

The End is the Beginning

Circular Google Logo

This SEO metrics dashboard is just the tip of the iceberg. Creating detailed crawl and traffic reports will be necessary. But if you start with the metrics outlined above, they should lead you to the right reports. Because the questions they’ll raise can only be answered by doing more due diligence.

Bounce Rate vs Exit Rate

November 15 2010 // Analytics + SEO // 19 Comments

One of the most common Google Analytics questions I get is to explain the difference between bounce rate and exit rate. Here’s what I hope is a simple explanation.

Bounce Rate

Bounce Rate

Bounce rate is the percentage of people who landed on a page and immediately left. Bounces are always one-page sessions.

High bounce rates are often bad, but it’s really a matter of context. Some queries may inherently generate high bounce rates. Specific informational queries (e.g. – What are the flavors of Otter Pops?) might yield high bounce rates. If the page fulfills the query intent, there may be no further reason for the user to engage. It doesn’t mean it was a bad experience, it just means they got exactly what they wanted and nothing more. (I was always partial to Louie-Bloo Raspberry or Alexander the Grape.)

A high bounce rate on a home page is usually a sign that something is wrong. But again, make sure you take a close look at the sources and keywords that are driving traffic. You might have a very low bounce rate for some keywords and very high for others. Maybe you’re getting a lot of StumbleUpon traffic which, by its very nature, has a high bounce rate.

Bounce rate is important but always make sure you look beyond the actual number.

Exit Rate

Exit Rate

Exit rate is the percentage of people who left your site from that page. Unlike bounces, those visitors may have viewed more than one page in a session. That means they may not have landed on that page, but simply found their way to it through site navigation.
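
To make the distinction concrete, here’s a small sketch that computes both numbers for a single page from a few made-up sessions, following the standard Google Analytics definitions (bounces over entrances, exits over pageviews).

```python
# Made-up sessions: each is the ordered list of pages viewed in one visit.
sessions = [
    ["/article"],                       # landed on /article and left: a bounce and an exit
    ["/home", "/article", "/pricing"],  # passed through /article, exited elsewhere
    ["/home", "/article"],              # reached /article via navigation, exited there
]

def bounce_and_exit_rate(page, sessions):
    entrances = sum(1 for s in sessions if s[0] == page)
    bounces = sum(1 for s in sessions if s == [page])
    pageviews = sum(s.count(page) for s in sessions)
    exits = sum(1 for s in sessions if s[-1] == page)
    return bounces / entrances, exits / pageviews

bounce_rate, exit_rate = bounce_and_exit_rate("/article", sessions)
print("bounce rate: {:.0%}, exit rate: {:.0%}".format(bounce_rate, exit_rate))
# bounce rate: 100% (1 of 1 entrances), exit rate: 67% (2 exits over 3 pageviews)
```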

Like bounce rates, high exit rates can often reveal problem areas on your site. But the same type of caution needs to be applied. If you have a paginated article – say four pages – and the exit rate on the last page is high, is that really a bad thing? They’ve reached the end of the article. It may be natural for them to leave at that point.

Of course, you’ll want to try different UX treatments for surfacing related articles or encouraging social interactions to reduce the exit rate, but the fact that it was high to begin with shouldn’t create panic.

Exit rate should be looked at within a relative navigation context. Pages that should naturally create further clicks, but don’t, are ripe for optimization.

(Extra points if you get my visual ‘bounce’ reference.)

But There’s More! I’ve developed the Ultimate Guide to Bounce Rate to answer all of your bounce rate questions. This straightforward guide features Ron Paul, The Rolling Stones and Nyan Cat. You’re sure to learn something and be entertained at the same time.

How To Get 100 Likes From 2 People

November 08 2010 // Analytics + Social Media // 9 Comments

The other day I wrote about the potential for inflated Like numbers. In particular, I was interested in how comments were factored into the Like total. It was pretty clear that Likes and comments were not mutually exclusive. But were comments a count of unique contributors or simply a total count of comments?

The Like Experiment

So, I ran a small experiment using an old satirical blog post: LOLCats and Religion: A Dissertation.

This post originally had two shares but no Likes or comments. So I went ahead and Liked it and asked my colleague Jeremy Post to have a comment dialog on the item. In all, we generated 10 comments.

Facebook Comments

One of my concerns was that comments might not always relate to the item and interestingly enough we actually did switch topics during the dialog from LOLCats to Dune. Go figure. (Note to self - fix image being attributed to blog posts.)

The Like Results

So what was the result? How many Likes did this old post rack up due to this comment stream? Sure enough, every comment is counted as a Like.

Facebook Like Numbers

A quick check using my Facebook Like Number Bookmarklet reveals how the number is calculated.

Facebook Like Count

So, did 13 others like this? No, it’s just two people having a conversation on a shared item. And that’s how you could get …

100 Likes from 2 People on 1 Item

Don’t Average CTR

November 08 2010 // Analytics + PPC + Rant + SEO // 2 Comments

One of the biggest errors I see (consistently) in SEO and PPC analysis is using Excel’s AVERAGE function on Click Through Rate (CTR). As I mentioned in my SEO Pivot Tables post, do not do this. Here’s why averaging CTR is dangerous.

Take the following set of 10 data points.

Don't Average Click Through Rate

If you SUM all of the Impressions and Clicks and then do the CTR calculation you arrive at 10.05%. If you AVERAGE the 10 CTR percentages you arrive at 6.14%.
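
Here’s the same trap in miniature with made-up numbers (not the data set in the screenshot): one high-volume row plus a few tiny ones is all it takes.

```python
# Made-up (impressions, clicks) rows: one high-volume row plus several tiny ones.
rows = [(10000, 100), (10, 5), (20, 8), (15, 6)]

# The right way: sum the clicks and impressions, then divide.
aggregate_ctr = sum(clicks for _, clicks in rows) / sum(imps for imps, _ in rows)

# The trap: average the per-row CTR percentages.
averaged_ctr = sum(clicks / imps for imps, clicks in rows) / len(rows)

print("aggregate CTR: {:.2%}".format(aggregate_ctr))  # 1.18%
print("averaged CTR:  {:.2%}".format(averaged_ctr))   # 32.75%
```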

If I change the Clicks for these 10 data points I can produce the opposite effect.

Don't Average CTR

And will you look at that, the average CTR is the same in both instances. Can you see how misleading averaging can be here?

Don’t Average Click Through Rate

For years, I’ve used a structured Excel quiz in my hiring process that tests just this issue. In my experience upwards of 50% of applicants fail the quiz. If you’re pulling down data into Excel for PPC or SEO, make sure you don’t fall into this trap.

Facebook Like Number Bookmarklets

November 05 2010 // Analytics + SEO + Social Media // 2 Comments

Want to know the Facebook Like statistics for the page you’re on? No problem.

Facebook Like Number Bookmarklets

Using the old REST API you can find out the Facebook Like statistics for any page. For easy access, simply drag these two links to your bookmark bar.
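
Under the hood the bookmarklets hit the links.getStats method of Facebook’s old REST API. That API has long since been retired, so treat this as a purely historical sketch of the request they constructed; the page URL is a hypothetical example.

```python
from urllib.parse import quote

# Historical sketch only: the old REST API behind these bookmarklets has been retired,
# so this request no longer returns data. The page URL is a hypothetical example.
page_url = "http://www.example.com/some-article"
request_url = (
    "https://api.facebook.com/restserver.php"
    "?method=links.getStats&format=json&urls=" + quote(page_url, safe="")
)
print(request_url)
# The JSON response broke the headline "Like" number into components such as
# like_count, comment_count and share_count alongside total_count.
```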

FB Stats: Current Page

FB Stats: Home Page

The Current Page bookmarklet will provide Like statistics for the page you’re on. So, if you were on the ReadWriteWeb article about Facebook Places Deals you can click on this bookmarklet and be provided with the Like statistics for that page.

Facebook Like Bookmarklet

The Home Page bookmarklet will provide Like statistics for the home page for the site you’re on. Please note that this is not showing the aggregate Like statistics for the entire site, but just that of the home page.

Like Number Use Cases

Why are these bookmarklets useful apart from idle curiosity?

First off, you can determine the true number of Likes. Second, they provide competitive intelligence and potential insight into Facebook’s search algorithm (aka Facebook SEO). Do pages with a higher distribution of comments get a higher weight? I’m not sure.

This is one way to begin understanding the ways in which pages enter the Open Graph and how they are treated based on Like activity.