
Optimize Your Sitemap Index

January 11 2011 // Analytics + SEO // 13 Comments

Information is power. It's no different in the world of SEO. So here's an interesting way to get more information on indexation by optimizing your sitemap index file.

What is a Sitemap Index?

A sitemap index file is simply a group of individual sitemaps, using an XML format similar to a regular sitemap file.

You can provide multiple Sitemap files, but each Sitemap file that you provide must have no more than 50,000 URLs and must be no larger than 10MB (10,485,760 bytes). [...] If you want to list more than 50,000 URLs, you must create multiple Sitemap files.

If you do provide multiple Sitemaps, you should then list each Sitemap file in a Sitemap index file.

Most sites begin using a sitemap index file out of necessity when they bump up against the 50,000 URL limit for a sitemap. Don't tune out if you don't have that many URLs. You can still use a sitemap index to your benefit.
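The format itself is tiny. As a quick sketch, here's how you might generate a two-sitemap index with Python's standard library (the domain and file names are placeholders):

```python
import xml.etree.ElementTree as ET

# A sitemap index is a <sitemapindex> root containing one
# <sitemap> entry (with a <loc>) per individual sitemap file.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Placeholder sitemap URLs for illustration.
sitemap_urls = [
    "http://www.example.com/sitemap1.xml.gz",
    "http://www.example.com/sitemap2.xml.gz",
]

index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
for url in sitemap_urls:
    entry = ET.SubElement(index, "sitemap")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(index).write("sitemap_index.xml",
                            encoding="utf-8", xml_declaration=True)
```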

Googling a Sitemap Index

I'm going to search for a sitemap index to use as an example. To do so I'll use the inurl: and site: operators in conjunction (e.g. site:example.com inurl:sitemap).

Google a Sitemap Index

Best Buy was top of mind since I recently bought a TV there and I have a Reward Zone credit I need to use. The sitemap index wasn't difficult to find in this case. However, sitemap indexes don't have to be named as such. So if you're doing some competitive research you may need to poke around a bit to find the sitemap index and then validate that it's the correct one.

Opening a Sitemap Index

You can then click on the result and see the individual sitemaps.

Inspect Sitemap Index

Here's what the sitemap index looks like: a listing of each individual sitemap. In this case there are 15 of them, all sequentially numbered.

Looking at a Sitemap

The sitemaps are compressed using gzip so you'll need to extract them to look at an individual sitemap. Copy the URL into your browser bar and the rest should take care of itself. Fire up your favorite text editor and you're looking at the individual URLs that comprise that sitemap.
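If you'd rather not do that by hand, a few lines of Python will fetch and decompress a gzipped sitemap for you. A minimal sketch (the URL is a placeholder):

```python
import gzip
import urllib.request

# Placeholder URL for a gzipped sitemap.
url = "http://www.example.com/sitemap1.xml.gz"

with urllib.request.urlopen(url) as response:
    xml = gzip.decompress(response.read()).decode("utf-8")

# Peek at the first few lines of the extracted sitemap.
print("\n".join(xml.splitlines()[:10]))
```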

Best Buy Sitemap Example

So within one of these sitemaps I quickly find that there are URLs that go to a TV, a Digital Camera and a Video Game. They are all product pages but there doesn't seem to be any grouping by category. This is standard, but it's not what I'd call optimized.

Sitemap Index Metrics

Within Google Webmaster Tools you'll be able to see the number of URLs submitted and the number indexed by sitemap.

Here's an example (not Best Buy) of sitemap index reporting in Google Webmaster Tools.

Sitemap Index Metric Sample

So in the case of the Best Buy sitemap index, they'd be able to drill down and know the indexation rate for each of their 15 sitemaps.

What if you created those sitemaps with a goal in mind?

Sitemap Index Optimization

Instead of using some sequential process and having products from multiple categories in an individual sitemap, what if you created a sitemap specifically for each product type?

  • sitemap.tv.xml
  • sitemap.digital-cameras.xml
  • sitemap.video-games.xml

In the case of video games you might need multiple sitemaps if the URL count exceeds 50,000. No problem.

  • sitemap.video-games-1.xml
  • sitemap.video-games-2.xml

Now, you'd likely have more than 15 sitemaps at this point but the level of detail you suddenly get on indexation is dramatic. You could instantly find that TVs were indexed at a 95% rate while video games were indexed at a 56% rate. This is information you can use and act on.
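Generating that split is mostly a matter of grouping your URLs before writing the files. Here's a rough sketch, assuming you can derive a product type from each URL; the categorize function is hypothetical and would depend on your URL structure:

```python
from collections import defaultdict

MAX_URLS = 50000  # per-sitemap URL limit

def categorize(url):
    # Hypothetical: derive the product type from the URL path,
    # e.g. http://www.example.com/tv/... -> "tv"
    return url.split("/")[3]

def build_sitemap_groups(urls):
    groups = defaultdict(list)
    for url in urls:
        groups[categorize(url)].append(url)

    # Split any category over the 50,000 URL limit into
    # numbered files: sitemap.video-games-1.xml, and so on.
    sitemaps = {}
    for category, members in groups.items():
        if len(members) <= MAX_URLS:
            sitemaps[f"sitemap.{category}.xml"] = members
        else:
            for i in range(0, len(members), MAX_URLS):
                name = f"sitemap.{category}-{i // MAX_URLS + 1}.xml"
                sitemaps[name] = members[i:i + MAX_URLS]
    return sitemaps
```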

It doesn't have to be one dimensional either; you can pack a lot of information into individual sitemaps. For instance, maybe Best Buy would like to know the indexation rate by product type and page type. By this I mean, would Best Buy want to know the indexation rate of category pages (lists of products) versus product pages (individual product pages)?

To do so would be relatively straightforward. Just split each product type into separate page type sitemaps.

  • sitemap.tv.category.xml
  • sitemap.tv.product.xml
  • sitemap.digital-camera.category.xml
  • sitemap.digital-camera.product.xml

And so on and so forth. Grab the results from Webmaster Tools and drop them into Excel and in no time you'll be able to slice and dice the indexation rates to answer the following questions. What's the indexation rate for category pages versus product pages? What's the indexation rate by product type?
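If spreadsheets aren't your thing, the same slicing is a few lines of pandas. This sketch assumes you've exported submitted and indexed counts per sitemap to a CSV; the column names are made up:

```python
import pandas as pd

# Hypothetical export: one row per sitemap, with columns
# sitemap, product_type, page_type, submitted, indexed
df = pd.read_csv("sitemap_report.csv")

# Sum before dividing; averaging per-sitemap percentages
# would skew the results (see Don't Average CTR below).
for dimension in ("product_type", "page_type"):
    totals = df.groupby(dimension)[["submitted", "indexed"]].sum()
    totals["indexation_rate"] = totals["indexed"] / totals["submitted"]
    print(totals)
```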

You can get pretty granular if you want, though each sitemap index can only hold 50,000 sitemaps. Then again, you're not limited to just one sitemap index either!

In addition, you don't need 50,000 URLs to use a sitemap index. Each sitemap could contain a small number of URLs, so don't pass on this type of optimization thinking it's just for big sites.

Connecting the Dots

Knowing the indexation rate for each 'type' of content gives you an interesting view into what Google thinks of specific pages and content. The two other pieces of the puzzle are what happens before (crawl) and after (traffic). Both of these can be solved.

Crawl tracking can be done by mining weblogs for Googlebot (and Bingbot) using the same sitemap criteria. So, not only do I know how much bots are crawling each day, I know where they're crawling. As you make SEO changes, you are then able to see how they impact the crawl and follow it through to indexation.
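As a sketch, assuming combined-format access logs and the same first-path-segment grouping as the hypothetical sitemaps above, the tally might look like this:

```python
import re
from collections import Counter

# Combined log format: pull out the request path and user agent.
LINE_RE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+".*?"(?P<agent>[^"]*)"$')

crawl_counts = Counter()
with open("access.log") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        agent = match.group("agent")
        if "Googlebot" in agent or "bingbot" in agent:
            # Hypothetical: the first path segment stands in for
            # the sitemap grouping (tv, digital-cameras, ...).
            section = match.group("path").split("/")[1] or "home"
            crawl_counts[section] += 1

for section, hits in crawl_counts.most_common():
    print(section, hits)
```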

The last step is mapping it to traffic. This can be done by creating Google Analytics Advanced Segments that match the sitemaps using regular expressions. (RegEx is your friend.) With that in place, you can track changes in the crawl to changes in indexation to changes in traffic. Nirvana!
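The segment definitions are just regular expressions that mirror how the sitemaps were split. It can help to test the patterns in Python before typing them into Google Analytics; the URL patterns below are hypothetical:

```python
import re

# Hypothetical landing page patterns matching the sitemap split.
segments = {
    "tv.category": re.compile(r"^/tv/(?:[^/]+/)?$"),
    "tv.product":  re.compile(r"^/tv/[^/]+/\d+\.html$"),
}

def segment_for(landing_page):
    for name, pattern in segments.items():
        if pattern.match(landing_page):
            return name
    return "other"

print(segment_for("/tv/led-tvs/"))          # tv.category
print(segment_for("/tv/led-tvs/123.html"))  # tv.product
```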

Go to the Moon

Doing this is often not an easy exercise and may, in fact, require a hard look at site architecture and URL naming conventions. That might not be a bad thing in some cases. And I have implemented this enough times to see the tremendous value it can bring to an organization.

I know I covered a lot of ground so please let me know if you have any questions.

2011 Predictions

December 31 2010 // Analytics + Marketing + SEO + Social Media + Technology + Web Design // 3 Comments

Okay, I actually don't have any precognitive ability but I might as well have some fun while predicting events in 2011. Let's look into the crystal ball.

2011 Search Internet Technology Predictions

Facebook becomes a search engine

The Open Graph is just another type of index. Instead of crawling the web like Google, Facebook lets users do it for them. Facebook is creating a massive graph of data and at some point they'll go all Klingon on Google and uncloak with several birds of prey surrounding search. Game on.

Google buys Foursquare

Unless you've been under a rock for the last 6 months, it's clear that Google wants to own local. They're dedicating a ton of resources to Places and decided that getting citations from others was nice but generating your own reviews would be better. With location based services just catching on with the mainstream, Google will overpay for Foursquare and bring check-ins to the masses.

UX becomes more experiential

Technology (CSS3, Compass, HTML5, jQuery, Flash, AJAX and various noSQL databases to name a few) transforms how users experience the web. Sites that allow users to seamlessly understand applications through interactions will be enormously successful.

Google introduces more SEO tools

Google Webmaster Tools continues to launch tools that will help people understand their search engine optimization efforts. Just like they did with Analytics, Google will work hard in 2011 to commoditize SEO tools.

Identity becomes important

As the traditional link graph becomes increasingly obsolete, Google seeks to leverage social mentions and links. But to do so (in any major way) without opening a whole new front of spam, they'll work on defining reputation. This will inevitably lead them to identity and the possible acquisition of Rapleaf.

Internet congestion increases

Internet congestion will increase as more and more data is pushed through the pipe. Apps and browser add-ons that attempt to determine the current congestion will become popular and the Internati will embrace this as their version of Greening the web. (Look for a Robert Scoble PSA soon.)

Micropayments battle paywalls

As the appetite for news and digital content continues to swell, a start-up will pitch publications on a micropayment solution (pay per pageview perhaps) as an alternative to subscription paywalls. The start-up may be new or may be one with a large installed user base that hasn't solved revenue. Or maybe someone like Tynt? I'm crossing my fingers that it's whoever winds up with Delicious.

Gaming jumps the shark

This is probably more of a hope than a real prediction. I'd love to see people dedicate more time to something (anything!) other than the 'push-button-receive-pellet' games. I'm hopeful that people do finally burn out, that the part of the cortex that responds to this type of gratification finally becomes inured to this activity.

Curation is king

The old saw is content is king. But in 2011 curation will be king. Whether it's something like Fever, my6sense or Blekko, the idea of transforming noise into signal (via algorithm and/or human editing) will be in high demand, as will different ways to present that signal such as Flipboard and Paper.li.

Retargeting wins

What people do will outweigh what people say as retargeting is both more effective for advertisers and more relevant for consumers. Privacy advocates will howl and ally themselves with the government. This action will backfire as the idea of government oversight is more distasteful than that of corporations.

Github becomes self aware

Seriously, have you looked at what is going on at Github? There's a lot of amazing work being done. So much so that Github will assemble itself Voltron style and become a benevolently self-aware organism that will be our digital sentry protecting us from Skynet.

Google Split Testing Tool

December 23 2010 // Analytics + SEO // Comment

In November Matt Cutts asked 'What would you do if you were CEO of Google?' He was essentially asking readers for a wish list of big ideas. I submitted a few but actually forgot what would be at the top of my list.

Google Christmas

Google A/B Testing

Google does bucket testing all the time. Bucket testing is just another (funnier) word for split testing or A/B testing.

A/B testing, split testing or bucket testing is a method of marketing testing by which a baseline control sample is compared to a variety of single-variable test samples in order to improve response rates. A classic direct mail tactic, this method has been recently adopted within the interactive space to test tactics such as banner ads, emails and landing pages.

Google provides this functionality through paid search via AdWords. Any reputable PPC marketer knows that copy testing is critical to the success of a paid search campaign.

SERP Split Testing Tool

Why not have split testing for SEO? I want to be able to test different versions of my Title and Meta Description for natural search. Does a call to action in my meta description increase click-through rate (CTR)? Does having my site or brand in my Title really make a difference?

As search marketers we know the value of copy testing. And Google should want this as well. Wouldn't a higher CTR (without an increase in pogosticking) be an indication of a better user experience? Over time, wouldn't iterative copy testing result in higher quality SERPs?

Google could even ride shotgun and learn more about user behavior. If you need a new buzzword to get it off the ground, try crowdsourced bucket testing on for size.

This new testing tool can live within Google Webmaster Tools, and Google should be able to limit the number of outside variables by ensuring the test is only served on one data cluster. For extra credit Google could even calculate the statistical significance of the results. Maybe you partner with (or purchase) someone like Optimizely to make it happen.

If this tool is on your Christmas list, please Tweet this post.

SEO Metrics Dashboard

December 20 2010 // Analytics + SEO // 6 Comments

There are plenty of SEO metrics staring you right in the face as the folks at SEOmoz recently pointed out.

SEO Metrics Dashboard

I'll quickly review the SEO metrics I've tracked and used for years. Combined they make a decent SEO metrics dashboard.

SEO Visits

Okay, turn in your SEO credentials if you're not tracking this. Google Analytics makes it easy with their built-in Non-paid Search Traffic default advanced segment.

Non-paid Search Traffic Segment

However, be careful to measure by the week when using this advanced segment. A longer time frame can often lead to sampling. You do not want to see this. It's the Google Analytics version of the Whammy.

Sampled Data Whammy

Alternatively, you can avoid the default advanced segment and instead navigate to All Traffic -> Search Engines (Non-Paid) or drill down under All Traffic Sources to Medium -> Organic. Beware, you still might run into the sampling whammy if you're looking at longer time frames.

SEO Landing Pages

In Google Analytics, use the drop-down menu to determine how many landing pages drove SEO traffic by week.

SEO Metrics

I'm less concerned with the actual pages than with simply knowing the raw number of pages that brought SEO traffic to the site in a given week.

SEO Keywords

Similarly, using the Google Analytics drop-down menu, you can determine how many keywords drove SEO traffic by week.

SEO Metrics

Again, the actual keywords are less important to me (at this point) than the weekly volume.

Indexed Pages

Each week I also capture the number of indexed pages. I used to do this using the site: operator but have been using Google Webmaster Tools for quite a while since it seems more accurate and stable.

If you go the Webmaster Tools route, make certain that you have your sitemap(s) submitted correctly since duplicate sitemaps can often lead to inflated indexation numbers.

Calculated Fields

With those four pieces of data I create five calculated metrics.

  • Visits/Keywords
  • Visits/Landing Pages
  • Keywords/Landing Pages
  • Visits/Indexed Pages
  • Landing Pages/Indexed Pages

These calculated metrics are where I find the most benefit. While I do track them separately, analysis can only be performed by looking at how these metrics interact with each other. Let me say it again, do not look at these metrics in isolation.
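Here's the raw tab as a code sketch, with one week of made-up numbers:

```python
# Made-up weekly inputs for illustration.
week = {"visits": 12480, "keywords": 3200,
        "landing_pages": 1850, "indexed_pages": 9400}

metrics = {
    "Visits/Keywords":        week["visits"] / week["keywords"],
    "Visits/Landing Pages":   week["visits"] / week["landing_pages"],
    "Keywords/Landing Pages": week["keywords"] / week["landing_pages"],
    "Visits/Indexed Pages":   week["visits"] / week["indexed_pages"],
    "Landing Pages/Indexed Pages":
        week["landing_pages"] / week["indexed_pages"],
}

for name, value in metrics.items():
    print(f"{name}: {value:.2f}")
```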

SEO Metrics

Inevitably I get asked, is such-and-such a number a good Visits/Landing Pages number? The thing is, there are no good or bad numbers (within reason). The idea is to measure (and improve) the performance of these metrics over time and to use them to diagnose changes in SEO traffic.

Visits/Keywords

This metric can often provide insight into how well you're ranking. When it goes up, your overall rank may be rising. However, it could also be influenced by seasonal search volume. For example, if you were analyzing a site that provided tax advice, I'd guess that the Visits/Keywords metric would go up during April due to the increased volume for tax terms.

Remember, these metrics are high level indicators. They're a warning system. When one of the indicators changes, you investigate to determine the reason the metric changed. Did you get more visits or did you receive the same traffic from fewer keywords? Find out and then act accordingly.

Visits/Landing Pages

The Visits/Landing Pages metric usually tells me how effective an average page is at attracting SEO traffic. Again, look under the covers before you make any hasty decisions. An increase in this metric could be the product of fewer landing pages. That could be a bad sign, not a good one.

In particular, look at how Visits/Keywords and Visits/Landing Pages interact.

Keywords/Landing Pages

I use this metric to track keyword clustering. This is particularly nice if you're launching a new set of content. Once published and indexed you often see the Keywords/Landing Pages metric go down. New pages may not attract a lot of traffic immediately and the ones that do often only bring in traffic from a select keyword.

However, as these pages mature they begin to bring in more traffic; first from just a select group of keywords and then (if things are going well) you'll find they begin to bring in traffic from a larger group of keywords. That is keyword clustering and it's one of the ways I forecast SEO traffic.

Visits/Indexed Pages

I like to track this metric as a general SEO health metric. It tells me about SEO efficiency. Again, there is no real right or wrong number here. A site with fewer pages, but ranking well for a high volume term may have a very high Visits/Indexed Pages metric. A site with a lot of pages (which is where I do most of my work) may be working the long-tail and will have a lower Visits/Indexed Pages number.

The idea is to track and monitor the metric over time. If you're launching a whole new category for an eCommerce site, those pages may get indexed quickly but not generate the requisite visits right off the bat. Whether the Visits/Indexed Pages metric bounces back as those new pages mature is what I focus on.

Landing Pages/Indexed Pages

This metric gives you an idea of what percentage of your indexed pages are driving traffic each week. This is another efficiency metric. Sometimes this leads me to investigate which pages are working and which aren't. Is there a crawl issue? Is there an architecture issue? It can often lead to larger discussions about what a site is focused on and where it should dedicate resources.

Measure Percentage Change

Once you plug in all of these numbers and generate the calculated metrics you might look at the numbers and think they're not moving much. Indeed, from a raw number perspective they sometimes don't move that much. That's why you must look at it by percentage change.

SEO Metrics by Percentage Change

For instance, for a large site moving the Visits/Keywords metric from 3.2 to 3.9 may not look like a lot. But it's actually a 22% increase! And when your SEO traffic changes you can immediately look at the percentage change numbers to see which metric moved the most.
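The percentage change itself is just (current - previous) / previous. A quick sketch using the Visits/Keywords example above:

```python
def pct_change(previous, current):
    return (current - previous) / previous * 100

# 3.2 -> 3.9 looks small in raw terms but is a 22% jump.
print(f"{pct_change(3.2, 3.9):.0f}%")  # 22%
```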

To easily measure the percentage change I recommend creating another tab in your spreadsheet and making that your percentage change view. So you wind up having a raw number tab and a percentage change tab.

SEO Metrics Analysis

I'm going to do a quick analysis looking back at some of this historical data. In particular I'm going to look at the SEO traffic increase between 3/23/08 and 3/30/08.

SEO Metric Analysis

That's a healthy jump in SEO traffic. Let there be much rejoicing! To quickly find out what exactly drove that increase I'll switch to the percentage change view of these metrics.

SEO Metrics Analysis

In this view you quickly see that the 33% increase in SEO traffic was driven almost exclusively by a 28% increase in Keywords. This was an instance where keyword clustering took effect and pages began receiving traffic for more (related) query terms. Look closely and you'll notice that this increase occurred despite a decrease of 2% in number of Landing Pages.

Of course the next step would be to determine if certain pages or keyword modifiers were most responsible for this increase. Find the pattern and you have a shot at repeating it.

Graph Your SEO Metrics

If you're more visual in nature, create a third tab and generate a graph for each metric. Put them all on the same page so you can see them together. This comprehensive trend view can often bring issues to the surface quickly. Plus ... it just looks cool.

Add a Filter

If you're feeling up to it you can create the same dashboard based on a filter. The most common filter would be conversion. To do so you build an Advanced Segment in Google Analytics that looks for any SEO traffic with a conversion. Apply that segment, repeat the Visits, Landing Pages and Keywords numbers and then generate new calculated metrics.

At that point you're looking at these metrics through a performance filter.

The End is the Beginning

Circular Google Logo

This SEO metrics dashboard is just the tip of the iceberg. Creating detailed crawl and traffic reports will be necessary. But if you start with the metrics outlined above, they should lead you to the right reports. Because the questions they'll raise can only be answered by doing more due diligence.

Bounce Rate vs Exit Rate

November 15 2010 // Analytics + SEO // 19 Comments

One of the most common Google Analytics questions I get is to explain the difference between bounce rate and exit rate. Here's what I hope is a simple explanation.

Bounce Rate

Bounce Rate

Bounce rate is the percentage of people who landed on a page and immediately left. Bounces are always one page sessions.

High bounce rates are often bad, but it's really a matter of context. Some queries may inherently generate high bounce rates. Specific informational queries (e.g. - What are the flavors of Otter Pops?) might yield high bounce rates. If the page fulfills the query intent, there may be no further reason for the user to engage. It doesn't mean it was a bad experience, it just means they got exactly what they wanted and nothing more. (I was always partial to Louie-Bloo Raspberry or Alexander the Grape.)

A high bounce rate on a home page is usually a sign that something is wrong. But again, make sure you take a close look at the sources and keywords that are driving traffic. You might have a very low bounce rate for some keywords and very high for others. Maybe you're getting a lot of StumbleUpon traffic which, by its very nature, has a high bounce rate.

Bounce rate is important but always make sure you look beyond the actual number.

Exit Rate

Exit Rate

Exit rate is the percentage of people who left your site from that page. Visitors who exit may have viewed more than one page in their session. That means they may not have landed on that page, but simply found their way to it through site navigation.

Like bounce rates, high exit rates can often reveal problem areas on your site. But the same type of caution needs to be applied. If you have a paginated article - say four pages - and the exit rate on the last page is high, is that really a bad thing? They've reached the end of the article. It may be natural for them to leave at that point.

Of course, you'll want to try different UX treatments for surfacing related articles or encouraging social interactions to reduce the exit rate, but the fact that it was high to begin with shouldn't create panic.

Exit rate should be looked at within a relative navigation context. Pages that should naturally create further clicks, but don't, are ripe for optimization.
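If a worked example helps, here's a toy sketch that computes both rates for the same page from a handful of made-up sessions (each session is the ordered list of pages viewed):

```python
# Made-up sessions for illustration.
sessions = [
    ["/article"],                        # landed and left: a bounce
    ["/article", "/about"],              # landed on /article, exited elsewhere
    ["/home", "/article"],               # exited from /article, not a bounce
    ["/home", "/article", "/pricing"],
    ["/home", "/article", "/contact"],
]

page = "/article"

entrances = sum(1 for s in sessions if s[0] == page)
bounces   = sum(1 for s in sessions if s == [page])
exits     = sum(1 for s in sessions if s[-1] == page)
pageviews = sum(s.count(page) for s in sessions)

print(f"Bounce rate: {bounces / entrances:.0%}")  # 1 of 2 entrances -> 50%
print(f"Exit rate:   {exits / pageviews:.0%}")    # 2 of 5 pageviews -> 40%
```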

(Extra points if you get my visual 'bounce' reference.)

But There's More! I've developed the Ultimate Guide to Bounce Rate to answer all of your bounce rate questions. This straightforward guide features Ron Paul, The Rolling Stones and Nyan Cat. You're sure to learn something and be entertained at the same time.

How To Get 100 Likes From 2 People

November 08 2010 // Analytics + Social Media // 9 Comments

The other day I wrote about the potential for inflated Like numbers. In particular, I was interested in how comments were factored into the Like total. It was pretty clear that Likes and comments were not mutually exclusive. But were comments a count of unique contributors or simply a total count of comments?

The Like Experiment

So, I ran a small experiment using an old satirical blog post: LOLCats and Religion: A Dissertation.

This post originally had two shares but no Likes or comments. So I went ahead and Liked it and asked my colleague Jeremy Post to have a comment dialog on the item. In all, we generated 10 comments.

Facebook Comments

One of my concerns was that comments might not always relate to the item and interestingly enough we actually did switch topics during the dialog from LOLCats to Dune. Go figure. (Note to self - fix image being attributed to blog posts.)

The Like Results

So what was the result? How many Likes did this old post rack up due to this comment stream? Sure enough, every comment is counted as a Like.

Facebook Like Numbers

A quick check using my Facebook Like Number Bookmarklet reveals how the number is calculated.

Facebook Like Count

So, did 13 others like this? No. The two original shares, my one Like and our ten comments add up to 13. It's just two people having a conversation on a shared item. And that's how you could get ...

100 Likes from 2 People on 1 Item

Don’t Average CTR

November 08 2010 // Analytics + PPC + Rant + SEO // 2 Comments

One of the biggest errors I see (consistently) in SEO and PPC analysis is using Excel's AVERAGE function on Click Through Rate (CTR). As I mentioned in my SEO Pivot Tables post, do not do this. Here's why averaging CTR is dangerous.

Take the following set of 10 data points.

Don't Average Click Through Rate

If you SUM all of the Impressions and Clicks and then do the CTR calculation you arrive at 10.05%. If you AVERAGE the 10 CTR percentages you arrive at 6.14%.
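Here's the trap in miniature, with Python standing in for Excel. The numbers are made up, but the shape of the problem is the same: a low-impression row with a huge CTR drags the simple average away from reality.

```python
# (impressions, clicks) per keyword -- made-up data.
rows = [(10000, 900), (9000, 850), (50, 25)]

# Excel's AVERAGE: the mean of the row-level CTR percentages.
ctrs = [clicks / imps for imps, clicks in rows]
naive = sum(ctrs) / len(ctrs)

# The right way: SUM the clicks, SUM the impressions, then divide.
true = sum(c for _, c in rows) / sum(i for i, _ in rows)

print(f"AVERAGE of CTRs: {naive:.2%}")  # 22.81%
print(f"True CTR:        {true:.2%}")   # 9.32%
```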

If I change the Clicks for these 10 data points I can produce the opposite effect.

Don't Average CTR

And will you look at that, the average CTR is the same in both instances. Can you see how misleading average can be here?

Don't Average Click Through Rate

For years, I've used a structured Excel quiz in my hiring process that tests just this issue. In my experience upwards of 50% of applicants fail the quiz. If you're pulling down data into Excel for PPC or SEO, make sure you don't fall into this trap.

Facebook Like Number Bookmarklets

November 05 2010 // Analytics + SEO + Social Media // 2 Comments

Want to know the Facebook Like statistics for the page you're on? No problem.

Facebook Like Number Bookmarklets

Using the old REST API you can find out the Facebook Like statistics for any page. For easy access, simply drag these two links to your bookmark bar.

FB Stats: Current Page

FB Stats: Home Page

The Current Page bookmarklet will provide Like statistics for the page you're on. So, if you were on the ReadWriteWeb article about Facebook Places Deals you can click on this bookmarklet and be provided with the Like statistics for that page.

Facebook Like Bookmarklet

The Home Page bookmarklet will provide Like statistics for the home page for the site you're on. Please note that this is not showing the aggregate Like statistics for the entire site, but just that of the home page.
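For the curious, the bookmarklets were thin wrappers around the old links.getStats REST call. That endpoint has long since been retired, and the parameters below are from memory, so treat this purely as a historical sketch:

```python
import json
import urllib.parse
import urllib.request

def like_stats(page_url):
    # Sketch of the retired links.getStats REST call; this
    # endpoint no longer responds, so it's illustration only.
    api = ("https://api.facebook.com/restserver.php"
           "?method=links.getStats&format=json&urls="
           + urllib.parse.quote(page_url, safe=""))
    with urllib.request.urlopen(api) as response:
        return json.load(response)

# Placeholder URL; the bookmarklets used the current page's URL.
print(like_stats("http://www.example.com/"))
```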

Like Number Use Cases

Why are these bookmarklets useful apart from idle curiosity?

First off, you can determine the true number of Likes. Second, they provide competitive intelligence and potential insight into Facebook's search algorithm (aka Facebook SEO). Do pages with a higher distribution of comments get a higher weight? I'm not sure.

This is one way to begin understanding the ways in which pages enter the Open Graph and how they are treated based on Like activity.

Impact of Google Instant

October 01 2010 // Analytics + SEO // 2 Comments

Everyone wants to know how Google Instant is changing search.

There's been some great analysis on whether Google Instant has changed keyword length. While autocomplete could certainly have an impact on query behavior, the impact so far seems to be negligible.

I've been more interested in whether Google Instant would change assessment behavior. I theorized that Google Instant might result in more clicks above the fold because users would become focused on watching - and assessing - search results as they typed.

Google Instant Rank Analysis

Each week I measure the amount of traffic produced by each rank via Google Analytics. Using this data across two large sites from different verticals, I'm able to compare traffic by rank the week prior to Google Instant's launch versus the most recent week.

Google Instant Traffic by Rank

The distribution of traffic by rank certainly seems different. But there's a good deal of noise in pulling this data.

First, while the volume of searches is high the number of data sources is low.

Second, the number of terms driving traffic at each rank and the query volume for those terms may have changed. (However, a quick analysis shows that the number of terms doesn't have a bearing on the data.)

Lastly, the Google Analytics rank hack only captures a certain percentage of traffic where the 'cd' parameter is passed. Historically that was about 20% to 25% of search results. However, just prior to the launch of Google Instant that percentage shot up to ~40%. The subsequent weeks of decline in 'ranked' traffic don't map to overall traffic patterns, so I believe the amount of traffic with a 'cd' parameter has likely decreased.
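For context, the rank hack works by reading the 'cd' parameter out of the Google referrer. Here's a sketch of just that extraction step (not the full Google Analytics filter setup):

```python
from urllib.parse import urlparse, parse_qs

def rank_from_referrer(referrer):
    # When Google passes it, the cd parameter is the position
    # of the clicked result in the SERP.
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    values = parse_qs(parsed.query).get("cd")
    return int(values[0]) if values else None

ref = "http://www.google.com/search?q=otter+pops&cd=3"
print(rank_from_referrer(ref))  # 3
```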

Long story short, the variation by rank is the signal, not the actual changes in rank.

Google Instant Click Distribution

Has Google Instant changed the distribution of clicks by rank? If above the fold ranks are getting a higher distribution of clicks, SEO is not dead - it's become more important than ever.

But what about the odd behavior of those with a rank of 1 or 2? Could paid search or Onebox presentations be sucking away traffic from the first and second positions?

Google Webmaster Tools should be able to provide some additional insight. Yet, as I was performing the analysis I came to realize that the search queries report in Google Webmaster Tools has less coverage (5% to 15%) than the Google Analytics rank hack (20% to 40%).

Furthermore, the coverage in Google Webmaster Tools is weighted by rank, with (far) higher visibility for higher ranks. I'm not sure analysis on lower ranks is reliable given the thin data set. However, I do see appreciable declines in CTR for both the first and second positions. This seems to support the data gleaned from Google Analytics.

Above The Fold SEO

If the distribution of clicks is changing, being above the fold could become increasingly important. Earlier this year Jakob Nielsen conducted a study that showed that users spend 80% of their time and attention above the fold.

Eye Tracking Page Distribution

Let's be clear, the prerequisite is that the user scrolls. Does Google Instant disrupt the natural inclination to scroll? I say yes, and I think the preliminary data points in that direction.

Eye tracking studies have already shown the difference in assessing informational versus transactional queries. I'd like to see these studies performed using Google Instant to determine if those patterns have changed.

Barring that, I'd like to expand my data set and appeal to others who have Google Analytics rank data to perform the same analysis. Do it yourself and post the results or send me the data and I'll aggregate it with my current data set.

Either way, I believe it's important to understand how search behavior may be changing and adapt accordingly.

Google Analytics Default Profile

September 24 2010 // Analytics + SEO // Comment

If you've used Google Analytics for any stretch of time, you probably have a number of different Google Analytics profiles for your website. For instance, you should be tracking keyword rank in a Google Analytics profile.

Different profiles can be handy but often the one you use the most isn't the default profile. Each time you log in to Google Analytics it defaults to a profile based on an alphabetic sort. This is annoying and, sadly, Google hasn't launched a new 'select-as-default-profile' feature. Instead, there's a very simple and easy hack.

Google Analytics Default Profile Hack

First, click on Analytics Settings in your Google Analytics account.

Google Analytics Settings

From there, find the profile you want to be the default profile. Next to that profile, on the far right under the Actions column you should see an Edit option.

Edit or Delete Profile

If you don't see these actions, you don't have the rights to make this change. Find someone with Administrator access or have them grant you that access. If you do see these actions, click Edit. Then you'll want to edit the main website profile information. The Edit link is located at the upper right side.

Edit Google Analytics Profile

Now all you have to do is type an underscore at the beginning of your profile name. The example uses a domain entry, but it could just as easily be something like _My Website Name.

Underscore Your Profile

Then click Save and you're done. Don't worry, this will NOT impact the tracking on this profile. No data or history will be lost.

The underscore profile will now be the default profile since it's first alphabetically. It's certainly not the only way to do this, but this 1 minute hack can make your daily use of Google Analytics just a bit easier.