This article was originally published on Search Engine Land.
When we optimize URLs for high rankings, we pay little attention to optimizing them for maximum clickthrough. Yet the URL undeniably affects searcher clickthrough rates in the SERPs (Search Engine Results Pages), as demonstrated by MarketingSherpa in their eyetracking study published in the 2008 Search Marketing Benchmark Guide.
Specifically, MarketingSherpa found that short URLs get clicked on twice as often as long URLs (when ranking position is held equal). As you can see in the heatmaps below, experiment participants spent more time viewing the long URL, but less time viewing the listing as a whole. You could conclude from this that the long URL distracts the searcher from viewing the listing's title and description. Not a great outcome.
Caption: MarketingSherpa eyetracking heatmaps showing the impact of URL length on listing viewing. Used with permission.
Worse yet, a long URL appears to act as a deterrent to clicking, drawing attention away from the rest of its listing and directing it to the listing below, which then gets clicked 2.5x more frequently. It's open for debate, of course, what counts as a "short" URL or a "long" URL. But it's the first data I've ever seen that attempts to quantify the affinity searchers have for the URL component of natural search listings.
For us, these MarketingSherpa findings confirm that success at SEO still requires more than just Google Sitemaps, and that an unoptimized URL is money left on the table. Algorithms have evolved to handle dynamic URLs with multiple parameters, avoid session-based spider traps, and even fill out forms on occasion, but that shouldn't lull us into a false sense of security that our URLs are "good enough" and don't need work. You should be on an unending mission to find and act on opportunities to test and optimize URLs for both rankings and clickthrough.
So even though URLs you'd never have dreamed of getting indexed a few years ago now regularly make it into the index, that doesn't mean suboptimal URLs will rank well or convert searchers into clickers. Here at Netconcepts, we've conducted countless tests using our GravityStream platform, proving to ourselves and to our clients that optimized URLs consistently outperform unoptimized ones. Given that, here are some general best practices for URLs that we believe hold true (a quick audit sketch follows the list):
- The fewer the parameters in your dynamic URL, the better. One or two parameters is much better than seven or eight. Avoid superfluous/nonessential parameters like tracking codes.
- A static-looking URL (containing no ampersands, equals signs, or question marks) is more search-optimal than a dynamic one.
- Having keywords in the URL is more optimal than no keywords.
- A keyword in the filename portion of the URL is more beneficial than in a directory/subdirectory name.
- Hyphens are the preferred word separator, although underscores have gained more acceptance than in times past. So if you have multiple-word keyword phrases in your URLs, I'd recommend using hyphens to separate the words.
- Stuffing too many keywords into the URL looks spammy. Three, four, or five words in a URL looks perfectly normal; much longer than that and it starts to look worse to Google, according to Matt Cutts.
- The domain name is not a good place for multiple hyphens, as they can make your URL look spammy. That said, sometimes a domain name really should have a hyphen, as the domain faux pas "arsecommerce.com" demonstrates (you may not get the joke if you don't recognize the Queen's English!).
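To make these rules concrete, here is a minimal sketch of how you might audit a URL against them. The audit_url helper, its thresholds, and the example URL are my own illustrations, not an industry standard:

```python
import re
from urllib.parse import urlparse, parse_qs

def audit_url(url):
    """Flag departures from the best practices listed above."""
    warnings = []
    parsed = urlparse(url)
    if len(parse_qs(parsed.query)) > 2:
        warnings.append("more than two query parameters")
    if "_" in parsed.path:
        warnings.append("underscores used as word separators")
    path = re.sub(r"\.\w+$", "", parsed.path)        # drop any file extension
    words = [w for w in re.split(r"[-/]", path) if w]
    if len(words) > 5:
        warnings.append("more than five words in the path (may look spammy)")
    if parsed.hostname and parsed.hostname.count("-") > 1:
        warnings.append("multiple hyphens in the domain name")
    return warnings

# Hypothetical dynamic URL with three tracking-style parameters:
print(audit_url("http://www.example-store.com/webapp/Navigation"
                "?storeId=10051&langId=-1&catalogId=10053"))
# ['more than two query parameters']
```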
Given the above, it's absolutely worthwhile to rewrite your dynamic URLs to make them appear static and to include keyword phrases with hyphens separating the words (done within reason). So a targeted search term of "blue widgets" would be represented as "blue-widgets" in the URL. Bare spaces cannot be used in URLs, so if you want a "white space" character you're limited to either the + (plus sign) or the character encoding for a space, %20. I'm not a fan of the character-encoded version, as it's not quite as pretty: "blue%20widgets".
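In code, that transformation is a small "slugify" routine. Here's a minimal sketch; the slugify name and the regex are my own, not taken from any particular platform:

```python
import re

def slugify(phrase):
    """Turn a keyword phrase like 'Blue Widgets' into 'blue-widgets'."""
    # Collapse each run of non-alphanumeric characters into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", phrase.lower())
    return slug.strip("-")  # trim any stray leading/trailing hyphens

print(slugify("Blue Widgets"))          # blue-widgets
print(slugify("Blue & White Widgets"))  # blue-white-widgets
```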
The above best practices are generally accepted. It gets a lot more contentious when talking about stability/permanence of your URLs. The general school of thought is that stable is better. In other words, decide on an optimal URL for a page and stick with it for the long haul. We have a different view: URLs can be as fluid as a title tag.
In our view, URLs can be experimented with and optimized iteratively over time, just like any other on-page factor. Why would you "set it and forget it" with your URLs when you don't do that with your titles, H1 headlines, page copy, and internal linking structure? For example, all of the following hypothetical URLs follow best practices, with the exception of the first one, of course, which is the actual live URL. Now which one will perform the best?
- http://www.homedepot.com/webapp/wcs/stores/servlet/Navigation?storeId=10051&N=10000003+90401+525285&langId=-1&catalogId=10053&Ntk=AllProps&cm_sp=Navigation-_-GlobalHeader-_-TopNav-_-Appliances-_-Dehumidifiers
- http://www.homedepot.com/webapp/stores/50364/100053.html
- http://www.homedepot.com/Appliances/Dehumidifiers/
- http://www.homedepot.com/Appliances/Dehumidifiers.html
- http://www.homedepot.com/Appliances-Dehumidifiers.html
- http://www.homedepot.com/Dehumidifiers-Appliances.html
- http://appliances.homedepot.com/Dehumidifiers.html
The only way to know for sure is to test.
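What might such a test look like? One approach: run each URL variant for a comparable period at a comparable ranking position, pull impressions and clicks from your analytics, and check whether the clickthrough difference is statistically meaningful. Here's a rough sketch using a standard two-proportion z-test; the click and impression numbers are entirely hypothetical:

```python
import math

def ctr_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    """Two-proportion z-test: is variant B's CTR significantly different from A's?"""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    # Pooled proportion under the null hypothesis of no CTR difference.
    p = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = math.sqrt(p * (1 - p) * (1 / impressions_a + 1 / impressions_b))
    return p_a, p_b, (p_b - p_a) / se

p_a, p_b, z = ctr_z_test(clicks_a=120, impressions_a=10_000,
                         clicks_b=168, impressions_b=10_000)
print(f"CTR A: {p_a:.2%}  CTR B: {p_b:.2%}  z = {z:.2f}")
# |z| > 1.96 means the difference is significant at the 95% confidence level.
```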
If your CMS or ecommerce platform supports malleable URLs, then why not exploit that capability and embark on a regimen of testing and continuous improvement? WordPress supports this fairly well: once the "post slug" for a post has been changed in the admin, it automatically 301 redirects requests for the old permalink URL. Unfortunately, most ecommerce platforms do not support such a capability. When you're stymied by your platform, the only options are to replace your CMS with one that supports malleable URLs, customize the CMS to support them (assuming you have access to the source code), or put a layer on top of your CMS by using an SEO proxy technology like GravityStream.
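If you're building that layer yourself, the core behavior to replicate is the WordPress one: remember the old slug and 301 it to the new one, so link equity consolidates at the current URL. A minimal sketch using Python's standard library; the OLD_SLUGS map and the paths in it are hypothetical:

```python
from wsgiref.simple_server import make_server

# Hypothetical map of retired slugs to their current replacements.
OLD_SLUGS = {
    "/widgets-blue": "/blue-widgets",
    "/Appliances-Dehumidifiers.html": "/Appliances/Dehumidifiers/",
}

def app(environ, start_response):
    path = environ["PATH_INFO"]
    if path in OLD_SLUGS:
        # 301 tells engines the page has moved permanently.
        start_response("301 Moved Permanently",
                       [("Location", OLD_SLUGS[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [path.encode()]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```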
Regardless of how you accomplish continuous URL optimization, the MarketingSherpa study shows that complacency about iterative testing and improvement of your URLs (or any other on-page factor, for that matter) sends more traffic to your competitors' listings. That is fatal to your natural search program.