Usability Archives - Stephan Spencer

The “Hidden” Text of a Title Tag: An Exploit for Search Engine Spammers?

The most important on-page factor for SEO is the title tag — that bit of text between the <title></title> tags. There are already some outstanding articles out there on how to craft successful title tags — specifically, Netconcepts’ own Brian Brown has a two-part guide on successful title tag strategies (part 1 and part 2) which just came out recently, and I posted some quick title tag tips a while back.

You may have your title-building strategy down, but have you given serious consideration to title length restrictions? Titles look different to spiders than they do to humans. SERPs only display a maximum of 65 title characters, so that’s all visitors will see, but search spiders record up to 120 characters or more. Some initial tests on lengthy Amazon.com titles reveal that some engines count keywords long past the 120-character mark, opening up an unfortunate opportunity for exploitation by spammers.

The people who find you on any SERP (search engine result page) will make their click decision based heavily on the first 65 characters of your title. Though the rest of the snippet (oftentimes taken from your meta description tag) will also play a significant role, the title is the most influential piece of your search listing. Oddly, the title is ephemeral from the visitor’s perspective; once on your page, most visitors will quickly forget about the title, which is relegated to the top border of the browser. Search spiders, on the other hand, have traditionally treated 120 characters as the maximum they will index.

If spammers are not yet taking advantage of these differing limitations by putting a normal, user-oriented title into the first 65 characters and tons of keyword spamglish into the remaining 55, then I’m sure they will be.
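
To make the two limits concrete, here’s a minimal sketch that splits a title the way a searcher sees it versus the way a spider records it. The 65- and 120-character cutoffs are the approximate figures discussed above, not official constants from any engine, and the sample title is hypothetical:

    const VISIBLE_LIMIT = 65;  // roughly what the SERPs display
    const INDEX_LIMIT = 120;   // the traditional spider indexing cap

    function auditTitle(title: string): void {
      const visible = title.slice(0, VISIBLE_LIMIT);
      const hidden = title.slice(VISIBLE_LIMIT, INDEX_LIMIT);
      const beyond = title.slice(INDEX_LIMIT);

      console.log(`Visible to searchers: "${visible}"`);
      if (hidden) console.log(`Indexed but unseen:   "${hidden}"`);
      if (beyond) console.log(`Past the 120 mark:    "${beyond}"`);
    }

    auditTitle(
      "Acme Table Lamps - Modern & Antique Lighting | cheap table lamps " +
      "discount lamps buy lamps online lamp store lamp deals best lamps"
    );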

This assumes that search engines give any credence to those last 55 characters, and that they truly stop indexing at 120 characters (which, at least in Google’s case, no longer seems to be true). The appeal for spammers is that searchers will never see that overflow in the SERPs, and that the title only matters to visitors before they click a link.

Or perhaps this “hidden” portion is completely discounted?

If not, I anticipate that search engines will adjust their algorithms to count only the characters viewable to searchers. At least we can hope that search engines will address the issue before it becomes common practice to keyword stuff the “hidden” segment of page titles.

Sitemaps: What are they good for?

Sitemaps of the XML variety and the site maps of the HTML variety: two great tastes that taste great together!

The XML sitemap is the one that you typically use an automated tool to generate (such as one of these) and upload to your root directory. Humans never see this document; it’s really just a simple list of all of your URLs. This XML file allows spiders to “discover” all of your pages and index them quickly. Having an XML sitemap isn’t necessary if you have a good internal hierarchical linking structure and spider-friendly URLs. But if you don’t, an XML sitemap is an easy band-aid. (Note that this doesn’t excuse you from fixing your URLs and linking structure!)
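
For the curious, here’s a minimal sketch of what such a generator tool boils down to: a plain list of URLs wrapped in the standard sitemaps.org XML format. The URLs are hypothetical:

    // Real generator tools walk your site and produce this same format automatically.
    const urls: string[] = [
      "http://www.example.com/",
      "http://www.example.com/products/",
      "http://www.example.com/products/table-lamps/",
    ];

    const sitemap =
      `<?xml version="1.0" encoding="UTF-8"?>\n` +
      `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
      urls.map((u) => `  <url><loc>${u}</loc></url>`).join("\n") +
      `\n</urlset>`;

    console.log(sitemap); // save the output as sitemap.xml in your root directory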

As for the HTML site map, it’s just as important as the XML sitemap, if not more so. The site map page is not just a simple linked list of all your pages but an easy way to navigate to your most important ones. Remember, your human readers WILL see the HTML site map.

If you are not a front-end developer and do not know the first thing about usability, fear not: you can still give your new and returning visitors an easy way to navigate your pages. Often your returning traffic is looking for a specific page or article, and if they can’t find it quickly, they will go elsewhere to find the information they’re looking for.

HTML site maps are also a great addition to your custom 404 error page. When someone mistypes a URL and gets a 404 instead of the page they were looking for, they can navigate to the one they want quickly and conveniently. Without a link to your HTML site map, users who hit a 404 can become frustrated, pressing the back button away from your site or just running another search on Google and likely landing on some other site.
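
To illustrate, here’s a minimal sketch using an Express-style server; the framework choice and the /site-map.html path are my assumptions for the example, not a prescription:

    import express from "express";

    const app = express();
    // ...your normal routes go here...

    // Catch-all 404 handler: instead of a dead end, point lost visitors
    // at the HTML site map so they can still find what they came for.
    app.use((req, res) => {
      res.status(404).send(
        "<h1>Sorry, we couldn't find that page.</h1>" +
        '<p>Try our <a href="/site-map.html">site map</a> to get where you ' +
        'were going, or start over from the <a href="/">home page</a>.</p>'
      );
    });

    app.listen(3000);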

Macaroni and Spam

With two natural listings in the top 10 on the Google SERPs for “dating,” it’s hard to argue with Match.com’s SEO tactics. It works well for them: a flashy front page with a novel of text below the fold. Since this has worked so well for Match.com and has been talked about on several popular blogs, it seems that others are following suit and using this same format.

I came across Patagonia.com recently and, lo and behold, I found a near replica of Match.com’s tactic: an image and a simple selection form. Scroll down a little, however, and we find keyword-stuffed gibberish text, and lots of it. This is disturbing because it feels lazy. Is this the future marriage of usability and SEO? It works, it is easy to duplicate, and one doesn’t even need to write good content to get decent results. The only thing this tactic requires is a bare-bones layout built on a foundation of spam.

[Screenshot: keyword stuffing on Patagonia.com]

My instinct tells me that this tactic will fall out of favor with Google in the near future as the spiders advance and learn how to detect it. Until then, however, I expect this trend to continue to grow as more and more snake-oil SEOs fall in line with what Match.com has made popular.

Web 2.0 Isn’t Friendly to the Search Engines

Two of the most popular Web 2.0 interactive elements, Asynchronous JavaScript and XML (AJAX) and Flash, might be great for customers and deliver a fresh experience on many sites, but they are inherently unfriendly to the major search engine spiders. In my article on Search Engine Land entitled “The Search Engine Unfriendliness Of Web 2.0,” I cover AJAX and Flash in detail to show you how to prevent these new technologies from harming your ability to get the most out of Web 2.0.

Here are a few quotes from the article that might help those of you who employ AJAX and Flash in your blogs or websites. This first quote covers a great tip about Flash:

Google isn’t likely to make big improvements on how it crawls, indexes and ranks Flash files anytime soon. So, it’s in your hands to either replace those Flash elements with a more accessible alternative like CSS/DHTML or to employ a Web design approach known as “progressive enhancement,” whereby designs are layered in a concatenated manner to provide an alternative experience for non-Flash users. This way, all users, including search engine spiders, will be able to access your content and functionality.

In this quote, I talk about how progressive enhancement offers an alternative for working with AJAX:

Here, progressive enhancement renders a non-JavaScript version of the AJAX application for spiders and JavaScript-incapable browsers. A low-tech alternative to progressive enhancement is to place an HTML version of your AJAX application within noscript tags (see TheCleanerMovie.com for an example).
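
Here’s a minimal sketch of what that enhancement step might look like; the element IDs and the /reviews.html URL are hypothetical, not from the article:

    // The server emits a plain <a href="/reviews.html"> link that works for
    // spiders and JavaScript-incapable browsers; this script upgrades it into
    // an in-page AJAX load only when JavaScript is actually available.
    document.addEventListener("DOMContentLoaded", () => {
      const link = document.querySelector<HTMLAnchorElement>("#reviews-link");
      const panel = document.querySelector<HTMLElement>("#reviews-panel");
      if (!link || !panel) return; // nothing to enhance on this page

      link.addEventListener("click", async (event) => {
        event.preventDefault();                  // JS users stay on the page...
        const response = await fetch(link.href); // ...and fetch the same HTML the spider crawls
        panel.innerHTML = await response.text();
      });
    });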

For more tips about how you can incorporate progressive enhancement, feel free to visit my article.

Tag your Blogs and Company Sites for Users and SEO

Tagging isn’t just a tool for usability (even though it’s typically mostly thought of in those terms), it’s also a powerful weapon for search engine optimization. That’s because tagging allows you to rejig your internal hierarchical linking structure, flowing the link juice more strategically throughout your site. And because those links are textual and keyword-rich, a tag cloud is far superior in terms of SEO to the traditional graphical navigation bar.
When tagging is applied to a website, such as a blog, it can significantly increase the site’s traffic by achieving visibility for a much larger array of search terms.

The above quote is from my recent Search Engine Land article entitled “Effective Tagging For Both Usability & SEO.” I go into a lot of detail about how strategic tagging can help you. Here is a tip about tag clouds that I’d like to share with you:

    Tag Clouds: When you tag your blog or website, the items are then put into an organized keyword catalog. By taking those tags, you can organize them into a “tag cloud,” which shows keyword topic popularity by the size and sometimes color of the font. Tag clouds enable you to offer a new navigation style for your site or blog based on keyword popularity, and also help your website look up-to-date with enhanced, Web 2.0 functionality. (For an example of a tag cloud, you can see one at the end of my blog.)
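
To make the sizing idea concrete, here’s a minimal sketch of the usual scaling approach; the tags, counts, and pixel range are hypothetical, and this isn’t code from the article:

    // A linear scale between a minimum and maximum font size, driven by how
    // often each tag is used on the site.
    const tagCounts: Record<string, number> = {
      seo: 42,
      usability: 28,
      blogging: 11,
      "web-design": 5,
    };

    function fontSizeFor(count: number, minPx = 11, maxPx = 28): number {
      const counts = Object.values(tagCounts);
      const lo = Math.min(...counts);
      const hi = Math.max(...counts);
      if (hi === lo) return minPx; // all tags equally popular
      return Math.round(minPx + ((count - lo) / (hi - lo)) * (maxPx - minPx));
    }

    for (const [tag, count] of Object.entries(tagCounts)) {
      console.log(`<a href="/tag/${tag}" style="font-size:${fontSizeFor(count)}px">${tag}</a>`);
    }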

For other, more specific tagging techniques, I hope you visit my article. :)

Printer-Friendly vs. Search Engine-Friendly

We’ve all experienced frustration trying to print out an important web page or form. Some web designers have felt our pain, creating duplicate pages that are “print-friendly.” Unfortunately, these duplicates aren’t great for SEO, as the search engines get confused trying to determine which version of your content to serve up to searchers in their results. There are other negative effects as well, depending upon the size of your site and how you’ve structured it. For example, in my article at CNET I highlight this scenario:

For example, let’s say that you have a Web site that has 1,000 pages, a small to moderate-size site, depending on your perspective. Now, because you’ve taken advantage of your CMS’ ability to automatically create a “print this” link on each page to a printer-friendly version, for all practical purposes, your site just doubled to 2,000 pages. But what if your PageRank isn’t high enough to warrant very rapid spidering? It could take a lot longer for all your pages to get indexed.
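
One common remedy, sketched below with hypothetical paths of my own rather than taken from the CNET article, is to keep the printer-friendly duplicates out of the index entirely by emitting a robots noindex meta tag on the print version only:

    // If the same article exists at /article/123 and /article/123/print,
    // marking the print version noindex keeps it out of the index while
    // still letting spiders follow its links.
    function pageHead(title: string, isPrintVersion: boolean): string {
      const robotsTag = isPrintVersion
        ? '<meta name="robots" content="noindex, follow">\n'
        : "";
      return `<head>\n<title>${title}</title>\n${robotsTag}</head>`;
    }

    console.log(pageHead("Table Lamps - Acme", true));  // print version: noindex
    console.log(pageHead("Table Lamps - Acme", false)); // normal version: indexable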

For more about this unique situation, and solutions on how to avoid potential duplicate content issues, read my blog post on CNET Searchlight.

Good Progressive Enhancement is Great for SEO

Progressive enhancement, a “sister” to graceful degradation, is often talked about in web design circles right along with other technical topics like JavaScript, Flash, Ajax, and basic applications of CSS. These technologies may not be easy to understand, but using them without a fallback results in poor visibility: blank pages that both search engines and some visitors encounter on your site, creating a poor experience. In my post on CNET I wrote:

Graceful degradation was a step to overcome this, where sites were designed for the latest browsers and technology, but were also made to degrade gracefully, hopefully delivering most of the content to the visitor, or at least informing the visitor that he or she may not be “getting” everything.

For more about progressive enhancement, graceful degradation, and how they relate to SEO, visit my CNET Searchlight blog.

Multi-touch displays – oooh I want one!

I am a diehard Mac fan, so you can bet that I watched Steve Jobs’ keynote from start to finish (Well, except for the part at the end where John Mayer came out on stage and sang. That didn’t do much for me.)

Wow, that iPhone was pretty darned cool. I can’t wait to get one. But there’s something else I want even more. Read on and I’ll tell you about it…

In his keynote, Steve talked about a revolutionary new technology in the iPhone called the multi-touch display. It allows you to stretch and shrink photos with your thumb and forefinger, flip through album covers with a swish of your finger, etc. Well, that wasn’t the first time I had heard of this technology. There was an amazing video circulating on YouTube last year in which New York University researcher Jeff Han showed off the latest in multi-touch displays. It could really revolutionize the way that we interface with our computers. I SOOO want one! Here is the video, have a look:

[Video: Jeff Han’s multi-touch display demo]

After you have watched that, you may also enjoy watching a video on Jeff’s website which shows off some more cool demonstrations.

Have you ever seen a computer user interface more intuitive?! I wonder how long it will take before we each have one of these on our desk…

Getting the balance right between SEO and usability

Finding the right balance between SEO and usability can sometimes be a challenge. The two strategies can conflict, and companies may mistakenly favor one over the other. For example, one company may choose to stuff the same keywords into every alt tag in their navigation graphics. That, of course, detracts from the user experience, making the page slower to load and more difficult to interpret for the visually impaired who rely on screen readers to read web pages to them.

Then there are others who try to maximize usability without any concern for SEO. They choose to “Googleize” their home page, stripping all non-essential elements out of the page and making it as simple and streamlined as Google’s home page. That, unfortunately, offers very little for the search engines to “sink their teeth into,” and consequently provides insufficient clues for the engines to identify appropriate keyword themes for your page.

Here’s another way to think of it: search engine spiders are another type of “disabled” visitor — one that can’t read what’s inside your images, fill out your web forms, or interact with the Flash, Java, JavaScript, AJAX etc. on your pages. Therefore, usability and accessibility of your content to spiders is a requirement if you want good search engine rankings. You kill two birds with one stone by optimizing your site’s usability.

In my estimation, usability should come first. An unusable website won’t generate an adequate ROI, even if it ranks well in the engines.

Then there are the sites that miss the mark on both counts: usability and SEO. Consider Nike.com, which just got picked apart for its SEO mistakes in an article on MarketingProfs this week. (I just blogged a quick summary of the article here.) I agree that Nike.com misses the mark with regard to search, and I also find the site severely lacking when it comes to usability and accessibility, IMHO.

I have an article called “Usability and Findability — Getting the Synergy Right” in this month’s issue of Intercom, the magazine of the Society for Technical Communication. If you’re an STC member, you should check it out.

Ecommerce Best Practice Tip #10 – Incorporate “breadcrumb navigation”

Breadcrumb navigation is wonderful for usability and for search engine optimization (SEO). Breadcrumb navigation, if you’re not familiar with the term, is text-based navigation that shows where in the site hierarchy the currently viewed web page is located. Not only does it give a sense of your location within the site, it provides shortcuts to instantly jump higher up the site hierarchy. A product page for a table lamp may have the breadcrumb navigation of “Home > Home Furnishings > Lighting > Table Lamps”. Below is a screenshot of an actual breadcrumb taken from a page within the website of one of our clients, TriTech:

[Screenshot: breadcrumb navigation on TriTech’s site]

Notice that the breadcrumb above contains text links with relevant keywords in the anchor text. This provides a significant SEO benefit. Let’s take the “Phone Systems” link in the above breadcrumb as an example. The search engines treat that single link as a “vote” for the Phone Systems category page. But more than that, the anchor text (“Phone Systems”) provides the search engines (Google, Yahoo and MSN Search) with an important contextual clue as to the topic of the linked page. That equates to improved rankings.
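
Here’s a minimal sketch of how a template might render such a trail as keyword-rich text links; the hierarchy and URLs are hypothetical, echoing the table lamp example above:

    interface Crumb {
      label: string; // keyword-rich anchor text
      url: string;
    }

    // Render every ancestor as a text link; the current page stays plain
    // text so it doesn't link to itself.
    function renderBreadcrumb(trail: Crumb[]): string {
      return trail
        .map((crumb, i) =>
          i === trail.length - 1
            ? crumb.label
            : `<a href="${crumb.url}">${crumb.label}</a>`
        )
        .join(" &gt; ");
    }

    console.log(renderBreadcrumb([
      { label: "Home Furnishings", url: "/home-furnishings/" },
      { label: "Lighting", url: "/home-furnishings/lighting/" },
      { label: "Table Lamps", url: "/home-furnishings/lighting/table-lamps/" },
    ]));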

Contrast that with the use of throwaway phrases like “click here” or “more info” in the anchor text. Such words provide no clues as to the topic of the linked page, for either the search engines or your users. When you use the phrase “click here,” you are telling the engines that the page to which you are linking is all about “click here”.

One throwaway phrase that’s used almost universally within breadcrumbs is “Home”. Try revising that link to something more keyword-rich. Take the “Home” link in TriTech’s breadcrumb above as an example. A more search-optimal version of the anchor text would include words like “Computing” or “IT” or “Technology,” along with perhaps “Store” or “Products”.

Now consider the amplifying effect of breadcrumb navigation. A link in the breadcrumb will be “voted for” (through internal links) more times if that linked page is higher up in the site hierarchy and if there are more pages underneath that page in the hierarchy. So, through breadcrumbs, a super-category page will receive more internal links than a sub-category page, and a category page for a category covering hundreds of products will receive more internal links than one with only a dozen products in the category.

Make a breadcrumb for the checkout too, to give shoppers a bird’s-eye view of the order process and an indication of how much farther they still have to go. Ideally, allow them to use the breadcrumb nav to jump around in the order process as well, such as to change billing or shipping information they had already supplied in a previous step. Here’s Air New Zealand’s breadcrumb from their booking engine (shrunk a bit to fit):

[Screenshot: Air New Zealand order-process breadcrumb]

Some sites take the visitor’s clickpath into account when building the breadcrumb, rather than relying totally on the absolute site hierarchy. For example, RitzCamera.com will display a different breadcrumb on a CompactFlash memory card product page if you navigated to it from the top-level category of “Memory” versus the top-level category of “Digital Cameras & Accessories”.

This can have implications for the site’s search engine friendliness. How? Well, the user’s breadcrumb trail needs to be passed along in some way, and it’s often put in the URL rather than in a cookie. If it’s in the URL, that will create multiple near-duplicate copies of pages for the search engines (where the only difference between the pages is a variation in the breadcrumb). The end result is PageRank dilution.

There are several potential workarounds (besides the obvious one of disregarding the user’s clickpath altogether). One is to drop this breadcrumb trail from the URLs of internal links selectively for search engine spiders, through user-agent detection. An alternative workaround is to append the parameter containing the breadcrumb trail to the end of the URL using JavaScript. An example of this technique is REI’s Shop by Brand pages, which append a vcat parameter upon clicking on any of the brand links. Either approach will minimize duplication and aggregate PageRank. Neither approach will eliminate the potential for websites deep-linking to you with the breadcrumb trail parameter included (via copy-and-paste of the URL displayed in their browser’s Location bar).
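
Here’s a minimal sketch of that second, JavaScript-based workaround; only the vcat parameter name comes from the REI example, while the data attribute and selector are hypothetical:

    // The HTML links that spiders crawl carry the clean, canonical URL; this
    // script appends the clickpath parameter only for real visitors, at
    // click time.
    document.querySelectorAll<HTMLAnchorElement>("a[data-clickpath]").forEach((link) => {
      link.addEventListener("click", () => {
        const url = new URL(link.href, window.location.origin);
        url.searchParams.set("vcat", link.dataset.clickpath ?? "");
        link.href = url.toString(); // spiders don't execute JavaScript, so they never see this
      });
    });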

This all might seem too hard. In fact, implementing breadcrumb navigation AT ALL may be too hard. If that’s the case for you, there’s still a potential path forward where you can reap some SEO benefit. I call the approach a “poor man’s breadcrumb”: you just link to the category that you are in, and that’s it. This approach worked well for our client Guild.com. They didn’t have time to code the necessary functionality for breadcrumb navigation, so this served us in a pinch. You’ll notice that all pages showing multiple products (example: page 10 of 31 of glass vases) link to their category page, like so:

[Screenshot: Guild.com “poor man’s breadcrumb” category link]

For this example, that equates to 31 pages voting for the Glass Vases category page with keyword-rich anchor text.

So now you have learned probably more than you ever wanted to know about breadcrumb navigation! To summarize all of the above: incorporate breadcrumbs into your online catalog and your checkout, try to make the anchor text keyword-rich, and don’t let the visitor’s clickpath leak into the URLs spiders crawl if you can at all help it.