On Tuesday I gave two presentations at the Shop.org Annual Summit. First was “Natural Search Tactics for the Retailer” with fellow panelists Ken Jurina from Epiar, Jenny Schlueter from Dell, and Ian McAnerin from McAnerin International. It was a very tactical session – focused on tools, tips and techniques. Ken covered keyword research, Jenny covered content optimization, Ian covered technical optimization, and I covered link building.
Here are a few key points from the session…
- Mine multiple data sources for keyword data, such as Google Insights for Search, WordTracker, Yahoo Panama Keyword Tool, Google Keyword Tool, MSN Keyword Forecast, Trellian Keyword Discovery, Google Webmaster Tools, Wordze, Google Suggest, Google Traffic Estimator, Google Trends, Hitwise, NicheBOT, SEO Book Keyword Tool, GoodKeywords.com, internal site search logs, referrer logs, and PPC broad match. I’d add to Ken’s list: comScore Marketer and WordPot.
- Determine keyword difficulty with the SEOmoz Keyword Difficulty Tool, InternetMarketingNinjas Top Ten Analysis Tool, and KEI from KeywordDiscovery or WordTracker.
- Useful link analysis and link building tools include Back Link Analyzer, the Thumbshots Ranking tool, TouchGraph, SEOChat PageRank Lookup, and the SEO for Firefox extension.
- Types of links that are likely to get discounted include: reciprocal links, affiliated sites (on the same IP range or hostname), footer links (at the bottom of the page), site-wide links, links contained on a page called links.htm / links.asp, and links with the exact same anchor text. Remember that the more links on the linking page, the less PageRank you’ll get.
- Review your existing links using the Back Link Analyzer and contact those webmasters who link to you with suboptimal anchor text. Focus on the highest value links where you have rapport or influence with the webmaster.
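To illustrate what such an audit looks for (this sketch is mine, not part of the panel's toolset — the sample HTML and the `example.com` domain are hypothetical), a short script can pull the anchor text of every link pointing at your domain from a linking page, so you can spot low-value anchors like "click here":

```python
from html.parser import HTMLParser

class AnchorTextAuditor(HTMLParser):
    """Collect (href, anchor text) pairs for links pointing at a target domain."""
    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.links = []            # list of (href, anchor_text) tuples
        self._current_href = None  # set while inside a matching <a>...</a>
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if self.target_domain in href:
                self._current_href = href
                self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append((self._current_href, "".join(self._text_parts).strip()))
            self._current_href = None

# Hypothetical snippet from a page that links to example.com:
page = ('<p><a href="http://example.com/">click here</a> and '
        '<a href="http://example.com/widgets">blue widgets</a></p>')
auditor = AnchorTextAuditor("example.com")
auditor.feed(page)
for href, text in auditor.links:
    print(href, "->", text)
```

Here the first link would be flagged for outreach: "click here" passes no keyword relevance, while "blue widgets" does.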
- If you have multiple servers, rotating IP addresses (load balancing) can make it look to Google like you have duplicate copies of your site, and edge computing can cause geolocation issues. The fix is to detect spiders and send them to a canonical site version.
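A minimal sketch of that fix, assuming a hypothetical canonical hostname and a short list of crawler User-Agent tokens (the real implementation would live in your load balancer or web server config):

```python
# When a request comes from a known crawler, 301-redirect it to one canonical
# hostname so every crawl sees a single copy of the site. The hostnames and
# bot tokens below are illustrative assumptions, not a definitive list.

SPIDER_TOKENS = ("googlebot", "slurp", "msnbot")  # substrings of common crawler User-Agents
CANONICAL_HOST = "www.example.com"                # hypothetical canonical hostname

def canonical_redirect(host, user_agent, path):
    """Return a (status, location) pair for spiders on a non-canonical host,
    or None if no redirect is needed."""
    is_spider = any(tok in user_agent.lower() for tok in SPIDER_TOKENS)
    if is_spider and host != CANONICAL_HOST:
        return (301, f"http://{CANONICAL_HOST}{path}")
    return None

print(canonical_redirect("www2.example.com", "Mozilla/5.0 (compatible; Googlebot/2.1)", "/page.htm"))
print(canonical_redirect("www.example.com", "Mozilla/5.0 (compatible; Googlebot/2.1)", "/page.htm"))
```

Ordinary visitors are untouched, so load balancing keeps working; spiders consolidate their crawl (and PageRank) onto one hostname.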
- Use the country code TLD for a country, use a subdomain for language or major group, and use subdirectories for topics. e.g. language.company.ccTLD/topic/page.htm
- A gTLD (.com, .net) is almost always geolocated via IP address. A ccTLD (.ws, .la, .tv) overrides IP geolocation. The duplication problem does not affect clearly geolocated sites. Always declare the language and character encoding on your web pages.
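One way to make both declarations explicit in the page itself — a sketch in HTML 4.01 style, with French as a stand-in language:

```html
<html lang="fr">
<head>
  <!-- Declare the character encoding so parsers don't have to guess -->
  <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
  <!-- Restate the content language as a meta header -->
  <meta http-equiv="Content-Language" content="fr">
</head>
</html>
```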
- Scripts that build links on the fly, AJAX, and Flash are bad for SEO. An Iframe is treated as a separate page; includes are better. Use a spider simulator such as SEO-Browser to test your site.
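To make the point concrete, here is a rough sketch of what a text-only spider simulator does (my illustration, not how SEO-Browser itself is built — the sample page is hypothetical): it keeps visible text and plain `<a href>` links, while script-built links and Flash content simply never show up.

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Roughly what a text-only crawler sees: visible text plus <a href> links."""
    IGNORED = {"script", "style"}  # content a crawler won't index as text

    def __init__(self):
        super().__init__()
        self.text, self.links = [], []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.IGNORED:
            self._skip_depth += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in self.IGNORED and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0 and data.strip():
            self.text.append(data.strip())

# Hypothetical page: one plain link, one built by JavaScript, one Flash object.
page = """<body><a href="/widgets.htm">Widgets</a>
<script>document.write('<a href="/hidden.htm">Hidden</a>');</script>
<object data="nav.swf">Flash nav</object></body>"""
view = SpiderView()
view.feed(page)
print(view.links)   # only the plain HTML link survives
```

The JavaScript-generated link and the Flash navigation are invisible to this "spider", which is exactly the problem the bullet above warns about.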
- Search engines strip most HTML code out of a document before parsing it, so most HTML validation errors do not affect rankings. Exception: some errors, such as a missing “<”, can kill the indexing of a page. Best practice is still to validate your code.
The PowerPoint, which includes all four presentations, is available for download here.
The second session I presented was with Amy Africa; it was a site clinic where Amy and I did impromptu critiques of audience members’ websites. Amy covered usability and conversion; I covered SEO. It was a lot of fun. There were no PowerPoints for that session.