Is your site ranking rank? Do a site review – Part 3 (SEM 101)

Let's continue our run-down of issues to consider in a site review. In Part 1 of this series, we looked at the whats and whys of doing a site review, and covered baselining pre-optimization performance and gathering tools. Part 2 covered important but often overlooked on-page issues that, if not properly addressed, can prove detrimental to a site's performance, both for search engine ranking and for usability and discoverability. In this post, let's examine site review issues that are more site-wide in scope.

What is the meaning of this (file name)?

Look at your URLs - what do they say about the page's content? Do you use human-friendly page file names or globally unique identifier (GUID)-based gibberish? While bots may have no particular grievance with irrelevantly named pages in URLs, you may be missing an opportunity. Using one or two of your targeted keywords in a file name can help associate those words with your page in the eyes of the search bot. Of course, simply crunching those keywords all together may not be as helpful as you want. CamelCasing words in file names is of no value to SEO, and a bot may not be able to reliably parse individual keywords out of a concatenated string of letters. So how do you assist with the parsing process so that you get full value from any keywords used in a file name (and thus in a URL)?

Using underscores in a file name is not necessarily a good idea. Old-style programming code used underscores as a concatenating device, so while an underscore-separated name may be parsable to the human eye, that may not be the case for the search engine. Besides, since it's common practice to format hyperlinks in text with underlines, underscore characters in link text may mistakenly appear to be spaces. And forget about using space characters. But there is a reasonable solution you can use.

Adding hyphens between words in a file name, as long as the technique is used in moderation, is perfectly acceptable. Moderation is suggested because of the inclination of some folks to push on boundaries well into the realm of web spam. But when used in moderation, hyphens let both human readers and search bots parse the individual words just fine. You might even try this for your next domain name (but shorter is usually better there, so if you find yourself needing more than one or two hyphens, rethink the proposed domain name and find something shorter).
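
If a content management system or build script generates your page file names, a small helper can apply the hyphenation rule consistently. Here's a minimal sketch in Python; the slugify name, the five-word cap, and the .htm extension are just illustrative assumptions:

```python
import re

def slugify(title, max_words=5):
    """Turn a page title into a short, hyphenated, keyword-friendly file name."""
    words = re.findall(r"[a-z0-9]+", title.lower())  # drop punctuation, lowercase everything
    return "-".join(words[:max_words]) + ".htm"      # hyphens keep each keyword parsable

print(slugify("Ancient Roman Coin Collecting Guide"))
# ancient-roman-coin-collecting-guide.htm
```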

Site review task: Review the file-naming scheme used for your page files to see if there are opportunities to use your targeted keywords. If you use keyword phrases, employ hyphens to separate the text into discrete keywords.

Absolutely ban relative links

Are your intra-site links relative or absolute? Either form technically works just fine in the eyes of a search engine, but the format of your links can make a big difference to you for other reasons. Content security alone may be reason enough for some folks to go with absolute links.

The difference between the two forms is whether the reference to the web server of origin is always maintained within the link URL. A relative link, such as the anchor tag href attribute value "/sales/today.htm", assumes the server's domain name and provides only the portion of the link's path beyond the website's root (if even that much of a path), whereas an absolute link provides the full URL.

The use of relative links can become a problem when page content is used out of the context of the original webpage (and thus from a different URL). When used out of their original context, relative links will be broken and the source content they refer to will not be available. This can be a problem when users cut and paste content from your site into another document, such as an email, or, sadly, when they screen scrape it into their own website. If you use absolute links, the links will still resolve back to your site. Interestingly, it's pretty common that folks who are too lazy to create their own content (meaning those who instead screen scrape - aka steal - that content from others) are also too lazy to check the inline links in that content. It's a bit of poetic justice to at least get inbound link credit from those plagiaristic sites back to your own work! (In one recent case, our team used the absolute links to Site A embedded in content found on Site B as evidence for Site A's claim that Site B had stolen its original content. As a result, the offending Site B was penalized for improperly duplicating the content!)
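
You can see the difference with Python's standard urllib.parse.urljoin, which resolves a link against a base URL much the way a browser would (the domain names below are made up for illustration):

```python
from urllib.parse import urljoin

relative_href = "/sales/today.htm"
absolute_href = "https://www.example.com/sales/today.htm"

# In its original context, the relative link resolves correctly...
print(urljoin("https://www.example.com/sales/index.htm", relative_href))
# https://www.example.com/sales/today.htm

# ...but if the same markup is scraped into another site, it now points at the wrong host.
print(urljoin("https://www.copycat-site.test/stolen/page.htm", relative_href))
# https://www.copycat-site.test/sales/today.htm

# The absolute link points back to the original site no matter where it ends up.
print(urljoin("https://www.copycat-site.test/stolen/page.htm", absolute_href))
# https://www.example.com/sales/today.htm
```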

To be sure your links always refer back to your website and your linked content is always available (and thus not context-dependent), use the entire URL as the link path -- an absolute link. Just remember this one caveat: if you move a content page to a different directory on your site, you'll need to update all of your hard-coded URL links going to that page (of course, the outbound links from that moved page will remain valid!).

Absolute links can also help establish the preferred URL for your site (a process known as canonicalization). Some webmasters use multiple domain names with fully populated, identical content pages, which can lead to duplicate content confusion and ranking dilution. Always using absolute links to the primary source URL across your site will contribute to canonicalizing the content. In addition, using relative links when you have multiple URLs pointing to the same content, such as with HTTP: and HTTPS:, can also lead to duplicate content confusion for the search engine (we'll get deeper into canonicalization issues later in this series).

Site review task: Review your intra-site links, including your site navigation scheme, to be sure they are formatted as absolute links.
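
One way to automate that check is to scan your markup for href values that lack a scheme and host. Here's a rough sketch using only Python's standard library (the sample markup is hypothetical):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class RelativeLinkAuditor(HTMLParser):
    """Collect href values that lack a scheme and host (i.e., relative links)."""
    def __init__(self):
        super().__init__()
        self.relative_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        parsed = urlparse(href)
        if href and not (parsed.scheme and parsed.netloc):
            self.relative_links.append(href)

auditor = RelativeLinkAuditor()
auditor.feed('<a href="/sales/today.htm">Today</a>'
             '<a href="https://www.example.com/about.htm">About</a>')
print(auditor.relative_links)   # ['/sales/today.htm']
```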

Getting from here to there within your site

How do users navigate between the pages on your site? Do you provide a clearly understandable intra-site navigation scheme? Does your site navigation rely on scripted processes or linked images (neither of which the bot can see)? Is every page on your site linked to at least one other page? Do you provide an HTML sitemap page linking to every page of your site? These are some of the questions you need to ask about your site.
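
On the sitemap question in particular, even a simple HTML sitemap page can be generated from whatever list of pages your site already maintains. A minimal sketch, assuming a hand-maintained page list:

```python
from pathlib import Path

# Hypothetical list of every page on the site, mapped to a human-readable link label.
pages = {
    "/index.htm": "Home",
    "/coins/roman.htm": "Ancient Roman coins",
    "/coins/greek.htm": "Ancient Greek coins",
    "/about.htm": "About this site",
}

items = "\n".join(f'  <li><a href="{url}">{label}</a></li>' for url, label in pages.items())
Path("sitemap.htm").write_text(f"<h1>Site map</h1>\n<ul>\n{items}\n</ul>\n", encoding="utf-8")
```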

Remember back in Part 2 of this series when you were asked to look at your site with a text-only browser (either by configuring your browser to disable images, script, and technologies like Silverlight and Flash, or by using a tool like SEO-Browser) to see your site the way search bots do? Repeating this exercise here will reveal whether your current intra-site navigation scheme works to the benefit or the detriment of search crawlers (and thus of your indexing potential). Can you navigate to other pages on your site in this view? If your site uses images as links, do they include keyword-rich alt text attributes? If your site's navigation design accounts for down-level users, that's also very useful to search bots. Otherwise, unless your other pages receive deep, inbound links from external websites, they may never be discovered by the bots. And if they can't be seen, they can't be crawled and indexed.
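
Beyond eyeballing the text-only view, you can sketch a quick audit of the navigation markup itself. The example below, built on Python's standard-library HTML parser and fed hypothetical markup, separates links a bot can follow from script-only links and flags image links that lack alt text:

```python
from html.parser import HTMLParser

class NavAuditor(HTMLParser):
    """Flag navigation a text-only crawler can't follow, plus image links missing alt text."""
    def __init__(self):
        super().__init__()
        self.followable = []    # plain href links a bot can follow
        self.script_only = []   # javascript: pseudo-links, invisible to a bot
        self.missing_alt = []   # image-based links with no alt text to convey keywords
        self._current_href = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            href = attrs["href"]
            if href.lower().startswith("javascript:"):
                self.script_only.append(href)
            else:
                self.followable.append(href)
                self._current_href = href
        elif tag == "img" and self._current_href and not attrs.get("alt"):
            self.missing_alt.append(self._current_href)

    def handle_endtag(self, tag):
        if tag == "a":
            self._current_href = None

auditor = NavAuditor()
auditor.feed('<a href="/coins/roman.htm"><img src="nav-roman.gif"></a>'
             '<a href="javascript:openMenu()">Browse</a>')
print(auditor.followable)    # ['/coins/roman.htm']
print(auditor.script_only)   # ['javascript:openMenu()']
print(auditor.missing_alt)   # ['/coins/roman.htm']
```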

Site review task: Review your intra-site navigation scheme for down-level visibility, comprehensiveness, and keyword usage.

Link to yourself (when appropriate)

Do you have written content in the body text of your pages that references content found on other pages of your site? You should. Such references give you logical ways to link to that content inline within your pages. Optimally, those inline body text links will use the relevant, targeted keywords you established in Part 2 (see how I did that? You should do the same).

One thing to note, however: when creating links to content that is of no value to searchers, such as sign-in pages, shopping carts, "print this page" links, and the like, add the rel="nofollow" attribute to the anchor (<a>) tag of such links to prevent the crawling and possible indexing of valueless content. Make every link count, and exclude those that don't.
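
A small script can also catch low-value links that slipped through without the attribute. Here's a rough sketch; the list of "no value" path fragments is just an assumption for you to adapt to your own site:

```python
from html.parser import HTMLParser

# Hypothetical path fragments that identify pages with no value to searchers.
NO_VALUE_PATTERNS = ("signin", "login", "cart", "print")

class NofollowAuditor(HTMLParser):
    """List links to low-value pages that are missing rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.needs_nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = (attrs.get("href") or "").lower()
        rel = (attrs.get("rel") or "").lower()
        if any(pattern in href for pattern in NO_VALUE_PATTERNS) and "nofollow" not in rel:
            self.needs_nofollow.append(href)

auditor = NofollowAuditor()
auditor.feed('<a href="/cart/checkout.htm">Checkout</a>'
             '<a rel="nofollow" href="/signin.htm">Sign in</a>')
print(auditor.needs_nofollow)   # ['/cart/checkout.htm']
```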

Site review task: Review your content pages for relevant opportunities to link inline to other pages on your site. Use the rel="nofollow" anchor tag attribute for links to content you don't need indexed.

Let relevance drive your outbound linking strategy

Links matter. Links to other pages represent a de facto endorsement of the page to which you are linking. However, the relevance of the pages you link to matters as well. If the theme of your site is collecting ancient coins, would it make sense for you to link out to dozens of sites that promote herbal body enhancements and instant college diplomas? (No.) It's not likely this'll make sense to your users, and thus it doesn't make sense to search engines, either (this holds true for inbound links coming from those irrelevant sites, too). Outbound links are good things to have, but they should be relevant to the subject of the pages they appear on. If hundreds or even thousands of non-relevant links appear on a website, that can look like link-level web spam, and that's not a good thing. For more information on link-level web spam, see the blog post The liability of loathsome, link-level web spam (SEM 101).

Remember: if the outbound link is valuable and relevant to your users, it'll be considered valuable to search engines.

Site review task: Check your external, outbound links for relevance.

Gimme some inbound links (but banish link spam)

We previously discussed that outbound links on your site represent your endorsement of the sites you link to, so they are often considered to carry value with search engines. Inbound links from other sites to yours carry value in the same way. And, like outbound links, the relevance of the links does matter. Some webmasters try to quickly (aka fraudulently) build search engine ranking cachet by buying inbound link value from spammy link farms.

Paid link farms aren't helpful here, and in fact, usually prove to be detrimental to ranking if the link farm is determined to be a purveyor of web spam. To truly create value for your site, you need to do the hard work of creating valuable content for your target audience, followed by evangelizing your site to that targeted community of users with whom you wish to connect. Ask for links back to your site from webmasters running legitimate and respected sites on the same subject matter. That way you build up more and better legitimate (aka organic) inbound links. These will prove to be the most valuable links you can get.

For more information on earning valuable inbound links, see the blog post Link building for smart webmasters (no dummies here) (SEM 101).

Another way to gain visibility while you build up inbound links is to develop a Pay-Per-Click (PPC) search advertising campaign. This can earn your site instant visibility in the search engine results pages (SERPs) when your organic ranking is not yet up to snuff. To be clear, buying PPC ads will have no effect on your site's organic ranking, but PPC ads do show up on Page 1 of the SERP, and if your ultimate goal is to increase conversions, well-designed PPC ads can contribute mightily to that effort. For more information about starting a PPC ad campaign in Bing, check out Microsoft Advertising.

Site review task: Check your inbound links to see if they come from paid link farm services. If you have inbound links from a non-relevant link farm or link exchange, which are typically seen by search bots as deceptive attempts to artificially elevate the rank of a website, disengage from that service and check to see if any penalties have been implemented against your site. For more information on penalties, see the blog post Getting out of the penalty box.

If you have any questions, comments, or suggestions, feel free to post them in our SEM forum. Coming up next: We'll continue our look at some site-wide issues that interfere with optimal ranking. Until next time...

-- Rick DeJarnette, Bing Webmaster Center