In Part 1 of this series of blog posts on site reviews, we covered the whats and whys of conducting a site review of your website to see if you are ranking where you want to be. If you have compelling content to share with your users, you want them to find your site! After installing and registering to use the many webmaster tools available on the Web (such as the Bing Webmaster Center tools and the Free SEO Toolkit!), you’re ready to start looking at what might be preventing you from ranking where you should be. In this post, we’ll look at several possible on-page issues.
Focus your aim on targeted keywords
Since search engines rank sites based in large part on their relevance to the keywords used in the search query, identifying the targeted keywords for each page of your site’s content is key to assessing how well those pages perform for those terms.
It’s brainstorming time. If you don’t already have any keyword and key phrase lists (really?), develop one for each content page, stat. If you already have such lists, challenge your existing assumptions to see if those words and phrases on your lists are still the best keywords for your site. For more information on creating your keyword list, check out our blog post The key to picking the right keywords (SEM 101).
There are many useful keyword research tools available on the Internet (some free, some not) to help you create and update these important lists. I suggest you consider adding the free keyword development and analysis tool from Microsoft adCenter called Microsoft Advertising Intelligence to your website toolbox.
As you develop your site’s keyword lists, consider whether it might be fruitful to make a run for the less competitive keywords found in the long tail of search rather than the highly competitive few keywords in your industry that are obvious to everyone (including your competitors). I discuss this concept in detail in the blog article Chasing the long tail with keyword research (SEM 101) (that post also discusses how to use the Microsoft Advertising Intelligence tool).
Once you have your keyword lists developed, test your assumptions. Open up several browser windows and go to the search engines your users will use (we suggest Bing for starters! ;-). Run queries on those keywords and see where your site falls in the results. Try different combinations of keywords. Each page of your site should use a few of the keywords from your lists, and every keyword in your collection should be used on at least one page (just avoid using every word on every page!). As part of this review process, look to identify unproductive pages or missed keywords.
In your query tests, note how each keyword affects where your site places in the rankings of each search engine. Are you at the top of the list? Are you on the first page of 10 results? If you’re not where you want to be, by identifying this problem, you’re now on the right track to fixing it.
Site review task: Develop lists of targeted keywords for each page of your site and check your pre-optimization performance for those words. Consider targeting some of the less-often used words in the long tail of search.
Content is coin of the realm
We’ve said it before, and yes, we’re saying it again. You need compelling, original content to rank well in search engine results pages (SERPs). And not just any content, but indexable content. It can’t be tied up in scripts or multimedia technologies that the search bots can’t read. Feed your favorite bot!
Look for opportunities to demonstrate your expertise in your field. Can you write? Develop articles about your experience. Give lists of recommendations (who doesn’t love a relevant Top 10 list?). Are you a photographer? Post your own beautiful images! Are you an artist? Upload samples of your artwork. And don’t stop with just yourself – let user-generated content (UGC), found in forums and blogs, help provide useful information to your readers and help drive valuable inbound links to your site, which in turn will contribute to improved site ranking. It becomes a virtuous circle. And that’s a good thing.
One key point: Remember those keywords we spoke of earlier? Those words need to be used in the text of your pages. If you’re posting non-text-based, multimedia content, be sure to add keyword-rich metadata to describe it. Searchers will never know what content is on your site if you don’t have a reference to it in text form, be it in <body>, <meta>, or alt text. For more information on developing the content on your site, see our blog post Are you content with your content? (SEM 101).
The key to developing your content successfully is to target human readers. Rich, informative content written for people is what the search bot looks for and values the most. However, specifically targeting search bots in an effort to artificially improve ranking will likely result in the site being deemed web spam. This can result in penalties that can range from lower placements in the SERPs to outright expulsion from the index. For more information on what sort of content is considered web spam, see our blog articles Eggs, bacon, spam, spam, and spam (SEM 101) and The pernicious perfidy of page-level web spam (SEM 101).
Site review task: Check the content on your pages to be sure you use the targeted keywords in the body text. Look for ways to add more original content to your site, including UGC-based pages.
Provide graceful degradation for media content
Why should you care about down-level users? Because search bots are among the users who cannot reliably access content delivered through rich media technologies such as Flash and Silverlight! And if they can’t see your super-cool content, what’s left for them to see? If the answer is “nothing,” that’s what will be indexed on your site. And that’s probably not what you intended.
Here’s a test: run a web browser with the following conditions:
- Disable automatic loading of images
- Uninstall Silverlight
- Uninstall Flash
Essentially, make your browser a text-only experience. (Alternatively, use a tool such as SEO-Browser to run this test.) Now visit your website. What do you see? Anything? If not, you have some additional content development work to do. What you need to do is add keyword-rich, text content that is presented as a secondary alternative when the primary, super-cool stuff can’t be seen. I covered this concept in depth in the blog post Illuminating the path to SEO for Silverlight.
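To make that fallback concrete, here’s a minimal sketch of one common pattern (the file names and copy are hypothetical): the markup nested inside the <object> element is what a text-only browser – and a search bot – will see when the plug-in content can’t load.

```html
<!-- Silverlight presentation with indexable down-level content;
     file names and copy are hypothetical -->
<object data="data:application/x-silverlight-2," type="application/x-silverlight-2"
        width="640" height="480">
  <param name="source" value="ClientBin/ShowroomTour.xap" />
  <!-- Fallback: what text-only browsers and search bots see -->
  <div>
    <h2>Corvette showroom video tour</h2>
    <p>Take a guided look at our restored 1963 Corvette Sting Rays,
       from engine bay to interior trim.</p>
  </div>
</object>
```

Note that the down-level content isn’t hidden trickery – it’s an honest, text-based description of the same material the rich media presents.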
Site review task: Check your pages employing advanced content presentations to be sure you provide alternative, useful, keyword-rich, text-based (indexable), down-level content.
Heads up on <head> tag content
An important area to examine on every page of your site during your site review is the meta content of the <head> tag, found at the beginning of every HTML page. The <head> tag of every page on your site should always include <title> and <meta> description tags.
The content of each <title> tag should be unique and directly relevant to the content on the page. The same goes for the <meta> description tag. And while you’re checking, look to see whether these tags use some of the targeted keywords for their respective pages (if not, fix that). Bots use the content in these tags to help define the contextual theme of a page, so the words and phrases you use there are weighed as keywords when assessing the page’s relevance. If you use generic, boilerplate text (or even no text at all) in these pivotal tags, you are missing a fundamental opportunity to tell the bot what your page is all about.
Not only do bots consider the content within these tags crucial; it provides significant value to human readers as well. When your site is listed in the SERPs, users will often see the content of these tags shown in the blue hyperlink and in the snippet text describing what content your page has to offer. If you’ve overlooked providing compelling and informative descriptions here, no one will ever know about the cool content you have on your site, and thus no one will click the SERP link to visit your site. What a wasted opportunity! Check out a past blog post on this subject at Head’s up on <head> tag optimization (SEM 101).
I have some key tips to remember when writing <title> and <meta> description tags:
- Use your page’s targeted keywords in these tags, but always write the text so it’s logically readable by a person.
- At least for search bots, word order matters. Use your most important keywords first.
- Omit the obvious, generic stuff, especially from the beginning of the tags. Instead, write a concise, pertinent description of the content on the page.
- Never stuff these tags as a keyword dump.
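Putting those tips together, here’s a sketch of what a well-formed <head> might look like for a hypothetical page (the business name and keywords are invented for illustration):

```html
<head>
  <!-- Hypothetical example: keywords first, readable by a person, no boilerplate -->
  <title>1963 Corvette Sting Ray Restoration Services | Apex Auto Restorations</title>
  <meta name="description"
        content="Full-service 1963 Corvette Sting Ray restoration: engine rebuilds,
                 bodywork, and factory-correct interiors." />
</head>
```

Notice that both tags lead with the page’s targeted keywords, read naturally, and aren’t stuffed.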
Site review task: Check the <title> and <meta> description tags of each page to be sure they have unique and targeted keyword content in them.
Using your head(er tags)
Another helpful page element to consider for keyword usage is the header tag used within your <body> content – specifically the <h1> tag. You should have not zero, not two or more, but just one <h1> tag in each content page. The text in that tag typically serves as the top-of-the-visible-page title (not to be confused with the <title> tag in the page code) declaring the theme of your page’s content. Given the tag’s prominence in defining the theme of the page for human readers, its text content has SEO value, so use your targeted keywords here as well. You can also use additional, lower-level header tags, such as <h2> and so on, but these tags are less significant in terms of SEO value.
Site review task: Check your content pages to be sure you are using one <h1> tag in the <body> text and that it uses keywords targeted for that page.
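As a quick sketch (the page content here is invented for illustration), a well-structured <body> uses exactly one <h1> carrying the page’s targeted keywords, supported by <h2> subheadings:

```html
<body>
  <!-- Exactly one h1 per page, echoing the page's targeted keywords -->
  <h1>1963 Corvette Sting Ray Restoration</h1>

  <!-- Lower-level headers organize subtopics; they carry less SEO weight -->
  <h2>Engine rebuild</h2>
  <p>We machine and rebuild the original small-block engine to factory specification.</p>

  <h2>Interior and trim</h2>
  <p>Seats, carpet, and dash panels are restored with period-correct materials.</p>
</body>
```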
Anchor text carries keyword weight, too
Just as the <h1> tag helps define the theme of a page to the bot, anchor text helps define the theme of a linked page. When you are cross-linking to other pages on your site, don’t waste an opportunity to associate a keyword or two with an entire page! Use keyword-rich, descriptive text in your link. Never use meaningless text such as “click here” for anchor text, which conveys nothing about the content of the referenced page.
Site review task: Check your pages to be sure you provide descriptive, keyword-rich text in your anchor (link) text, especially when cross-linking to other pages on your own site.
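A quick before-and-after sketch (the URL is hypothetical) shows the difference:

```html
<!-- Weak: tells the bot (and the reader) nothing about the target page -->
<a href="/services/engine-rebuild.html">Click here</a> to learn more.

<!-- Better: keyword-rich anchor text describing the linked page -->
Learn more about our <a href="/services/engine-rebuild.html">Corvette engine rebuild services</a>.
```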
<img> is everything (if you include alt text)
Do you include images in your page content? Do you use images to convey important text-based content that you want your visitors to read? (We often see vitally important text information, such as the company name or the business address, offered only within images!) If so, that content may be missed by search bots. They cannot read image content the way we can. To help bots understand what is being shown, always use alt attributes in your <img> tags. Using alt attributes enables you to associate your targeted keywords with the content images on your page.
I have a few tips on using images:
- Don’t bother to add alt attribute text to non-content images, such as those that fill space or simply convey color.
- Use descriptive file names (such as corvette.jpg rather than the generic and meaningless image001.jpg) to further convey meaning about the image for the search bot.
- Use the lowest image resolution needed to minimize file size. The goal is to strike a balance between reasonable image quality and page load time.
- For pages that require high-resolution images (especially in large numbers), use linked, low-resolution thumbnails as the default images.
- Never use images as a text replacement for important content you want indexed.
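Here’s a short sketch pulling those tips together (file names and dimensions are hypothetical): a content image with a descriptive file name and keyword-rich alt text, presented as a low-resolution thumbnail linked to the full-size file, plus a decorative image with an empty alt attribute:

```html
<!-- Content image: descriptive file name, keyword-rich alt text,
     low-res thumbnail linked to the high-res original -->
<a href="/gallery/corvette-1963-restored.jpg">
  <img src="/gallery/thumbs/corvette-1963-restored.jpg"
       alt="Restored 1963 Corvette Sting Ray, front three-quarter view"
       width="200" height="133" />
</a>

<!-- Decorative spacer: empty alt attribute, nothing worth indexing -->
<img src="/img/divider.gif" alt="" width="600" height="2" />
```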
For more information on using images in your site, see the blog post Images and Flash and script, oh my! (SEM 101).
Site review task: Review your pages for the use of images as content to be sure you provide useful, keyword-rich (indexable) text in <img> tag alt attributes.
If you have any questions, comments, or suggestions, feel free to post them in our SEM forum. Coming up next: We’ll look at some site-wide issues that can prevent your site from ranking well. Later…
– Rick DeJarnette, Bing Webmaster Center