The old adage, often attributed to a Chinese proverb, tells us, “A picture is worth a thousand words.” But John McCarthy, a noted mathematician and computer science pioneer, countered, “As the Chinese say, 1001 words is worth more than a picture.” Perhaps McCarthy was also a pioneer of search engine optimization (SEO)!
The use of interesting and relevant images and animations on a website can certainly elevate the user experience. But it is the wise webmaster who knows when and how to use these technologies to enhance both the user and the search indexing experience. A less than wise approach might cause a site to falter in the search engine results pages (SERPs), potentially all the way to the point of dropping out of sight.
Case in point: I have a friend (who shall remain anonymous) who owns a small service business. He has very limited resources and only a few employees, every one of them busy focusing on their daily job activities. This friend is not very web-savvy, so when he was approached by a graphic artist proposing to revamp his old website, he found her work to be visually striking and agreed to the job. Unfortunately, neither my friend nor his graphic artist vendor knew anything to speak of about SEO.
The graphic artist took his former tried-and-true (but perhaps a bit utilitarian) website and turned it into an eye-popping, graphic- and animation-intense wonderland of coolness. He was thrilled, the artist made a bit of extra dough, and all was right with the world. Right? Well, not so fast.
Over time, he began to notice that it was getting harder to find his site in search engines. He ignored it for a while, directing his existing customers to his same old URL directly (luckily for him, his business is not really dependent upon being found in search, so it was not a lights-out failure at first blush). When he eventually asked me for some suggestions on what he could do with his website to get a better rank, I took a quick peek at his new home page’s source code.
There was nothing there. By nothing, I do not mean that the home page had nothing in it. It had a TON of scripting code, references to images and script calls, but absolutely no text content or links. All of the content he now had (and it was sparse anyway after the redesign) was off-page, accessed solely through script calls. The content in the <body> tag was nothing but a table filled with spacer GIFs or background images. There was not one shred of text to be found. As the search engine web crawler (aka bot) saw things, the home page was literally devoid of any content. No wonder he dropped out of the SERPs! There was nothing to index, no links to other pages, and no keywords on which to base a relevant query!
While this is an extreme case, this does illustrate a key point I’ve made before. The bot cannot see what you and I see. It can’t easily read words in images or animations or anything controlled by script. So what do you do? First of all, don’t over-react. Don’t purge your site of all non-text content. Content variety, quality, and relevance are keys to drawing both new and repeat visitors. You can make your content more compelling with the right combination of visual interest and textual relevance. So yes, use images, even animations, if you wish. But always keep the end user in mind – if it adds value to the user’s experience on your site, that’s great. Just make sure it is also accessible by the bot!
The key here is not only using visual content in moderation (and never at the expense of textual content), but thinking strategically about it. Follow these recommendations for a more SEO-savvy use of visual content on your website.
Don’t put important text content into graphics. The search engine bot can’t see image file content as you and I can. To the bot, references to images are merely links to external, binary files in the HTML code, the contents of which can’t be indexed. Keep your page’s important content (the stuff you want indexed so it is discoverable by search engines) in text form within the webpage. Use images instead for illustrations, photos, designs, and other non-text elements on the page.
This especially goes for key business information, such as your company name, business address, and other critical info you want indexed (and customers to find). How many times have we all seen sites that use logo images containing all of their company information? And the owners of those sites wonder why they don’t show up in local search results. Tsk, tsk.
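As a quick sketch of what that looks like in practice, a storefront page might keep its critical business information in plain, crawlable text alongside the logo image rather than baked into it. (The business name, address, and file names below are invented for illustration.)

```html
<!-- Logo image carries the visual branding; alt text describes it -->
<img src="contoso-plumbing-logo.png" alt="Contoso Plumbing logo">

<!-- The business name, address, and phone number stay in indexable text -->
<address>
  Contoso Plumbing<br>
  1234 Main Street, Springfield<br>
  (555) 010-0199
</address>
```

The logo still delivers the visual impact, but the bot now has real text to index for local search.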
The same recommendation goes for animations in the form of Flash and Silverlight. These are great technologies to use to present videos, instructional information, entertainment, and much more. But don’t bury your useful text content within animations. Use them as enhancements of, not replacements for, your on-page, text-based content.
Now truth be told, the bots used by search engines are constantly evolving. They are getting smarter all the time. In the Bing for Webmasters white paper I wrote a few weeks back, I even highlighted an example of how Bing’s bot is able, in a limited form, to extract some textual content from Flash animations. So, yes, it can be done, and it is being done when possible. However, it’s not easy to do and it’s not always successful. As a result, it’s pure folly to rely on this emerging data extraction technology as your primary method of indexing information on your site. It’s too big a gamble for you to take. If you do so and the bot can’t extract your content from Flash, that animation will be nothing but a digital black hole, and your site will not get the content indexed that you surely want included.
Both Microsoft (for Silverlight) and Adobe (for Flash) offer information for optimizing data extraction from their respective technologies, and it is probably worth a look if you want to use them on your site. One last thought on this: if you must use these technologies and they must include text content, use relevant keywords for your page in that text, just in case data extraction is possible. Search bot technologies continue to improve over time, and you may be lucky enough to have usable content extracted. If so, make that content worthy of extraction. Just don’t depend on these technologies to be the only place those keywords appear on the page!
Use descriptive file names for images. Always look for creative ways to legitimately inject keywords into your webpages. While textual content in image files can’t be read by the bot, the bot can see the content in the tags you use. The <img> tag is a prime spot to inject a little keyword juice. Instead of naming your image files with a generic name like Art001.jpg, consider its contents and its intended use on the page to capture another keyword. If your site is for an Italian restaurant, wouldn’t an image file called lasagna-dinner.jpg be a more descriptive name? It would be for a bot.
And, of course, for every image file you embed in your pages, you need to add the alt attribute to the <img> tag using keywords describing the image within the context of the page. Some other cool tips for using image tags include:
- Limit the length of alt attribute value text to no more than 100-150 characters.
- Unless the image is of a company or brand logo, keep brand names out of alt attribute text (or at least push the brand name to the end). It’s better to use clear, clean, and keyword-rich explanations of what the image is about.
- Don’t begin alt attribute text with either the symbol “©” or the word “copyright”. The use of either at the beginning of the alt attribute value text will incorrectly indicate to the bot that the image is highly related to a copyright. Use your page’s keywords instead.
- Implement a lower case naming protocol in your code for images. XHTML requires that tags be written in lower case, so getting into the habit of using lower case in your tag code is a best practice.
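Putting these tips together, an image on the hypothetical Italian restaurant page mentioned above might be marked up like this. (The alt text and restaurant name are illustrative, not from a real site.)

```html
<!-- Descriptive, lower case file name plus concise, keyword-rich alt text;
     the brand name, if used at all, is pushed to the end of the alt value -->
<img src="lasagna-dinner.jpg"
     alt="Lasagna dinner with garlic bread at Mario's Trattoria">
```

Compare that to `<img src="Art001.jpg">` with no alt attribute at all: the first version hands the bot two keyword opportunities, the second hands it nothing.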
Don’t use script or images for site navigation. Bots can’t read script. They can’t execute it to see what happens. That’s why you need to be judicious in your use of script on your site. If the availability of important content is reliant on script execution, this is bad news in terms of getting the site effectively crawled by the bot. And if you’ve done the unthinkable and put all of your site navigation in sophisticated scripts, the bot won’t be able to follow any of those links to your other pages (which you’ve probably noticed by now). That means none of your site’s other pages will be crawled, and thus won’t make it into the index. And that’s a serious SEO problem if I’ve ever heard one!
Speaking of navigation, it’s also unwise, for the same reasons mentioned earlier, to use image files for navigational menu items. The bot can’t read the text in the images. So these important page description elements, which are the most significant identifiers of the content on your site’s pages (they are prime spots to use your best keywords), are left as empty holes to the bot. What a wasted opportunity to score some keyword relevance for those pages! Don’t let this happen to you! Using each page’s major keywords in those images’ alt attributes can offset this somewhat, but if you have a design choice before publishing the site, go with straight text here!
If you have any questions, comments, or suggestions, feel free to post them in our SEM forum. Until next time…
– Rick DeJarnette, Bing Webmaster Center