Web 2.0 Expo: SEO for Web Development Presentation

I would like to thank everyone for taking the time to check out my session at Web 2.0 Expo here in New York City. I've included a link to my slides below, along with answers to the questions posted online that I did not get a chance to answer during the talk. Please feel free to contact me if you have any questions.

Advanced SEO for Web Developers

Download Presentation: Web_20_NYC_2008.pptx (6.88 MB)

Here are answers to the questions asked online after the session:

How important are XHTML and 508 conformance to search engine optimization? (kpande)

For folks who might not know, XHTML is one of several standard markup languages for building web pages, and Section 508 compliance refers to a standard created by the U.S. government in 1998 to ensure that websites are accessible to people with disabilities.

I generally recommend that developers validate their websites against some standard (you can use this tool: http://validator.w3.org/), but which standard you choose shouldn't make a whole lot of difference (within reason!). Validation helps you avoid a lot of simple syntax errors that can make your pages difficult for search engines to parse. On the webmaster tools development team, we strive for 100% compliance with XHTML as our own standard.
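
For example, here is a minimal page skeleton (just a sketch, assuming you choose XHTML 1.0 Strict; the title and text are placeholders) that passes the W3C validator linked above:

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
      "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
      <head>
        <title>Example page</title>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
      </head>
      <body>
        <h1>Example page</h1>
        <p>Every element is lowercase and properly closed, and all attributes are quoted.</p>
      </body>
    </html>

Starting from a skeleton that already validates makes it much easier to keep the rest of the site clean as it grows.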

508 compliance is a little more complex; here's the section of the standard that deals specifically with websites. These guidelines are all good design principles, and they generally align directly with best practices in SEO. IE and Firefox also help enable many of these scenarios on your behalf, so the more you can do to use HTML semantically and comply with industry best practices, the better. Here's a list of recommendations from reading the spec:

  • Alt text for Images – this should describe what's in the image, not simply list keywords for search engines (see the sketch after this list).
  • Don't override user preferences – things like absolute font sizes can make it harder for the browser to resize text. This doesn't necessarily impact SEO, but it's still good for your customers and an industry best practice. I recommend Bulletproof Web Design for more information.
  • Beware of JavaScript – folks sometimes use JavaScript to enable non-standard behavior in form elements and other on-page interactions. The user should be able to accomplish the same tasks without that JavaScript executing.
  • Page should be readable without a stylesheet – pretty self-explanatory, but this is a test I do on every website I review.
  • Text-only, down-level experience for all functionality – if your site is built primarily in AJAX, Flash, or Silverlight, you should also provide a down-level experience that is more accessible. We recommend you deliver this experience at the same URL so you don't create any duplicate content issues or increase the complexity of your site.
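
To make the first two bullets concrete, here is a small sketch (the image path, alt text, and sizes are placeholders): descriptive alt text on an image, and relative font sizes so the browser can still honor the user's preferred text size.

    <!-- Alt text describes what the image shows, rather than stuffing keywords -->
    <img src="/images/pike-place-market.jpg"
         alt="Produce stands at Pike Place Market on a Saturday morning" />

    <style type="text/css">
      /* Relative units (%, em) respect the user's text-size preference;
         absolute rules such as font-size: 11px can block resizing in older browsers */
      body { font-size: 100%; }
      p    { font-size: 1em; }
    </style>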

Is there such a thing as too much content on a single domain from a PR/SEO standpoint? (kpande)

Not if it is unique, high-quality content! Search engines have a voracious appetite for good content, so you will generally not run into any issues by having a really deep site with lots of great content. That said, there are a few things you can do for content-rich sites to ensure you're getting the best crawling from search engines:

  • Make sure you have a good information architecture – this will help both your customers and search engines understand how your site is structured and make it easy to navigate. A good reference book on this subject is available from O'Reilly: http://oreilly.com/catalog/9780596000356/. A general rule of thumb for architecting a website is to start broad and get more specific. Amazon.com is a great example of this: their home page allows access to all of the various departments of their website, and as you get deeper the information becomes more specific, e.g. Home > Books > Computer & Internet > Web Development > Information Architecture.
  • Sitemaps – it is also important to ensure that both search engines and users can easily navigate through all of the content on your website. By creating a page on your site with links to all of its major sections, you make it easier to navigate. You can also create an XML sitemap file using the Sitemap Protocol and provide the search engines with a comprehensive list of URLs for your website (a sketch follows this list).
  • Subdomains vs. subfolders – another common question is whether subdomains or subfolders should be used to organize content. The answer is that it doesn't really matter; choose the solution that best fits your technology stack and capabilities.
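
To illustrate the Sitemaps point above, here is a minimal sitemap file following the Sitemap Protocol (the URLs and dates are placeholders). You would typically place it at the root of your site and either reference it from robots.txt or submit it through the search engines' webmaster tools:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2008-09-18</lastmod>
        <changefreq>daily</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/books/web-development/</loc>
        <lastmod>2008-09-10</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>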

Can you elaborate on the pros and cons of the <noscript> tag?

The <noscript> tag can be a great way to provide both users and search engines with some information about your page when JavaScript is not available. The contents of the noscript tag could describe what the JavaScript element does, along with a link to a text-only version of the same content. For sophisticated AJAX applications, I would recommend using progressive enhancement techniques like Hijax.
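
Here is a rough sketch of that combination (the URL, element ids, and markup are made up for illustration): a plain link points at a full, crawlable page; JavaScript, when available, hijacks the click and loads the same content inline; and a noscript block tells non-JavaScript visitors what to expect.

    <!-- Works without JavaScript: the href points to a real, crawlable page -->
    <a href="/products/widget/reviews" id="reviews-link">Read customer reviews</a>

    <noscript>
      <p>Reviews open on a separate page when JavaScript is turned off.</p>
    </noscript>

    <div id="reviews"></div>

    <script type="text/javascript">
      // Progressive enhancement (Hijax): intercept the click and fetch the
      // same content with XMLHttpRequest instead of a full page navigation.
      document.getElementById('reviews-link').onclick = function () {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', this.href, true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById('reviews').innerHTML = xhr.responseText;
          }
        };
        xhr.send(null);
        return false; // cancel the default navigation to the full page
      };
    </script>

Because the href always points at real content, users without JavaScript and search engine crawlers get the same information the AJAX version provides, at the same URL.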

— Nathan Buggia, Webmaster Center Team