Web 2.0 Expo: SEO for Web Development Presentation

I would like to thank everyone for taking the time to check out my session at Web 2.0 Expo here in New York City. I’ve included a link to my slides below, along with answers to all the questions that were posted that I did not get a chance to answer during the talk. Please feel free to contact me if you have any questions.

 

Advanced SEO for Web Developers

Download Presentation: Web_20_NYC_2008.pptx (6.88 MB)

Here are answers to the questions asked online after the session:

How important are XHTML and 508 conformance to search engine optimization? (kpande)

For folks who might not know, XHTML is one of several standard formats for creating websites, and 508 compliance refers to a U.S. government standard, created in 1998, to ensure that websites are accessible to people with disabilities.

I generally recommend developers validate their websites against some standard (you can use this tool: http://validator.w3.org/), but which standard you choose shouldn’t make a whole lot of difference (within reason!). Validating helps you avoid simple syntax errors that may make your site difficult for search engines to render. On the Webmaster Tools development team, we strive for 100% compliance with XHTML as our own standard.
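To make this concrete, here is a minimal sketch of a page that validates as XHTML 1.0 Strict (the title and content are placeholders):

```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
<head>
  <!-- XHTML requires a title element inside head -->
  <title>Example page</title>
</head>
<body>
  <!-- All elements lowercase, attributes quoted, every tag closed -->
  <p>Hello, world.</p>
</body>
</html>
```

Running a page like this through http://validator.w3.org/ confirms that every element is lowercase, properly nested, and explicitly closed.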

508 compliance is a little more complex; here’s the section that deals specifically with websites. These guidelines are all good design principles, and they generally align directly with best practices in SEO. IE and Firefox also help enable many of these scenarios on your behalf, so the more you can do to use HTML semantically and comply with industry best practices, the better. Here’s a list of recommendations from reading the spec:

  • Alt text for images – this should describe what’s in the image, not be a list of keywords for search engines.
  • Don’t override user preferences – things like using absolute font sizes can make it harder for the browser to resize text. This doesn’t necessarily impact SEO, but it is still good for your customers and an industry best practice. I recommend Bulletproof Web Design for more information.
  • Beware of JavaScript – folks sometimes use JavaScript to enable non-standard interactions in form elements and on-page interactions. The user should be able to achieve the same types of interactions without that JavaScript executing.
  • Page should be readable without a stylesheet – pretty self-explanatory, but this is a test I do on every website I review.
  • Text-only, down-level experience for all functionality – if your site exists in an extreme AJAX, Flash, or Silverlight form, you should also provide a down-level experience that is more accessible. We recommend you deliver this experience at the same URL so you don’t create any duplicate content issues or increase the complexity of your site.
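As a rough sketch of a few of these recommendations in markup (the file names and the showReport function are made up for illustration):

```html
<!-- Alt text describes the image, rather than listing keywords -->
<img src="traffic-chart.png" alt="Bar chart of monthly site traffic for 2008" />

<!-- Relative font sizes respect the user's browser settings -->
<style type="text/css">
  body { font-size: 100%; }  /* avoid absolute sizes like 12px */
  h1   { font-size: 1.5em; }
</style>

<!-- A real link that still works if the JavaScript never executes -->
<a href="/reports/2008.html" onclick="showReport(); return false;">
  View the 2008 report
</a>
```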

Is there such a thing as too much content on a single domain from a PR/SEO standpoint? (kpande)

Not if it is unique, high-quality content! Search engines have a voracious appetite for good content, so you will generally not run into any issues by having a really deep site with lots of great content. That said, there are a few things you can do on content-rich sites to ensure you’re getting the best crawling from search engines:

  • Make sure you have a good information architecture – this will help both your customers and search engines understand how your site is structured and make it easy to navigate. A good reference book on this subject is available from O’Reilly: http://oreilly.com/catalog/9780596000356/. A general rule of thumb for architecting a website is to start broad and get more specific. Amazon.com is a great example of this: the home page allows access to all of the various departments of the website, and the information becomes more specific as you go deeper. E.g. Home > Books > Computer & Internet > Web Development > Information Architecture.
  • Sitemaps – it is also important to ensure that both search engines and users can easily navigate through all of the content on your website. By creating a page on your website with links to all major sections of your site, you make it easier to navigate. You can also create an XML sitemap file using the Sitemap Protocol and provide the search engines with a comprehensive list of URLs for your website.
  • Subdomains vs. subfolders – another common question is whether subdomains or subfolders should be used to organize content. The answer is that it doesn’t really matter; choose the solution that best fits your technology stack and capabilities.
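For example, a minimal XML sitemap following the Sitemap Protocol looks like this (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-09-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/books/web-development/</loc>
    <lastmod>2008-08-15</lastmod>
  </url>
</urlset>
```

Only the <loc> element is required for each URL; <lastmod>, <changefreq>, and <priority> are optional hints to the crawler.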

Can you elaborate on the pros and cons of the <noscript> tag?

The <noscript> tag can be a great way to provide both users and search engines with some information about your page when JavaScript is not available. The contents of the noscript tag could provide a description of what the JavaScript element does, along with a link to a text-only version of the same content. For sophisticated AJAX applications, I would recommend using progressive enhancement techniques (like Hijax).
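As a sketch of how <noscript> and Hijax-style progressive enhancement fit together (the URLs and the fetchComments function are hypothetical):

```html
<!-- Without JavaScript, this is an ordinary link to a full page -->
<a id="load-comments" href="/article/123/comments">View comments</a>

<script type="text/javascript">
  // With JavaScript, the same link is enhanced into an AJAX call.
  // fetchComments is a placeholder for your own AJAX code.
  document.getElementById('load-comments').onclick = function () {
    fetchComments('/article/123/comments');
    return false; // cancel the normal navigation
  };
</script>

<noscript>
  <p><a href="/article/123/comments">Read the comments on a
  separate page</a>.</p>
</noscript>
```

Because the link’s href points at real content, users without JavaScript (and search engine crawlers) reach the same information as everyone else.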

– Nathan Buggia, Webmaster Center Team

Join the conversation

17 comments
  1. Anonymous

    Great session, I enjoyed every minute!

  2. Anonymous

    Beautifully done!  Thanks for the great stuff.

  3. Anonymous

    I assume that the presentation is saying to include only one h1 tag – not only one h1 tag, only one h2 tag, and only one h3 tag. Clarification on that would be nice, please.

  4. Anonymous

    Thank you for the information. Leupold found the information to be very useful.

  5. Anonymous

    Thanks for the information.

    I find designing XHTML and Web 2.0 very hard and complicated… anybody has any sites they can suggest where I can learn the basics?

  6. rickdej

    @Ian M – yes, you are correct. You should have 1 <h1> tag, and can have as many <h2>, <h3>, etc tags as you would like on the page.

  7. Anonymous

    I see… I didn’t know that the <h1> tag should be limited.  There is so much to learn in SEO.

  8. Anonymous

    Nice info!  I always espouse validating code and checking down-level experience as the first step in good SEO.  What’s often good for a cross-compatible user experience is good for SEO.

  9. Anonymous

    Hey Nathan

    Can you tell us when MSN will fix their fake-referer, so-called ‘anti-cloaking’ bot?

    It’s been quite some time now, and it is awfully annoying.

    Thanks!

  10. Anonymous

    Good work. :)  Thought I’d mention that the Spry library (Adobe) allows you to create a page using HTML data sets and then make it sexier with JS/ajax techniques. No alternate page required. Rawks.  (They’ve not publicized it much — but Greg and I show how to do it in CH6 of our book.)

    Ciao,

    Stef.

  11. Anonymous

    Great post…

    "Page should be readable without a stylesheet" – Amen… if I have to fix another site build within tables I’ll go crazy.

  12. Anonymous

    Yes, it’s always a good practice to validate websites after creation to check for syntax errors.

    You mentioned "Page should be readable without a stylesheet". I agree, but if a page doesn’t link to or import its related CSS file, it looks ugly and messy. Please, correct me if I’m wrong.

  13. Quality Directory

    I make sure all the sites I develop are validated and they have no syntax error.

  14. Anonymous

    I find it hard to validate a website not being a web designer, anyone want to assist.


  16. Anonymous

    Yes, there should be only one h1 tag, but you can place h2 and h3 tags many times. If you follow the hierarchy – first h1, then h2, and last h3 – it will be more beneficial.

Comments are closed.