SMX East 2008: Unraveling URLs and Demystifying Domains

This is the second of three posts covering our presentations at SMX East last week. URLs are the foundation of the Internet. However, they can cause significant problems for search engines because of the number of synonyms that are often automatically created for each piece of content. If there were only one thing I wanted the audience to take away from this presentation, it was that they should always create short, descriptive URLs and redirect...
Read More
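
To make the redirect advice concrete, here is a minimal sketch of collapsing URL synonyms onto one canonical form with a permanent (301) redirect. The host name and normalization rules are hypothetical examples, not taken from the post.

```python
# Minimal sketch: collapse URL synonyms onto one canonical URL with a
# permanent (301) redirect. CANONICAL_HOST and the normalization rules
# below are hypothetical, not from the original post.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.example.com"  # hypothetical canonical host

class CanonicalRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Treat /About/, /about/ and /about/default.aspx as synonyms of
        # /about/ by lower-casing and stripping the default document name.
        path = self.path.lower()
        if path.endswith("default.aspx"):
            path = path[: -len("default.aspx")]
        if self.headers.get("Host") != CANONICAL_HOST or path != self.path:
            # A 301 tells search engines the synonym is a permanent
            # duplicate, so ranking signals consolidate on one URL.
            self.send_response(301)
            self.send_header("Location", "http://%s%s" % (CANONICAL_HOST, path))
            self.end_headers()
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"canonical page\n")

if __name__ == "__main__":
    HTTPServer(("", 8000), CanonicalRedirectHandler).serve_forever()
```

The same idea applies on any server platform: pick one canonical form per page and 301 every synonym to it.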

SMX East 2008: Webmaster Guidelines

Updated: SMX has posted the video for the "What is Spam" session. One of the most common questions I get from companies concerned about search engine optimization (SEO) is which optimization tactics are acceptable to search engines and which ones are not. We pulled this session together with the help of Danny Sullivan and SMX to provide a definitive answer to that question and to clear up any misconceptions the audience might have. This...
Read More

Is your robots.txt file on the clock?

Just recently, a strange problem came across my desk that I thought was worth sharing. A customer notified us that content from a site she was interested in was not showing up in our results. Wanting to understand why we may or may not have indexed the site, I took a look and stumbled upon an interesting but potentially very bad use of the robots.txt file. The first visit I made to the site had a very standard...
Read More
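
You can see what a crawler would make of a robots.txt file at a given moment with a short check like the sketch below, which uses Python's standard-library parser; the site URL and user-agent are placeholders.

```python
# Minimal sketch: fetch a site's robots.txt and check what a given
# crawler may fetch, the same way a polite bot would. The site URL and
# user-agent are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # placeholder site
rp.read()  # downloads and parses the file as it exists right now

for url in ("http://www.example.com/", "http://www.example.com/private/"):
    print(url, "allowed:", rp.can_fetch("msnbot", url))
```

Because rp.read() captures the file at a single instant, scheduling this check at different hours makes a robots.txt that changes "on the clock" easy to spot.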

What's broken in the Microsoft development stack?

Last week I was at SES San Jose in the “Working Collaboratively with Your IT Dept” session with panelists Greg, Matt, Chris, and Sage, where someone asked which platform they should use for maximum SEO benefit. The answer from the panel was a resounding “Nothing from Microsoft.” While I don’t entirely agree with this advice, I thought it would be a great catalyst for feedback from the community on what we can do to...
Read More

Find out how Live Search is crawling your site

My favorite feature of our recent launch is the Crawl Issues tool, which gives you details about issues Live Search may encounter while crawling and indexing your website. This information can help you better understand what Live Search sees when crawling your site and should ultimately help you improve your results from Live Search. We report four types of issues: File not found (404) errors – reported when Live Search encountered a...
Read More
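
Alongside the Crawl Issues reports, you can check for "file not found" (404) errors yourself before a crawler hits them. A minimal sketch, assuming a hypothetical list of URLs to verify:

```python
# Minimal sketch: pre-check your own URLs for "file not found" (404)
# errors, one of the issue types the Crawl Issues tool reports.
# The URL list is a hypothetical example.
import urllib.error
import urllib.request

urls_to_check = [
    "http://www.example.com/",
    "http://www.example.com/old-page.htm",
]

for url in urls_to_check:
    try:
        # HEAD avoids downloading the body; only the status code matters.
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            print(url, resp.status)
    except urllib.error.HTTPError as err:
        print(url, err.code)  # 404s and other HTTP errors land here
```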

Big update to Webmaster Center tools

Creating websites and publishing on the web keeps getting simpler, but as content proliferates it is becoming harder for webmasters and publishers to ensure that their content can be found. Last fall, when we launched the Live Search Webmaster Center in beta, the goal was to establish a long-term relationship with webmasters and help them achieve their goals by addressing the most common questions we hear, and help them...
Read More

Reach nearby customers with Live Search Local Listing Center

Searchers use Live Search Local to quickly find local businesses. As a business owner, you can update the information shown to searchers, including your business address, telephone number, and customer ratings, all with the click of a button. How do you find the Live Local Listing Center? Easy: go straight to http://llc.local.live.com or find the link on the Webmaster Center (http://webmaster.live.com) titled Business...
Read More

Robots Exclusion Protocol: joining together to provide better documentation

As a member of the Live Search Webmaster Team, I'm often asked by web publishers how they can control the way search engines access and display their content. The de facto standard for managing this is the Robots Exclusion Protocol (REP), introduced back in the early 1990s. Over the years, the REP has evolved to support more than "exclusion" directives; it now supports directives controlling what content gets included, how the...
Read More
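
As an illustration of directives beyond plain exclusion, the sketch below feeds a sample robots.txt with Allow, Crawl-delay, and Sitemap lines to Python's standard-library parser (crawl_delay and site_maps need Python 3.6 and 3.8 respectively); the file contents are invented for the example.

```python
# Minimal sketch: REP directives beyond plain exclusion (Allow,
# Crawl-delay, Sitemap), parsed with Python's standard library. The
# file contents are invented. The Allow line comes before the broader
# Disallow because this parser applies rules in file order.
from urllib.robotparser import RobotFileParser

sample = """\
User-agent: *
Allow: /private/public-report.html
Disallow: /private/
Crawl-delay: 10
Sitemap: http://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(sample.splitlines())

print(rp.can_fetch("*", "http://www.example.com/private/"))  # False
print(rp.can_fetch("*",
      "http://www.example.com/private/public-report.html"))  # True
print(rp.crawl_delay("*"))  # 10
print(rp.site_maps())       # ['http://www.example.com/sitemap.xml']
```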

More crawling improvements from MSNBot

A few months ago we announced two new features for MSNBot to reduce the burden of crawling on your website. These were part of a series of improvements we’re making to our crawler this spring to increase the freshness and breadth of content in our index. As part of these latest improvements, you may notice an increase in traffic from MSNBot over the next couple of weeks. If you notice any issues with MSNBot, please...
Read More
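
If you do see suspicious traffic identifying itself as MSNBot, one common verification technique is a reverse-DNS lookup with a forward confirmation. A minimal sketch, assuming MSNBot hosts reverse-resolve to names ending in search.msn.com, an assumption you should confirm against official documentation:

```python
# Minimal sketch: verify that traffic claiming to be MSNBot really is
# MSNBot using reverse DNS plus a forward confirmation. The expected
# hostname suffix is an assumption; confirm it against official docs.
import socket

EXPECTED_SUFFIX = ".search.msn.com"  # assumed MSNBot hostname suffix

def is_genuine_msnbot(ip: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except socket.herror:
        return False
    if not host.endswith(EXPECTED_SUFFIX):
        return False
    try:
        # Forward-confirm: the name must resolve back to the same IP,
        # otherwise the reverse record could be spoofed.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```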

Microsoft to support cross-domain Sitemaps

Today we’re pleased to announce an update to the Sitemaps Protocol, in collaboration with Google and Yahoo! This update should help many new sites adopt the protocol by increasing our flexibility on where Sitemaps are hosted. Essentially, the change allows webmasters to store their Sitemap files just about anywhere, using a reference in the robots.txt file to establish a trusted relationship between the Sitemap file and the domain or...
Read More
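
The trust mechanism is the Sitemap line in robots.txt: the file on the content domain points at a Sitemap hosted anywhere else. A minimal sketch, with hypothetical host names, that shows such a file and reads the cross-domain reference back with Python's standard-library parser (site_maps needs Python 3.8+):

```python
# Minimal sketch of the cross-domain arrangement: robots.txt on the
# content domain references a Sitemap hosted on another domain, which
# establishes the trust relationship. Host names are hypothetical.
from urllib.robotparser import RobotFileParser

# robots.txt as served from http://www.example.com/robots.txt
robots_txt = """\
User-agent: *
Disallow:

Sitemap: http://sitemap-host.example.net/example-com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.site_maps())
# ['http://sitemap-host.example.net/example-com/sitemap.xml']
```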