Is your robots.txt file on the clock?

Just recently a strange problem came across my desk that I thought was worth sharing with you. A customer notified us that content from a site she was interested in was not showing up in our results. Wanting to understand why we may or may not have indexed the site, I took a look and stumbled upon an interesting, but potentially very bad, use of the robots.txt file. The first visit I made to the site had a very standard...
Read More
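To make the risk concrete, here is a sketch (not the actual file from that site) of a standard, permissive robots.txt next to one that shuts out every crawler. If, as the title hints, a server swaps between files like these on a schedule, the site can silently drop out of the index whenever crawlers happen to visit during the blocked hours.

    # A typical, permissive robots.txt (the path is a hypothetical example)
    User-agent: *
    Disallow: /private/

    # A robots.txt that blocks all crawlers from the entire site
    User-agent: *
    Disallow: /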

What's broken in the Microsoft development stack?

Last week I was at SES San Jose in the “Working Collaboratively with Your IT Dept” session with panelists Greg, Matt, Chris, and Sage, where someone asked which platform they should use for maximum SEO benefit. The answer from the panel was a resounding “Nothing from Microsoft.” While I don’t entirely agree with this advice, I thought it would be a great catalyst for feedback from the community on what we can do to...
Read More

Diagnose SEO Issues - SES San Jose Presentation

Thank you all for coming to our presentation at SES San Jose. We’ve posted the deck, Diagnose SEO Issues with Live Search Webmaster Tools, online for folks who are interested. If you have any questions, please post them to our forums. — Nathan Buggia, Webmaster Team
Read More

See you in San Jose!

As summer winds down and kids begin to dread the sound of school bells ringing, there is one more chance for a little getaway: Search Engine Strategies, San Jose 2008. While we expect the California sun will be out and there will be plenty of margaritas to go around, there will also be some really great sessions and speakers that those of you in search marketing won’t want to miss. So if you’re planning to be...
Read More

Making backlinks actionable again

In 2007, we shut off the linkdomain attribute in the advanced query syntax, but promised to make link data available to you as soon as possible. Last fall when we launched the initial beta of the Live Search Webmaster Center, we offered a limited look into backlink data. But we soon realized that for you to be successful, you really need more and better backlink data. That’s why we’re really excited about the updates that we have...
Read More
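For readers who never used it, linkdomain restricted results to pages linking anywhere on a given domain. A sketch of the syntax as it was commonly written before the operator was shut off (example.com is a placeholder, and exact behavior on Live Search may have differed):

    linkdomain:example.com
    linkdomain:example.com -site:example.com

The second form adds a negative site filter so the domain's own internal links are dropped, approximating an external-backlink report.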

Find out how Live Search is crawling your site

My favorite feature of our recent launch is the Crawl Issues tool, which gives you details about issues Live Search may encounter while crawling and indexing your website. This information can help you better understand what Live Search sees when crawling your site and should ultimately help you improve your results from Live Search. We report four types of issues: File not found (404) errors – reported when Live Search encountered a...
Read More
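The Crawl Issues tool surfaces these errors in bulk, but it can be handy to spot-check a single URL the way a crawler would see it. Here is a minimal Python sketch, not part of the Webmaster Center itself, that reports the HTTP status code for a URL (the address below is a placeholder):

    import urllib.request
    from urllib.error import HTTPError

    def check_status(url):
        """Fetch a URL and return the HTTP status code the server answers with."""
        try:
            with urllib.request.urlopen(url) as response:
                return response.status
        except HTTPError as err:
            # 4xx and 5xx responses raise HTTPError; the code is what we want
            return err.code

    # A 404 result means a crawler would hit "file not found" at this address
    print(check_status("http://example.com/some-missing-page"))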

Big update to Webmaster Center tools

Creating websites and publishing on the web is getting simpler all the time, but with more content on the web it is becoming harder for webmasters and publishers to ensure that their content can be found. Last fall when we launched the Live Search Webmaster Center in beta, the goal was to establish a long-term relationship with webmasters and help them achieve their goals by addressing the most common questions we hear, and help them...
Read More

Reach nearby customers with Live Search Local Listing Center

Searchers use Live Search Local to quickly find local businesses. As the owner of a business, you have the ability to update the information shown to searchers, including your business address, telephone number, and customer ratings, all with the click of a button. How do you find the Live Local Listing Center? Easy: go straight to http://llc.local.live.com or find the link on the Webmaster Center (http://webmaster.live.com) titled Business...
Read More

Robots Exclusion Protocol: joining together to provide better documentation

As a member of the Live Search Webmaster Team, I'm often asked by web publishers how they can control the way search engines access and display their content. The de facto standard for managing this is the Robots Exclusion Protocol (REP), introduced back in the early 1990s. Over the years, the REP has evolved to support more than "exclusion" directives; it now supports directives controlling what content gets included, how the...
Read More
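Since the post alludes to directives beyond simple exclusion, a quick orientation may help: site-wide crawl rules live in robots.txt, while per-page inclusion and display hints live in a meta tag. A sketch of widely supported directives, with placeholder paths and URL:

    # robots.txt: site-wide crawl rules
    User-agent: *
    Disallow: /drafts/
    Sitemap: http://example.com/sitemap.xml

    <!-- in a page's <head>: per-page index and link-following control -->
    <meta name="robots" content="noindex, nofollow">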

Increasing customer engagement with custom 404 error pages

If you’re the owner of a large website with lots of content, you’ve probably noticed that up to 10% of your traffic ends up on a “webpage not found” error page due to broken links or misspelled URLs. There are a lot of reasons users who visit your site might reach a 404 page, but how do you keep those customers from abandoning your site? Today we’re announcing the Web Page Error Toolkit, a customizable web...
Read More
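The announcement doesn't spell out the toolkit's configuration here, but for orientation, the usual mechanism on an ASP.NET site is a customErrors section in web.config that routes "page not found" responses to a friendly page. A minimal sketch with placeholder page paths, not the toolkit's actual settings:

    <!-- web.config: send 404s to a custom page instead of the default error -->
    <configuration>
      <system.web>
        <customErrors mode="On" defaultRedirect="~/Error.aspx">
          <error statusCode="404" redirect="~/NotFound.aspx" />
        </customErrors>
      </system.web>
    </configuration>

One design note: a helpful error page should still return the 404 status code itself, otherwise search engines may index the error page as ordinary content.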