A few months ago we announced two new features for MSNBot to reduce the burden of crawling on your website. These were part of a series of improvements we're making to our crawler this spring to increase the freshness and breadth of content in our index. As part of these latest improvements, you may notice an increase in the amount of traffic from MSNBot over the next couple of weeks. If you notice any issues with MSNBot, please drop us a note on our Crawling Feedback & Discussion Forum so we can investigate.
This is a great time to take a look at your robots.txt file (and meta tags) to make sure that you are not inadvertently blocking robots from content on your site that you want indexed. Also, if you feel that MSNBot is crawling your site too frequently, you can use the crawl-delay directive in robots.txt. Please refer to the MSNBot support page for more information. Here are a few recommended settings:
Slow (wait 5 seconds between each request)
Really Slow (wait 10 seconds between each request)
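In robots.txt, these settings would look something like the following (a sketch, assuming MSNBot's user-agent token is msnbot and that the Crawl-delay value is interpreted as the number of seconds to wait between requests):

```
# Slow: wait 5 seconds between each request
User-agent: msnbot
Crawl-delay: 5
```

```
# Really Slow: wait 10 seconds between each request
User-agent: msnbot
Crawl-delay: 10
```

Only one of these entries should appear in your robots.txt file; pick the delay that matches the load your servers can handle.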
Note that setting the crawl delay reduces the load on your servers, but it also increases the amount of time it will take MSNBot to index your website (proportional to the length of the delay), and it may make it more difficult for your customers to find your site on Live Search.
Another great way to reduce the impact of MSNBot on your website is to enable HTTP Conditional GET and HTTP Compression as outlined in our prior blog post.
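To illustrate the idea behind those two features, here is a minimal, hypothetical server-side sketch in Python (not MSNBot's or IIS's actual implementation; the `build_response` function and its parameters are invented for illustration). A conditional GET lets a crawler that already has a fresh copy receive a 304 Not Modified with no body, and compression shrinks the bytes sent when the full body is needed:

```python
import gzip
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone

def build_response(request_headers, body, last_modified):
    """Hypothetical sketch of honoring conditional GET and HTTP compression.

    request_headers: dict of incoming HTTP request headers
    body: the resource as bytes
    last_modified: timezone-aware datetime of the resource's last change
    Returns a (status_code, response_headers, payload) tuple.
    """
    headers = {"Last-Modified": format_datetime(last_modified, usegmt=True)}

    # Conditional GET: if the crawler's cached copy is still current,
    # answer 304 Not Modified and send no body at all.
    ims = request_headers.get("If-Modified-Since")
    if ims:
        try:
            if last_modified <= parsedate_to_datetime(ims):
                return 304, headers, b""
        except (TypeError, ValueError):
            pass  # unparseable date: fall through and send the full body

    # HTTP compression: gzip the body when the client advertises support.
    if "gzip" in request_headers.get("Accept-Encoding", ""):
        headers["Content-Encoding"] = "gzip"
        body = gzip.compress(body)

    return 200, headers, body
```

Together these cut both request volume (via 304s) and bytes per response (via gzip), which is why they complement crawl-delay rather than replace it.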
–Nathan Buggia, Live Search Webmaster Center