Microsoft to support cross-domain Sitemaps

Today we’re pleased to announce an update to the Sitemaps protocol, developed in collaboration with Google and Yahoo!. This update should help many more sites adopt the protocol by increasing flexibility in where Sitemap files can be hosted.

Essentially, the change allows webmasters to store their Sitemap files just about anywhere, using a reference in the robots.txt file to establish a trusted relationship between the Sitemap file and the domain or folder it describes.

Here’s how it works: say you run a web site with a number of subdomains and, due to a technical requirement, you would like to host all of your Sitemaps in a single location. Until now the protocol did not support this scenario; each Sitemap had to be hosted directly under the domain it described. This update introduces support for it, with the simple requirement that you include a reference to the Sitemap in the robots.txt file of the domain being described. For example, that robots.txt file would need to include a line like this:
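The specific example domains did not survive in this copy of the post, so as an illustrative sketch (all host names below are placeholders), a robots.txt file on the described domain could reference a Sitemap hosted elsewhere like so:

```text
# http://www.example.com/robots.txt  (placeholder host names)
User-agent: *
Sitemap: http://sitemaps.example.com/sitemap-www.xml
```

The `Sitemap:` directive is the trust link: because it appears in the robots.txt of `www.example.com`, the crawler accepts that the externally hosted file is authorized to describe that domain.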


The catch is that all the URLs in the Sitemap file need to be within the same domain as the robots.txt file that references it. Note that this applies equally to Sitemap index files and to compressed files.
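As a rough sketch of that same-domain rule (this helper and the host-only comparison are my illustration, not the crawler's actual logic), the check could look like:

```python
from urllib.parse import urlparse

def urls_match_robots_host(robots_txt_url: str, sitemap_urls: list) -> bool:
    """Return True only if every URL listed in the Sitemap shares the
    host of the robots.txt file that references the Sitemap. The Sitemap
    file itself may live anywhere; only the listed URLs are constrained."""
    robots_host = urlparse(robots_txt_url).netloc.lower()
    return all(urlparse(u).netloc.lower() == robots_host for u in sitemap_urls)

# Hypothetical example hosts:
ok = urls_match_robots_host(
    "http://www.example.com/robots.txt",
    ["http://www.example.com/a.html", "http://www.example.com/b.html"],
)
rejected = urls_match_robots_host(
    "http://www.example.com/robots.txt",
    ["http://other.example.org/a.html"],
)
```

Here `ok` is `True` and `rejected` is `False`: a URL on a different host than the referencing robots.txt fails the trust check.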

Here are a few other useful notes about our implementation:

  • We support multiple “Sitemap:” references in your robots.txt file.
  • We recommend you keep your robots.txt file under 1 MB.
  • If multiple Sitemaps for a domain include the same URL with conflicting metadata (e.g. priority, change frequency), we will disregard the metadata and just use the URL.
  • Individual Sitemap files should never be larger than 10 MB when uncompressed. This applies to all Sitemap file formats: XML, RSS, and text.
  • You can submit your Sitemap through our Webmaster Center tools.
  • You can ping us with updates to your Sitemap using our Ping URL: [your sitemap web address]
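The Ping URL itself did not survive in this copy of the post, so the endpoint below is a placeholder; but the general pattern is to URL-encode your Sitemap address and append it to the ping endpoint. A minimal sketch:

```python
from urllib.parse import quote
from urllib.request import urlopen

# Placeholder: substitute the real Ping URL from the Webmaster Center docs.
PING_ENDPOINT = "http://search-engine.example/ping?sitemap="

def build_ping_url(sitemap_url: str) -> str:
    """URL-encode the sitemap address and append it to the ping endpoint."""
    return PING_ENDPOINT + quote(sitemap_url, safe="")

ping_url = build_ping_url("http://www.example.com/sitemap.xml")
# urlopen(ping_url)  # would issue the HTTP GET in a real run
```

Encoding the sitemap address (rather than pasting it raw) keeps its `://` and `/` characters from being misread as part of the ping URL's own path.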

This change comes directly from feedback we received from webmasters. Thank you for helping us improve our product! If you have any additional feedback or questions, please check out our Sitemaps Discussion forum.

–Fabrice Canel, Program Manager, Live Search Crawler

Join the conversation

  1. Anonymous

    Finally, good news in the sitemap world! As it stands, the lack of standardization makes webmasters’ daily lives a big mess. Thank God you finally addressed this issue seriously.

  2. Anonymous

    I agree with the author above; it is always a hassle to get sitemaps working for all the different search engines. I am looking forward to this new standard.

  3. Anonymous

    Great news. This was a serious issue that Microsoft had to face.

  4. Anonymous

    Microsoft: Thanks for submitting your sitemap.

    Great job! Very convenient! Thanks!

  5. Anonymous

    Great news. I think Live will gain a better market share in the future, and more people will use Live Search.

  6. Anonymous

    I have a question I’d like to ask.

    My site has over 200,000 pages, so I divided it into ten XML Sitemaps, but in Live Webmaster I can only upload one Sitemap.

    In this case, should I make a Sitemap index file?

    And when I upload my Sitemap to Live Webmaster, should I only upload the index?

  7. Anonymous

    Great improvement, thanks. This has made webmasters’ lives a lot easier.

  8. Anonymous

    Oh, I think this is the best way to submit Sitemap and robots.txt files.

    Thanks so much! It saves me time and memory.

    Great! Thanks again!

  9. Anonymous

    Thanks for supporting multiple Sitemaps! My site has over 15,000 pages.

  10. Anonymous

    Excellent improvements, good to see the older bugs fixed

  11. Anonymous

    Thanks for supporting multiple Sitemaps! My site has over 5,000 URLs.

  12. Anonymous

    Great improvements; I hope to see more advanced functions.

  13. Anonymous

    It’s amazing for webmasters.

  14. Anonymous

    My site has been running since 2007 and now gets over 5,000 visits per day.

  15. Anonymous

    I just have to congratulate Live Search!

    It’s as good as Google Webmaster Tools, with so many great functions.

  16. Anonymous

    I create an individual Sitemap for every subdomain the old-fashioned way, and I have no need for this update at present. However, it’s a great enhancement.

  17. Anonymous

    This improvement is good. I’m glad I can now host all my sitemaps in one directory, unlike before.

  18. Quality Directory

    I implement my sitemaps the old-fashioned way. I don't need to change to this new protocol.

  19. Anonymous

    I think this is the best way to submit Sitemap and robots.txt files.

  20. Anonymous

    Congratulations, everyone. Microsoft did a very nice job with Bing. Thanks. Best regards.


Comments are closed.