Cool tips and hot tricks for the new Bing Webmaster Tools, Part 1

I trust by now everyone has heard the big news: the new Bing Webmaster Tools are here! I have been working with them for quite some time now, and I like the new look and feel of the Silverlight-based user interface. It is clean, intuitive, and easy to use.

As the primary contributor to the tool’s documentation, I made some interesting discoveries while the tools were being developed, and I thought I’d take this opportunity to highlight a few notable features and share some cool tips and tricks that I’ve learned. If you are already a registered user, sign in to Bing Webmaster Tools and follow along. If you haven’t signed up yet, what’s holding you back?

Note: The features I discuss here are based on the Silverlight version of the Bing Webmaster Tools. While a basic, down-level version of the tools is available, the Silverlight version offers greatly enhanced functionality and is the recommended way to take full advantage of the tools’ rich index data offerings and advanced features for your website.

Site registration

Bing Webmaster Tools allows you to register sites based on the specific site branch you manage. If you own the site’s domain, register the site at the root. But if you work for a very large website and manage only a subsection of it, you can register just that subdomain or subdirectory. This feature enables enterprise site managers to delegate site management responsibilities through the tools to specific people or teams, ensuring that any changes made in crawler settings affect only the portion of the site they manage. To make this work, just place the Bing-provided ownership verification code at the registered location (the registered root) and you’ll be good to go!

A great benefit of registering a site with Bing Webmaster Tools is that we guarantee the default page of any site you register will be added to the index and will always remain in the index. That alone is reason enough for many smaller sites to sign up!

Note: For sites that are newly registered with Bing Webmaster Tools, there may not be any data available in the charts immediately after registration. The tools’ servers need to extract the initial data from the index servers and begin the data collection effort for your site. If you initially see a “Not available” message in some data fields, rest assured that you’ll start seeing data there within approximately 72 hours. If after that time you still have any questions or concerns, feel free to post a question with details of your situation to the Webmaster Tools & Feature Requests forum. A member of the Webmaster Center team will look into the matter for you.

Verification code

To use the Bing Webmaster Tools, you need to verify that you own the registered website by adding a Bing-provided ownership verification code, which Bing supplies during the Add Site process. If you choose the downloadable XML file method:

  1. In the Verify Ownership dialog box, click Option 1, and then click BingSiteAuth.xml in Step 1 to download the file from Bing. Leave the dialog box open.
  2. Upload the XML file to the registered root of your site (please see the previous tip for what is meant by registered root!).
  3. Back in the open Verify Ownership dialog box, test whether Bing sees your newly uploaded verification file: click the link to its expected location listed in Step 3 of Option 1. If the XML file appears in your browser, you’re golden.

Veteran users of our webmaster tools should note that sites previously registered with Bing Webmaster Center using their original verification codes (placed either in a <meta> tag on the default page or in the LiveSearchSiteAuth.xml file) are still supported, so no change is required to use the new tools today. However, the new tool issues new verification codes for newly added sites, and the XML-based ownership verification file has been renamed to BingSiteAuth.xml. The new <meta> tag code or verification file must be used for all new sites, even with existing accounts.

To keep things working smoothly, don’t mix and match: never place your new verification code in an old XML file or <meta> tag, or an old code in a new one. Existing sites should keep using the codes they were previously assigned, and newly added sites should use the new code (which is the same for every new site you add to your account going forward). If you want an existing site to switch over to the new code, remove the site on the Home page and then re-add it; Bing treats the re-added site as a new registration and assigns it the new code.
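For reference, here is roughly what the two verification options look like. The code value shown is a made-up placeholder (use the one Bing assigns to your account in the Add Site process), and the XML layout is a representative sketch rather than the authoritative format:

```xml
<?xml version="1.0"?>
<!-- BingSiteAuth.xml: upload this file to the registered root of your site.
     The code below is a placeholder; use the code Bing provides you. -->
<users>
  <user>0123456789ABCDEF0123456789ABCDEF</user>
</users>
```

The <meta> tag alternative places the same code in the <head> of your default page, in a tag such as `<meta name="msvalidate.01" content="0123456789ABCDEF0123456789ABCDEF" />`.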

Working with charts

Silverlight enables us to bring you several new charts of useful information. But these charts aren’t just static images; you can interact with them as well. The charts in the Crawl Summary, Index Summary, and Traffic tools all offer x-axis slider controls so you can adjust both the time span and the portion of the timeline shown. Clicking and dragging a chart’s sliders lets you zoom in or out to the granularity you want to see. As Bing collects more data over time for registered sites, individual sites will have access to up to six months of historical data.

And while you’re at it, note that as you pause your mouse pointer over the data lines in any of the charts (including those in the Dashboard tool), individual points in time are shown and the related data details are revealed in the chart’s upper left corner.

Select items in a list

Several of the tools contain lists that you can add items to or remove items from, including registered sites in the Home tool, submitted Sitemaps in the Sitemaps tool, and URL blocks in the Block URLs tool. To remove an item from one of these lists, just pause your mouse pointer over the line you want to delete to reveal a check box on the left side of the line. The check boxes enable you to perform actions on multiple items at once: select the check boxes of the list items to delete, and then click the tool’s version of a Remove button.

Dig into Index Explorer

The Index Explorer tool on the Index tab is one of the key components of the new Bing Webmaster Tools. Let’s check out some of the cool things you may not realize you can do with your indexed URLs using this tool:

  • Copy them. Index Explorer enables you to examine just the portion of your site’s indexed content you want to see by changing the base URL in the Directory box. But instead of manually typing out a potentially long URL, copy and paste it! Select the URL you want copied, and then right-click it to reveal the Copy URL option. For keyboard fans, you can also use the tried-and-true CTRL+C. Once copied, paste the copied URL in the Directory box with CTRL+V.
  • Recrawl them. If you just updated an indexed page and want it recrawled ASAP, find the page in Index Explorer, and then click it to display the Page Details dialog box. Click Recrawl URL to request an expedited recrawl of the page to update the index.

    Note: While this feature is really cool, it’s not limitless. Bing does set quota limits on the number of URLs that can be submitted. You can submit up to 10 URLs per day and up to 50 per month. The quotas are in effect for entire domains. Multiple webmaster accounts for the same domain cannot combine their quotas to get more URL submissions. Bing also disallows submitting redirected pages to a site already at full quota.

  • Block them. Do you have indexed content you want to stop from showing up in the Bing search engine results pages (SERPs)? Set up a block. To do so, find the content in Index Explorer and then click it. In the resulting dialog box, you can either block the content from showing up in the search cache (click Block cache to block the appearance of the Cached page link) or block all references to the content (click Block URL and Cache to block both the Cached page link and the URL’s blue link) from appearing in the Bing SERPs. You can set up blocks for individual pages, directories, or even the entire site. (You can later unblock a URL as well, but you’ll need to go to the Block URLs tool on the Index tab to do that.)
  • Filter them. One of the coolest things you can do with Index Explorer is apply one or more filters simultaneously to narrow down the list of indexed content to just those matching specific index data conditions, such as HTTP codes received, crawl dates, and much, much more. That said, while you can look up crawler-detected problems in Index Explorer, it’s even easier to do with the Crawl Details tool on the Crawl tab.
  • Date them. Want to see when a specific page in the index was last crawled, what its HTTP code was, or how big the file is? Find the page in Index Explorer and then click it to display the Page Details dialog box.

By the way, if your filtered search results in a “No data available” message, clear away one or more of the filters and click Apply filters again. Double-check what’s listed in the Directory box as well, as you may not be searching from the registered root (fix that quickly by clicking Reset filters).

I’m just getting started here. I’ll have more cool tips and hot tricks for the new Bing Webmaster Tools next time around. If you have any questions, comments, or suggestions, feel free to post them in our Webmaster Tools & Feature Requests forum. Stay tuned!

— Rick DeJarnette, Bing Webmaster Center

Join the conversation

  1. GerryWilliams

    I had a challenge that I do not understand.  For all of my sites except one the use of the meta: <meta name="msvalidate.01"… /> code worked.  For the site in question I had to use the xml file solution.  I am concerned that there may be a fundamental organic problem with this site with respect to spidering.  The site is ….

    any and all suggestions are welcomed.

  2. hydn79

    how do I add a sitemap?

  3. Slava Makhotkin

    It is ridiculous. They've used Microsoft's Silverlight for UI and Microsoft's IE8 just crashes at :)

    Firefox works fine, but man, they should really hire any UI-expert. Why on earth "Queries" table in Traffic tab is some small-only 4 rows in height! And if – surprise! – you have more, than 10 queries, you just spend all day scrolling this tiny table up and down. Was it too difficult to add a resize button at least?

  4. raphael

    Thanks for providing more detailed data and new features for webmasters.

    Why is Bing crawling and indexing URL restricted in robots.txt, with noindex meta tag and without external links pointing to them?

  5. Mike Thomas

    Such a nice article, i really appreciate the explanation of bing webmaster tool. I also use it and it's valuable resoures for me.


  6. DisgruntledGoat

    Interesting how you said "I like the new look…" and omitted the fact that none of the users like it :/

  7. simaan_huda

    How do you add a sitemap now? I want to update my sitemap but don't see the link anymore.

  8. rickdej

    I just updated the Verification code section of the post to clarify the issues surrounding the new verification codes and previously registered sites in our webmaster tools. Please review that revised content for questions pertaining to that issue. And thanks for the input!

    As for how to submit Sitemaps (which I will cover in the upcoming Part 2 of this post), the quick answer is to log in to the tools, click a registered site in the Home tool, click the Crawl tab, click Sitemaps in the left nav pane, click Add Sitemap, enter the Sitemap URL, and then click Submit.
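In case it helps, the Sitemap file you submit follows the sitemaps.org protocol; a minimal one looks like this (the URL below is just a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```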

    Thanks for writing, folks!

    Rick DeJarnette

    Bing Webmaster Center team

  9. rickdej


Bing does not crawl URLs blocked by robots.txt, and it does not display URLs that carry the noindex <meta> tag. Pages that are excluded from crawling may still be in our index based on content we retrieved before the robots.txt block was implemented and on links gathered from other sources. These are the pages you can see by using the “Excluded by robots.txt” filter in Index Explorer within the Bing Webmaster Tools. However, sometimes pages that are indexed and then later excluded from crawling via robots.txt and simultaneously blocked from indexing via the noindex <meta> tag remain in the index. This is because we cannot crawl the page to see the noindex <meta> tag that would tell us to remove it.

    In the case with your site, rduhayon, URLs that were previously not blocked are temporarily redirected (302) to another page that contains a noindex <meta> tag, but because that redirected target page is excluded from crawl by robots.txt, the noindex <meta> tag cannot be read. Either unblock the redirected target page and make the redirect permanent (301) or put the noindex <meta> tag directly in the pages to be blocked. Either solution will resolve this situation for you.
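To illustrate the second fix with a minimal sketch: the noindex tag goes directly into each page you want removed, and that page must stay crawlable (not disallowed in robots.txt) so the crawler can actually read the tag.

```html
<!-- Place in the <head> of each page to be removed from the index.
     The page itself must NOT be blocked by robots.txt, or the
     crawler can never see this tag and the URL stays indexed. -->
<meta name="robots" content="noindex" />
```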

    Thanks for writing! (And thanks to the engineers in the Bing Webmaster Center team who helped in this response!)

    Rick DeJarnette

    Bing Webmaster Center team

  10. cellhub

    Well thanks for this information.

  11. george12345

    Can anyone explain to me what is the use of the graphs.

    I can see my site is being crawled but why is there no information about which pages have been crawled or indexed.

    I can't see how any of these graphs help me to improve the visibility of my website.

    Looks pretty but seems to be a complete waste of time with information on backlinks, information on which pages have been indexed, dates for when each page listed in my sitemap was last crawled.

    I just don't understand why microsoft release software before it is ready. Do you like annoying your customers? Perhaps with bing webmaster tools you realize that we all have to use it whether we like ti or not and so you just don't care. No matter how pathetic it is we all know that there are a lot of people who don't know how to change their default search setting to the much more useful google or yahoo searches and therefore will use bing blissfully unaware that it crawls websites more slowly and therefore that the search results will not be as up-to-date.

    Very poor and given the hype hugely disappointing new release.

  12. GerryWilliams

    This is as clear as mud; 'swap' , 'removing them and re-adding them' (net effect is null) … English

    "To keep things working smoothly, do not swap the verification codes between the old and new XML files or <meta> tags. If you must update all of your registered sites with the new verification code, you can force your old sites to use the new code by removing them and re-adding them on the Home page. Just remember, existing sites should use the codes they were previously assigned, and newly added sites should use the new code. The new code will be consistent for all new sites you add to your account going forward."

  13. Quality Directory

    Ye, the new Tools is great improvement of the previous one, but I think the "backlink" feature needs to be added back. It is one of the mostly used features by webmasters.

    And the Webmaster Tools load very slowly because of heavy use of Silverlight.

  14. GerryWilliams

    re: my comment on .. meta: <meta name="msvalidate.01"… /> code

    Alan Mosley sent me a suggestion (he noted an HTML tag location problem).  I fixed the page but it appears that there is no way to re-authenticate a site without deleting it and submitting it again.  If I do that then I'll likely loose all of my accumulated data.  So, dear friends, I'll likely never know if Alan's suggestion fixed the authentication problem but I do know that my code is now cleaner.  Many thanks to Alan.…/default.aspx

  15. blogs98

    Bing bring a very nice thing, this could be beat the google.

  16. revotrad


      i've just launched a website so i've a few questions for you starting from first

    1 which verification process should i follow meta tag or other one

    2 How long does it will take to get listed in search engine.

  17. krwetatnt

    good service

  18. Puneet

    Very interesting information. how can i index whole pages of my site on bing?

  19. brad_ranks

    I just launched a site and I had set the sitemap up prior to publishing the product to the site and while it's great that bing crawled it quickly, it has never returned to reindex the sitmap or crawl any significant amount of pages. I have been religiously submitting pages at the throttled rate of 10 per day, but I would love to have a resubmit on the sitemap, instead of just a resubmit on the pages that have already been indexed, like the "contact us page and about us page… those don't make me any money   😉

    Please let me know if there is a way yo get into the recrawl queue.


  20. fish_raphaella

    excellent information.

  21. webdesignhouston

    Bing is best and great search engine…………i like Bing…

  22. IT-Blog

    Great that bing also have developed a Webmaster tools :-)

  23. m1chaelH

    Nice Functions, but why do you use Silverlight?

    No alternatives?

    Anyway, thanks for the Toolbox.


  24. David biedun
  25. sohel

    Amazingly insightful article. If only it was as easy to implement some of the solutions as it was to read and nod my head at each of your points
