SMX East 2008: Webmaster Guidelines

Updated: SMX has posted the video for the What is Spam session. 

One of the most common questions I get from companies concerned about search engine optimization (SEO) is which optimization tactics are acceptable to search engines and which ones are not. We pulled this session together with the help of Danny Sullivan and SMX to provide a definitive answer to that question and help clarify any misconceptions that the audience might have. This is the first of three posts we’ll be doing covering our presentations at SMX East last week.


Q: What is a "paid link" and is it okay to buy them?

At Live Search, we recommend against buying or selling paid links because they are unlikely to provide much return to your site. This is especially true compared with investing in new and unique content that your audience would find compelling.

A paid link can arise in a variety of scenarios in which one party pays another to place a link surreptitiously on their website, with the goal of causing a search engine to follow the paid link and increase the reputation of the destination page.

The issue isn’t simply that you are selling links on your website. The problem is that in many of these scenarios, people use paid links to try to manipulate our ranking algorithms, purchasing links that a search engine might be fooled into treating as genuine endorsements.

There is a wide variety of paid link types. The most straightforward are those listed under “Sponsored Links” sections on websites. Other types of paid links might include those appearing in articles where the author was paid to write a product review, or even in some online directories.

The way Live Search addresses this issue is by attempting to assess the value of every link we find. Essentially, we look at each link individually to understand the degree to which the site is really endorsing it. So, while we most likely will not ban your site for buying or selling a few links, those links may well end up providing no ranking value at all.

A simple way to fix this is to label all paid links on your website with the rel="nofollow" attribute, or to use the Robots Exclusion Protocol (REP) to block access to the destination URL. This tells search engines that you are linking to a page, but not necessarily endorsing it.
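For example, a sponsored link can be labeled like this (a minimal sketch; the URL and anchor text are placeholders):

```html
<!-- Sponsored link: rel="nofollow" tells crawlers not to
     treat this link as an editorial endorsement -->
<a href="http://example.com/sponsor" rel="nofollow">Our sponsor</a>
```

Alternatively, the owner of the destination site can add a `Disallow` rule for the paid landing page to their robots.txt file, which keeps crawlers from fetching the page at all.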

Q: Is cloaking an acceptable practice for search engine optimization?

Cloaking is another area of search engine optimization where we see a lot of confusion. First of all, cloaking is defined as showing one set of content to a search engine and a different set of content to a user. It is a tactic often used by spammers to manipulate what the search engine believes a page is about, and every major search engine’s webmaster guidelines strongly recommend against it. From a search engine’s standpoint, we want to ensure the pages we show our customers in the search results are the same ones we indexed, so we can guarantee the highest-quality results.

The bottom line is that there are no scenarios in which we would recommend cloaking as a good solution, although we do understand that there are some technical reasons people cloak pages that are not directly related to spam. The problem is that cloaking can set off automatic spam detection algorithms, which may result in parts of your site being penalized. As a search engine optimization practice, cloaking can actually be counterproductive.
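To make the definition concrete, here is a hypothetical sketch of what user-agent cloaking looks like on the server side. This is shown only to illustrate the pattern that automated spam detection flags, not as anything to implement; the crawler tokens and page content are placeholders:

```python
# Hypothetical illustration of user-agent cloaking -- the tactic search
# engines penalize. A cloaking site inspects who is asking and serves
# crawlers a different page than it serves human visitors.
CRAWLER_TOKENS = ("msnbot", "googlebot", "slurp")  # example crawler user agents

def select_content(user_agent: str) -> str:
    """Return different content depending on the requester -- this mismatch
    between the indexed page and the page users see is cloaking."""
    ua = user_agent.lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        # Keyword-stuffed page served only to search engine crawlers
        return "cheap widgets best widgets buy widgets widgets widgets"
    # Normal page served to everyone else
    return "Welcome to our widget store!"

# A crawler and a browser receive different pages -- exactly the kind of
# discrepancy that automated spam detection compares for.
crawler_view = select_content("msnbot/1.0 (+http://search.msn.com/msnbot.htm)")
visitor_view = select_content("Mozilla/5.0 (Windows; U; Windows NT 5.1)")
```

A crawler comparing `crawler_view` against `visitor_view` (for example, by re-fetching the page with a browser-like user agent) sees two different documents from the same URL, which is the signal that triggers a penalty.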

Q: What can you do if your website does get penalized?

The first thing you should do is verify that your site has in fact been penalized. To do this, log into our Webmaster Tools and go to the Site Summary page. From there, look for the Blocked: field in the right-hand column. If your site is blocked, this will show as Yes; otherwise it will show as No.

If your site is blocked, then it is time to go review our Webmaster Guidelines and check your site to see which one(s) you may have violated. If you have any questions about this step, please consult our online forums, or contact our technical support staff.

Once you’ve identified and resolved the issue(s), it is time to request that Live Search re-include your pages in its index. To do that, log back into the Webmaster Tools and click the hyperlinked Blocked: Yes in your Site Summary page. This will take you to a form where you can request reevaluation from our support team.

Thanks for all of your questions today! If you have any more, please leave them in the comments section and we’ll try and answer them as soon as possible.

— Nathan Buggia, Live Search Webmaster Team

Join the conversation

  1. Anonymous

    It seems the biggest issue is with directory sites and sites that only sell ad space for link backs. How do you distinguish between sites that are not performing "black hat" tactics and those that are just trying to find additional revenue streams from advertising?

    It would be nice to see these link bait and bogus websites disappear as they only hurt productivity and the value of results…. okay guess I gotta get back to work.

  2. rickdej

    @Web Design – Chicago, We can’t necessarily tell you everything we do to manage these types of sites, but we do have super-secret algorithms that identify and appropriately manage many millions of these sites every day.

  3. Anonymous

    First of all the webmaster account is a great addition. Thanks for that.

    It gives really great info a webmaster can capitalize on.

  4. Anonymous

    I fully agree with the first commenter that it would be nice if all the bogus sites could be filtered out. That would help all users reach their goals quicker, which is what you want from a search engine in the end.

    Thanks for the elaborate post.

  5. Anonymous

    The most important aspect in fighting spamdexing is fast response from search engine antispam teams to spam reports. There is a Brazilian site, for instance, which made extensive use of invisible text for more than 3 months before it was penalized, and only by Google, while Live Search and Yahoo kept it in their indexes. Less than 1 month later, it was back in Google’s index and, 2 months later, there it is on the first page, still using invisible text but, this time, on a cloaked page anyone can see in Google’s cached version!

    In the meanwhile, both Yahoo and Live Search did absolutely nothing to penalize this site.

    I mean, are you REALLY SERIOUS about antispam enforcement? What’s the use of all those "report spam" forms if it can take a lifetime to see a response? The company behind this site is in a very competitive and expensive market, and they are forcing their honest competitors to spend buckets of money on sponsored links while they rank high in organic results in all three major search engines.

    If they stay on the first page of Google, Yahoo and Live Search organic results for just one month, they will save many thousands of dollars. So, any delay in search engine penalization favours the dishonest over those who are trying to compete fairly!

  6. Anonymous

    Nice article! However, I would like to know how Live Search sees pay-for-review web directories, where payment for review doesn’t guarantee inclusion. Does Live Search see it as wrong the same way it doesn’t approve of pay-for-inclusion (paid) directories?

  7. Quality Directory

    I hear a lot about white hat and black hat SEO techniques. It's very important to me that I don't practice black hat SEO.

  8. Seattle SEO

    I've never tried any black hat tactics. I figure, why waste the time when the site will just get banned?

Comments are closed.