It wasn’t that long ago that I discussed in this blog how to create good content that will get your site noticed by both end users and search engines. But to be clear, just writing some slick text is not the whole story. The previous blog articles in the Site Architecture and SEO series (files/pages and links/URLs) made reference to doing what you can to help the search engine web crawler (also known as a robot or, more simply, a bot) crawl...
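One of the most direct ways to speak to a bot is a robots.txt file at the root of your site. As a minimal sketch (the paths and sitemap URL below are hypothetical examples, not taken from the series):

```
User-agent: *
Disallow: /search-results/
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml
```

The first two rules keep crawlers out of low-value URL spaces, while the Sitemap line points them toward the pages you do want indexed.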
Links can be the lifeblood of a good website, as we discussed in Part 1 and Part 2 of Links: the good, the bad, and the ugly. But how well you manage them on your site from a site architecture perspective can be the difference between your website being starved for oxygen (aka search engine referral traffic) versus healthy and thriving. That’s why we do search engine optimization (SEO).
This article is part 2 of the recent Site Architecture...
Search engine optimization (SEO) has three fundamental pillars upon which successful optimization campaigns are run. Like a three-legged stool, take one away, and the whole thing fails to work. The SEO pillars include: content (which we initially discussed in Are you content with your content?), links (which we covered in Links: the good, the bad, and the ugly, Part 1 and Part 2), and last but not least, site architecture. You can have great...
In my previous post on developing your keyword list, we discussed techniques and considerations for developing a list of relevant keywords and key phrases for the pages on your website that are highly specific and (hopefully) not overly competitive. All of this effort is geared toward making your website stand out from the competition in search engine results pages (SERPs).
But developing a list of great keywords is just the start. You now...
One of the best parts of publishing online is that, on the Web, anyone can have a worldwide reach. But while going global is easy on the Internet, ensuring that the content you produce will be found by the right audience can be a real challenge. Search engines can have trouble understanding geotargeting because of a few technical limitations. These include:
Search engines may not be crawling your site from the location of your customers...
One of the most common challenges search engines run into when indexing a website is identifying and consolidating duplicate pages. Duplicates can occur when any given webpage has multiple URLs that point to it. For example:
URL: http://mysite.com
Description: A webmaster may consider this their authoritative or canonical URL for their homepage.
URL: http://www.mysite.com
Description: However, you can add ‘www’ to most websites and still get the...
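One common server-side fix is to 301-redirect the duplicate forms onto a single preferred host. As a rough sketch of the idea, here is a small hypothetical Python helper (the `canonicalize` function and the preferred-host choice are illustrative, not from the article) that collapses the variants above onto one canonical URL, assuming the webmaster prefers the www host:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, preferred_host="www.mysite.com"):
    """Collapse common duplicate URL forms onto one canonical URL."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    # Host names are case-insensitive, so lowercase them.
    netloc = netloc.lower()
    # Map the bare domain onto the preferred www host.
    if netloc == preferred_host.removeprefix("www."):
        netloc = preferred_host
    # An empty path and "/" point at the same resource.
    if path == "":
        path = "/"
    return urlunsplit((scheme, netloc, path, query, fragment))

print(canonicalize("http://mysite.com"))       # http://www.mysite.com/
print(canonicalize("http://WWW.MySite.com/"))  # http://www.mysite.com/
```

In practice you would apply this mapping in your web server's redirect rules rather than in application code, so that every duplicate form answers with a 301 pointing at the one canonical address.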
Working with large sites often means being part of a large organization, which brings its own set of challenges. Many stakeholders with different agendas or needs influence how sites are structured. Within larger organizations, there are long to-do lists and often a lack of understanding of the impact that certain design or architecture choices can have on a search engine's ability to index the site. In our past two articles on large site...
At Live Search, one of the most common questions we receive from our peers at microsoft.com and msn.com is how to optimize their sites for search. But microsoft.com is unlike most other sites on the Internet. It is huge, containing millions of URLs, and is growing all the time. However, large content sites like microsoft.com and msn.com are not the only sites that can have an infinite number of URLs. There are also large ecommerce sites and...
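As a hypothetical illustration of how URL counts balloon on ecommerce sites, consider one listing page with just three optional filter parameters (the filter names and values below are made up for the example):

```python
from itertools import product

# Hypothetical ecommerce filters: each one is optional, and every
# combination produces a distinct URL for the same product listing.
filters = {
    "color": [None, "red", "blue", "green"],
    "size":  [None, "s", "m", "l", "xl"],
    "sort":  [None, "price", "rating"],
}

urls = set()
for combo in product(*filters.values()):
    params = "&".join(f"{k}={v}" for k, v in zip(filters, combo) if v is not None)
    urls.add("/products" + ("?" + params if params else ""))

print(len(urls))  # 4 * 5 * 3 = 60 distinct URLs for one page of content
```

Three small filters already yield 60 crawlable addresses for a single page; add pagination, session IDs, or tracking parameters and the URL space effectively becomes infinite, which is why crawlers need help identifying the canonical versions.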
Last week at PubCon, I spoke on a panel titled Video Engines – New Kids Rocking the Web. The discussion focused on what was new in the online video space in production, optimization, and search. For those who weren’t there, we felt it would be good to share the presentation and some additional information on video search. Be sure to run some of the demo queries you can find in the presentation below.
Video Engines – New Kids...
A big part of Web 2.0 is bringing the richness of desktop applications onto the Web, to create both a richer user experience and a stable development platform. With technologies like Asynchronous JavaScript And XML (AJAX), Adobe Flash and Microsoft Silverlight, we’ve made a lot of progress. As far as the Web has come, these technologies are still far from mature in realizing the promise to give modern developers and users the best of both...