Google Now Adds Sitemaps to the robots.txt of All Blogger Blogs

Google is continuously improving the Blogger platform - their latest feature, discussed below, may help your blog content appear in search engines more quickly.

For non-geeks: Sitemap files let you tell search engines about new or updated content on your blog that their crawlers should index. Earlier you could point Google to your blog's sitemap file through the Google Webmasters control panel, but that's no longer required.

Now whenever you publish a new blog post on Blogger or update an existing post, that information will automatically become available to Google and other search engines - you don't have to do anything at your end. How cool is that?

For geeks: Google has now added the Sitemap: directive to the robots.txt of all blogspot blogs, pointing to the blog's default XML feed, which contains the 25 most recently updated blog posts. Here's an example:

User-agent: *
Disallow: /search
Sitemap: http://abc.blogspot.com/feeds/posts/default?orderby=updated


This is an excellent development because the Sitemap: syntax is supported by all major search engines including Google, Yahoo! and Windows Live - their search crawlers can therefore auto-discover your new or updated blog content based on the sitemap file.
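To illustrate how this auto-discovery works, here is a minimal Python sketch of the parsing step a crawler performs: scan robots.txt for Sitemap: directives and collect the URLs. The function name is hypothetical, not part of any crawler's actual API.

```python
def discover_sitemaps(robots_txt: str) -> list:
    """Collect the URLs listed in Sitemap: directives of a robots.txt file.

    Sketch of a crawler's auto-discovery step; the directive name is
    matched case-insensitively, per the sitemaps.org protocol.
    """
    sitemaps = []
    for line in robots_txt.splitlines():
        # Split at the first colon only, so URLs like http://... stay intact.
        key, sep, value = line.partition(":")
        if sep and key.strip().lower() == "sitemap":
            sitemaps.append(value.strip())
    return sitemaps

# Running it on the robots.txt shown above:
robots = """User-agent: *
Disallow: /search
Sitemap: http://abc.blogspot.com/feeds/posts/default?orderby=updated
"""
print(discover_sitemaps(robots))
# → ['http://abc.blogspot.com/feeds/posts/default?orderby=updated']
```

In practice a crawler would fetch robots.txt over HTTP first, but the discovery logic itself is just this line scan.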
