While checking out Google's Webmaster Tools, I noticed something rather interesting about sitemaps for Blogger: they now appear to be picked up automatically.
So I checked it out for myself, and I'm happy to say that it works! What's more, other search engines can easily find your sitemap too...
src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
So how does it work?
Blogger automatically generates a "robots.txt" file for every blog. This tells search engines which directories they are disallowed from crawling, and it includes the location of your sitemap. When search engines crawl your site, they read the sitemap's location from the robots.txt file and can update their indexes with the posts they find there.
The robots.txt file for Blogger Buster looks like this:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Sitemap: http://www.bloggerbuster.com/feeds/posts/default?orderby=updated
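To see how a crawler interprets a file like this, here is a minimal Python sketch using the standard library's robotparser module (site_maps() needs Python 3.8 or newer). The page URLs in the can_fetch() calls are just illustrative examples based on the rules above.

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser("http://www.bloggerbuster.com/robots.txt")
rp.read()  # fetch and parse the robots.txt file

# "Disallow: /search" blocks search-result and label pages for all crawlers.
print(rp.can_fetch("*", "http://www.bloggerbuster.com/search/label/news"))  # False
print(rp.can_fetch("*", "http://www.bloggerbuster.com/"))                   # True

# site_maps() returns the Sitemap: entries a crawler would go on to fetch.
print(rp.site_maps())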
As you can see from this robots.txt, Blogger uses the blog's default feed (ordered by the most recent update) as the sitemap. This helps search engines find your newest posts first, and it means you don't need to fiddle about telling Google and the other search engines where your sitemap lives.
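As a quick illustration of what a search engine sees when it fetches that sitemap, here is a small Python sketch that lists the most recently updated posts from the feed. It assumes the feed URL from the robots.txt above and uses only the standard library (Blogger feeds are published in the Atom format).

import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"  # Atom XML namespace
url = "http://www.bloggerbuster.com/feeds/posts/default?orderby=updated"

with urllib.request.urlopen(url) as response:
    feed = ET.parse(response)

# Entries arrive most recently updated first, so a crawler sees new
# and freshly edited posts at the top of the sitemap.
for entry in feed.getroot().findall(ATOM + "entry"):
    title = entry.find(ATOM + "title").text
    updated = entry.find(ATOM + "updated").text
    print(updated, title)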
You can take a look at the robots.txt file generated for your own blog by going to http://yourblogname.blogspot.com/robots.txt, replacing "yourblogname" with your own blog's name. If you use a custom domain for your blog, it will be located at http://www.yourblogname.com/robots.txt instead.
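If you'd rather check it from a script than a browser, a one-off fetch works too. As above, "yourblogname" is just a placeholder for your own blog's name.

import urllib.request

url = "http://yourblogname.blogspot.com/robots.txt"  # placeholder blog name
with urllib.request.urlopen(url) as response:
    print(response.read().decode("utf-8"))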
I'm happy to know that Blogger now includes this sitemap in the robots.txt files for our blogs. It certainly saves us the time of submitting sitemaps to each search engine ourselves!