
Sitemaps for your blog

November 2, 2010

There’s a great question, which is really an observation, pointing out that sitemaps for your blog or website are probably unnecessary. The argument is that sitemaps are supposed to help web crawlers like Googlebot find pages that aren’t linked to on your site. But if pages aren’t linked to by anyone, they won’t have any PageRank and won’t appear in the search results anyway. So the proper way to ensure Google indexes all pages on your site is to maintain a healthy link structure, with every page linked to from at least one other page on your site.

There’s also a fun diagram showing who is suing whom in the telecoms industry. Be thankful you’re not part of that dogfight.

If your data is living in the cloud, Amazon have reduced their prices for data storage on S3. At Feedjit, we buy our servers and amortize them over 3 years because it’s more cost effective that way. If you’re looking for cheap hosting, check out (my personal favorite) or for an entry-level Linux server.

Finally, today’s award for toughest bloke ever goes to this chap who saved a woman from a great white shark in Australia by grabbing the shark by the tail – and then refused to speak to the press about it.

I’ll be publishing the Daily Feed on a daily schedule once again for the rest of this week. Have a spectacular week!

Mark Maunder
Feedjit Founder & CEO
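P.S. The sitemap argument above boils down to a link-graph property: a page that nothing links to is invisible to a link-following crawler, sitemap or no sitemap. Here's a minimal sketch of how you might audit your own site for such orphan pages, using a hypothetical map of each page's internal outlinks (the URLs below are made-up example data, not a real site):

```python
def find_orphan_pages(site_links):
    """Return the set of pages that no other page links to.

    site_links maps each page URL to the set of internal pages
    it links out to. Orphans are exactly the pages a crawler
    following links from the homepage could never discover.
    """
    all_pages = set(site_links)
    linked_to = set()
    for page, outlinks in site_links.items():
        linked_to |= outlinks - {page}  # a self-link doesn't count
    return all_pages - linked_to

# Hypothetical site: /old-post links out but is never linked to,
# so a link-following crawler would never find it.
site = {
    "/": {"/blog", "/about"},
    "/blog": {"/blog/sitemaps", "/"},
    "/blog/sitemaps": {"/"},
    "/about": {"/"},
    "/old-post": {"/"},
}
print(find_orphan_pages(site))  # → {'/old-post'}
```

If this check comes back empty, every page is reachable through your internal links and a sitemap adds little; if it finds orphans, the fix is to link to them, not just to list them in a sitemap.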