So, how does this Googlebot actually work?
Rolf Broer, who works at Onetomarket, just released a very interesting article about Googlebot and its crawling capabilities. Rolf set up several tests for Google's so-called superior bot to see how it reacts to particular setups of links and pages. The biggest takeaway is that you can use a Google sitemap to boost the crawl rate. But there is more...
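If you haven't set one up before, a sitemap is just an XML file you submit through Google Webmaster Tools. A minimal sketch, using example.com and placeholder values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap per the sitemaps.org protocol; URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

List one `<url>` entry per page you want crawled; `changefreq` and `priority` are optional hints, not commands, so Googlebot may still crawl at its own pace.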
Google's guideline that you shouldn't use more than 100 links per page doesn't hold up. The test showed that Google will crawl way over 100 links if it has to. Also, Matt Cutts' statement that "the amount of pages being crawled is roughly proportional to your PageRank" seems to be a tad off course. In 31 days, Googlebot visited about 375,000 pages on a PageRank 0 website. Rolf extrapolates that if the website had had a PageRank of 1, Googlebot would crawl over 140,000,000,000 pages in 31 days. His conclusion: PageRank hardly matters at all for getting your pages crawled.