So, how does this Googlebot actually work?
Rolf Broer, who works at Onetomarket, just released a very interesting article about Googlebot and its crawling capabilities. Rolf ran several tests on Google's so-called superior bot to see how it reacts to particular setups of links and pages. The biggest takeaway is that you can use a Google sitemap to boost the crawl rate. But there is more...
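To illustrate what submitting a sitemap involves, here is a minimal sketch of generating a sitemap.xml file in Python. The URLs and the `build_sitemap` helper are hypothetical examples for illustration, not taken from Rolf's article:

```python
# Minimal sketch: build a sitemap.xml string to submit to Google.
# The URLs and the helper name are illustrative assumptions.
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Return a minimal sitemap.xml string for the given page URLs."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n  </url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )


if __name__ == "__main__":
    print(build_sitemap(["http://www.example.com/",
                         "http://www.example.com/page1"]))
```

The resulting file is typically uploaded to the site root and submitted via Google's webmaster tools, which tells Googlebot which pages exist without it having to discover every link itself.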
Google's guideline that you shouldn't use more than 100 links per page doesn't hold up. The test showed that Google will crawl way over 100 links if it has to. Also, Matt Cutts' statement that "the amount of pages being crawled is roughly proportional to your PageRank" seems to be a tad off course. In 31 days, Googlebot visited about 375,000 pages on a PageRank 0 website. Rolf calculates that if the website had had a PageRank of 1, Googlebot would have crawled over 140,000,000,000 pages in 31 days. That simply means that PageRank doesn't matter at all for getting your pages crawled.