So, how does this Googlebot actually work?
Rolf Broer, who works at Onetomarket, just released a very interesting article about Googlebot and its crawling capabilities. Rolf set up several tests for Google's so-called superior bot to see how it reacts to particular setups of links and pages. The biggest takeaway is that you can use a Google sitemap to boost the crawl rate. But there is more...
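For context, the sitemap the article refers to is the standard XML sitemap format you submit to Google. A minimal sketch of generating one, assuming the example.com URLs are placeholders for your own pages:

```python
# Minimal sketch: build a bare-bones XML sitemap per the sitemaps.org
# protocol. The URLs below are hypothetical placeholders.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml string for the given page URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/page-1",
]))
```

You would save the output as sitemap.xml and submit it through Google's webmaster tools so Googlebot can discover your pages directly instead of only via links.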
Google's guideline that you shouldn't use more than 100 links per page doesn't hold up. The test showed that Google will crawl way over 100 links if it has to. Also, Matt Cutts' statement that "the amount of pages being crawled is roughly proportional to your PageRank" seems to be a tad off course. In 31 days, Googlebot visited about 375,000 pages on a PageRank 0 website. Rolf extrapolates that if the website had had a PageRank of 1, Googlebot would crawl over 140,000,000,000 pages in 31 days. That simply means that PageRank doesn't matter at all for getting your pages crawled.
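To put the reported figure in perspective, that crawl volume works out to roughly 12,000 pages per day on a site with no PageRank at all:

```python
# Back-of-the-envelope check of the crawl figures quoted above:
# about 375,000 pages crawled on a PageRank 0 site in 31 days.
pages_crawled = 375_000
days = 31
pages_per_day = pages_crawled / days
print(f"{pages_per_day:.0f} pages per day")
```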