Questions
Googlebot on steroids... Why?
We got an answer from JohnMu, Webmaster Trends Analyst at Google. The reason for the heavy crawling is (as we found out) the filters, which generate infinite URL variations (one of the developers was asleep at the wheel); we will correct this. Disallowing the filter URLs in robots.txt is advised as the quickest fix to stop the mega-crawling. Google will use this case for further research because of the disproportionate capacity usage. You're right that Google will initially crawl everything, but they don't want Googlebot's crawling to look like a "mini-DDoS-like attack".
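As a minimal sketch of the quick fix mentioned above, a robots.txt rule can block the filter URLs from being crawled. The parameter name `?filter=` here is an assumption for illustration; use whatever query parameter your faceted navigation actually emits:

```
# Hypothetical example: block crawling of faceted/filter URL variations
User-agent: *
Disallow: /*?filter=
Disallow: /*&filter=
```

Note that a robots.txt disallow stops crawling but does not remove already-indexed URLs; it simply prevents Googlebot from fetching new filter combinations.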
Intermediate & Advanced SEO | Olaf0
CNAME instead of A-record: SEO problem?
If I'm not mistaken, #1 will apply; it will fully resolve.
Intermediate & Advanced SEO | wissamdandan1