Questions
Fetch as Google Desktop Render Width?
Clever PhD hit the nail on the head; his answer is excellent.
Technical SEO Issues | BlueprintMarketing
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
The additional complexity, expense, upkeep, and risk of trying to run a separate server just for bots is nowhere near worth it, in my opinion. (Don't forget, you'd also have to build a system to replicate the content between the servers every time content or code is added or edited. That replication process could well use more resources than the bots do!) I'd say you'd be much better off putting all those resources toward a more robust primary server and letting it do its job.

In addition, as Lesley says, you can tune GoogleBot's crawl rate, and you can actually schedule Bing's crawl times in its Webmaster Tools. Though for me, I'd want the search engine bots to get in and index my site just as soon as they were willing.

Lastly, it's only a few minutes' work to source a ready-made blacklist of "bad bot" useragents that you can quickly insert into your .htaccess file to completely block a significant number of the most wasteful and unnecessary bots. You will want to update such a blacklist every few months, as the worst offenders regularly change useragents to avoid just such blacklisting.

Does that make sense as an alternative?

Paul
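As a rough illustration of the .htaccess blacklisting approach described above, here is a minimal sketch using Apache's mod_rewrite. The useragent names shown are placeholders, not a real blacklist; the actual strings you block should come from a maintained bad-bot list.

```apache
# Deny requests whose User-Agent matches known "bad bot" patterns.
# The names below (BadBot, EvilScraper, SpamCrawler) are placeholder
# examples -- substitute entries from a maintained bad-bot blacklist.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper|SpamCrawler) [NC]
RewriteRule .* - [F,L]
```

The [NC] flag makes the match case-insensitive, and [F] returns a 403 Forbidden response. Keep in mind that useragent strings are trivially spoofed, so this only stops bots that identify themselves honestly, which is exactly why the list needs periodic updating.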
White Hat / Black Hat SEO | ThompsonPaul