Can I prevent some pages from being crawled by the SEOMoz spider without affecting the Google spider?
-
Well, basically that's the question
That is, I have more than 10,000 pages on the website, and I am not interested in having reports for many of them, but I still want to get SEO visits on them, so I want Google to crawl them easily...
Thanks!
-
Hi Matt,
Absolutely, you can do this by adding a section to your robots.txt file targeted at the user agent rogerbot, which SEOMoz uses for its spider. For example:
User-agent: rogerbot
Disallow: */anythingyouwanttoexcludeforroger/*

Hope this helps!
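A quick way to sanity-check rules like this (the path below is a placeholder, not from the thread) is Python's standard-library robots.txt parser. Note that urllib.robotparser only supports plain prefix matching, not "*" wildcards, so this sketch uses a path prefix rather than a wildcard pattern:

```python
# Sanity-check robots.txt rules with Python's stdlib parser.
# NOTE: "/private/" is a hypothetical example path, and
# urllib.robotparser does plain prefix matching only (no "*" wildcards),
# which is why the rule below uses a prefix instead of a wildcard pattern.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: rogerbot
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# rogerbot is blocked from /private/, while Googlebot is unaffected
print(parser.can_fetch("rogerbot", "https://example.com/private/page"))   # False
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # True
```

Crawlers that honor wildcard syntax (such as Googlebot and rogerbot) will also respect the wildcard form shown above, but it is worth testing rules against each crawler's own documentation.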
-
Hey Martijn,
Thanks!
What if I want to stop rogerbot from crawling a range of pages, say /000001/, /000002/, ... /200000/?
Thanks!