Why is Roger crawling pages that are disallowed in my robots.txt file?
-
I have specified the following in my robots.txt file:
Disallow: /catalog/product_compare/
Yet Roger is still crawling these pages: 1,357 errors reported.
Is this a bug or am I missing something in my robots.txt file?
Here's one of the URLs that Roger pulled:
Please let me know if my problem is in robots.txt or if Roger spaced this one. Thanks!
-
Have you specified a User-Agent?
-
Yes, blocking all user agents with the wildcard: User-agent: *
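For what it's worth, you can sanity-check whether a Disallow rule actually matches a given URL with Python's standard-library robots.txt parser. This is just a sketch of the rules described above; the example.com URL and the exact compare-page path are hypothetical placeholders, not the asker's real site.

```python
# Sketch: check whether a robots.txt Disallow rule matches a URL.
# The domain and full URL below are illustrative placeholders.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /catalog/product_compare/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compare-page URL of the kind described in the question.
blocked_url = "http://example.com/catalog/product_compare/index/"
allowed = parser.can_fetch("rogerbot", blocked_url)
print(allowed)  # False -> a compliant crawler should skip this URL
```

If this prints False, the rule itself is fine and a compliant crawler should be skipping those URLs, which points at the crawler rather than the robots.txt syntax.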
-
Digging back through the Q&A... I'm seeing several posts reporting this sort of thing.
http://www.seomoz.org/dp/rogerbot
Perhaps you could try specifically blocking rogerbot? If that doesn't work, an email to the SEOmoz team may do the trick.
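One caveat with that suggestion: crawlers generally obey only the most specific User-agent group that matches them, so a rogerbot group must repeat the Disallow line rather than inherit it from the wildcard group. A minimal sketch of that layout, again with a placeholder domain and path:

```python
# Sketch: a rogerbot-specific group alongside the wildcard group.
# Most crawlers follow only their most specific matching group, so the
# Disallow line is repeated rather than inherited from "*".
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: rogerbot
Disallow: /catalog/product_compare/

User-agent: *
Disallow: /catalog/product_compare/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Hypothetical URL; rogerbot now matches its own named group.
compare_blocked = parser.can_fetch("rogerbot", "http://example.com/catalog/product_compare/a/b/")
print(compare_blocked)  # False -> disallowed for rogerbot specifically
```

If rogerbot still fetches URLs this parser reports as disallowed, that would support the theory that it's a bug on their end rather than in the file.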

-
Digging in further, I discovered that rogerbot had honored the block for a portion of these URL variations, but 2/3 slipped through. I sent an email to support. Thanks for the suggestion.