Crawl Diagnostics Updates
-
I have several page types on my sites that I have blocked using the robots.txt file (ex: emailafriend.asp, shoppingcart.asp, login.asp), but they are still showing up in crawl diagnostics as issues (ex: duplicate page content, duplicate title tags, etc). Is there a way to filter these issues out, or is there something I'm doing wrong that's causing them to show up?
- Ryan
-
Hi Ryan,
At the end of this page you will find several ways to block rogerbot from crawling pages: http://moz.com/help/pro/rogerbot-crawler
I hope it helps,
Istvan
-
I added the pages that it was suggesting to the robots.txt file:
http://www.naturalrugco.com/robots.txt
Most of the pages listed as high priority errors in Moz Analytics crawl diagnostics are EmailaFriend.asp pages, which I've disallowed. Ex: http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent
-
Hi Ryan,
Try moving the Sitemap directive to the end of the file and leave a blank line before it. Something like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /SearchResults.asp...
...
Disallow: /mailinglist_subscribe.asp
Disallow: /mailinglist_unsubscribe.asp
Disallow: /EmailaFriend.asp
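
As a sanity check, you can verify that the Disallow rules actually match the URLs showing up in crawl diagnostics. This is a minimal sketch using Python's standard-library robots.txt parser against a cut-down version of the rules above (the user-agent name and test URLs are taken from the thread; this only checks rule matching, not how any particular crawler behaves):

```python
from urllib.robotparser import RobotFileParser

# A cut-down version of the robots.txt rules discussed above.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /EmailaFriend.asp
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# rogerbot falls under the wildcard group, so the disallowed page
# should come back as blocked, while other pages remain fetchable.
blocked = parser.can_fetch(
    "rogerbot",
    "http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent",
)
allowed = parser.can_fetch("rogerbot", "http://www.naturalrugco.com/index.asp")

print(blocked)  # False: EmailaFriend.asp URLs match the Disallow rule
print(allowed)  # True: pages without a matching rule are still crawlable
```

Note that path matching in robots.txt is case-sensitive, so `Disallow: /EmailaFriend.asp` will not block a URL spelled `/emailafriend.asp`; it is worth checking that the casing in the rules matches the URLs reported in crawl diagnostics.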