How to remove URLs blocked by robots.txt from crawl diagnostics
-
I suddenly have a huge jump in the number of errors in crawl diagnostics, and it all seems to be down to a load of URLs that should be blocked by robots.txt. These have never appeared before. How do I remove them, or stop them from appearing again?
-
Hi Simon,
A noindex, follow meta tag sounds like the way to go.
Best to read this first: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
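For reference, the tag would look something like this, placed in the `<head>` of each page you want kept out of the index (note the page must be crawlable, i.e. not blocked in robots.txt, for the tag to be seen):

```html
<!-- Tells crawlers not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">
```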
Hope this helps.
Justin