Ignore URLs with a pattern
-
I have 7,000 warnings about URLs caused by a 302 redirect:
http://imageshack.us/photo/my-images/215/44060409.png/
I want to get rid of those. Is it possible to exclude these URLs with robots.txt, for example so that nothing with /product_compare/ in its URL gets crawled?
Thank you
-
Could you perhaps post a URL which has product_compare in it?
You could alter your robots.txt file to disallow robots from indexing pages under
http://www.domain.com/product_compare/ by adding this line to your robots.txt file: Disallow: /product_compare/
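For context, a minimal robots.txt using that rule might look like this (the `User-agent: *` line is an assumption; narrow it if you only want to target specific crawlers):

```
User-agent: *
Disallow: /product_compare/
```

Note that robots.txt rules are prefix matches against the URL path, so this only blocks paths that begin with /product_compare/.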
-
Then you simply add this to your robots.txt:
Disallow: /catalog/product_compare/
That should leave out all pages starting with:
https://www.theprinterdepo.com/catalog/product_compare/
In case they do not all start with /catalog/, crawlers that support wildcard patterns (such as Googlebot) also accept:
Disallow: /*product_compare
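As a quick sanity check, the prefix rule above can be verified with Python's standard-library robots.txt parser. This is just an illustrative sketch; the test URLs below are made up, and only the /catalog/product_compare/ rule comes from the answer:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents using the Disallow rule suggested above.
robots_txt = """\
User-agent: *
Disallow: /catalog/product_compare/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

base = "https://www.theprinterdepo.com"

# Compare pages are blocked for all user agents...
print(rp.can_fetch("*", base + "/catalog/product_compare/index/"))  # False
# ...while other catalog pages remain crawlable.
print(rp.can_fetch("*", base + "/catalog/category/view/"))          # True
```

Keep in mind robots.txt only asks well-behaved crawlers not to fetch those pages; it does not remove URLs that are already indexed.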
