Disallow duplicate URLs?
-
Hi community,
thanks for answering my question.
I have a problem with my website. Its main URL is:
http://example.examples.com/brand/brand1 (good URL)
but I have two filters on the page, and these generate two more URLs:
http://example.examples.com/brand/brand1?show=true (when the first filter is applied)
http://example.examples.com/brand/brand1?show=false (when the other filter is applied)
My question is:
should I add a disallow rule for these filters to robots.txt, like this?
`Disallow: /*?show=*`
-
Hi there,
In my opinion yes, you can. However, I suggest you read the following post from Moz first:
https://moz.com/blog/duplicate-content-block-redirect-or-canonical
I hope this resolves your issue.
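If you do decide to go the robots.txt route, here is a minimal sketch of the file, assuming you want the rule to apply to all crawlers:

```
User-agent: *
Disallow: /*?show=*
```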
-
In my opinion: no, you should not. Robots.txt blocks crawling, not indexing, so the result would be that those URLs are still indexed, but with a generic snippet saying, in effect, that Google knows the URL exists but not what is behind it; titles may be taken from links or past data.
A canonical tag is the way to go.
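For example, each filtered URL would carry a canonical tag in its `<head>` pointing back to the clean URL; a minimal sketch using the URLs from your question:

```html
<link rel="canonical" href="http://example.examples.com/brand/brand1" />
```

Google treats this as a strong hint and will usually consolidate the filtered variants onto the clean URL while still being able to crawl them.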
Regards
Nico