URL Parameters
-
On our webshop we've added some URL parameters. We've set parameters like min_price, filter_cat, filter_color, etc. to "Don't crawl" in Google Search Console. We see that some parameters have 100,000+ URLs and some have 10,000+.
Would it be better to add these parameters to the robots.txt file? And if so, how should we write them so the URLs will not be crawled?
Our robots.txt file currently shows:
# Added by SEO Ultimate's Link Mask Generator module
User-agent: *
Disallow: /go/
# End Link Mask Generator output

User-agent: *
Disallow: /wp-admin/
-
Hi,
You might want to read the article on faceted navigation on the Google Webmaster Central blog, which gives some good advice on how to handle this. Which option to use depends a bit on your actual situation.
Options include using nofollow links, using a separate subdomain, or blocking in robots.txt (using a separate folder). On Moz there is this article (see the part on faceting); it's mainly about listing sites, but the core problem is more or less similar.
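As a rough sketch of the robots.txt option, using the parameter names from your question (not a definitive set of rules; Googlebot supports the * wildcard in robots.txt, but not every crawler does), it could look something like this:

User-agent: *
# Block any URL whose query string contains one of the filter parameters
Disallow: /*min_price=
Disallow: /*filter_cat=
Disallow: /*filter_color=

Keep in mind that robots.txt blocks crawling, not indexing: URLs that are already indexed can stay in the index for a while, so it's worth testing the rules with the robots.txt tester in Search Console before relying on them.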
Hope this helps,
Dirk