Need help with Robots.txt
-
I have an eCommerce site built with the MODX CMS, and I found that it has lots of auto-generated duplicate pages. Now I need to disallow some pages from a category. Here is what the actual product page URL looks like:

product_listing.php?cat=6857

And here is the auto-generated URL structure:

product_listing.php?cat=6857&cPath=dropship&size=19

Can anyone suggest how to disallow this specific category through robots.txt? I am not very familiar with MODX or this kind of link structure.
Your help will be appreciated.
Thanks
-
Nahid, before you use a robots.txt disallow for those URLs, you may want to reconsider. In cases where the duplicates come from different sizes, colors, etc., we typically recommend using the canonical tag rather than a disallow in robots.txt: the duplicate pages can still be crawled, but search engines consolidate them onto the URL you designate.
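For example, each auto-generated variant page could carry a canonical tag pointing back to the base category URL. A minimal sketch (the example.com domain is a placeholder, substitute your own):

```html
<!-- Placed in the <head> of every variant page, e.g.
     product_listing.php?cat=6857&cPath=dropship&size=19 -->
<link rel="canonical" href="https://www.example.com/product_listing.php?cat=6857" />
```

With this in place, the cPath/size variants remain reachable for shoppers, but their ranking signals are consolidated onto the single ?cat= URL.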
Anyhow, if you'd like to use a disallow, note that your URLs have a path (product_listing.php) before the query string, so you'll need a wildcard for the rule to match. You can use one of these:
Disallow: /*?
or
Disallow: /*?cat=
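If the goal is to block only the auto-generated duplicates while leaving the base ?cat= page crawlable, a more targeted sketch would match on the extra parameters instead (note that the * wildcard is a Google/Bing extension, not part of the original robots.txt standard):

```
User-agent: *
# Block only the auto-generated variants carrying the extra
# cPath/size parameters; product_listing.php?cat=6857 itself
# stays crawlable.
Disallow: /*&cPath=
Disallow: /*&size=
```

Keep in mind that a disallow only stops crawling; already-indexed duplicate URLs may linger in the index, which is another reason the canonical tag is usually the better first step.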
-
I would actually add a canonical tag and then handle these using the URL Parameters section of Search Console. That's exactly what it's there for: this type of site with exactly this issue.