Question about robots.txt
-
Solved!
-
Consider deleting all of this:
Disallow: /&limit
Disallow: /?limit
Disallow: /&sort
Disallow: /?sort
Disallow: /?route=checkout/
Disallow: /?route=account/
Disallow: /?route=product/search
Disallow: /?route=affiliate/
Disallow: /?marca
Disallow: /&manufacturer
Disallow: /?manufacturer
Disallow: /?filter
Disallow: /&filter
Disallow: /?order
Disallow: /&order
Disallow: /?price
Disallow: /&price
Disallow: /?filter_tag
Disallow: /&filter_tag
Disallow: /?mode
Disallow: /&mode
Disallow: /?cat
Disallow: /&cat
Disallow: /?product_id
Disallow: /&product_id
Disallow: /?route=affiliate/
Disallow: /*?keyword

Those rules tell Google not to crawl domain.com/ANYTHING followed by the matching URL parameter. This could be where the issue stems from. If you're worried about URLs with these parameters ranking, consider implementing canonical tags instead, pointing them to the proper pages.
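For example, a parameter URL could declare its clean version as canonical with a tag in its `<head>` (domain.com and the path here are placeholders, not your actual URLs):

```html
<!-- Placed in the <head> of the parameter page, e.g. domain.com/shoes?sort=price -->
<link rel="canonical" href="https://domain.com/shoes" />
```

That way Google can still crawl the parameter variants but consolidates ranking signals onto the canonical URL, instead of being blocked from the pages entirely.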
-
Just a friendly reminder: please don't delete your question after it's been answered. It's very likely that someone in the future will have the same question, and they'll only be able to find the answer if the question is still here.