Questions
- Removing URLs in bulk when directory exclusion isn't an option?
I'd go into Google Webmaster Tools' parameter settings and tell Google to ignore this parameter. I'd need to look up the exact syntax, but Google does honor some wildcard patterns for dynamic URLs in robots.txt, so you may be able to block the parameter in robots.txt and then use the URL removal tool.
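For illustration, a minimal robots.txt sketch, assuming the parameter is named sessionid (a placeholder, not from the question). Googlebot honors the * wildcard, though it isn't part of the original robots.txt standard:

```
User-agent: Googlebot
# Block any URL carrying the (hypothetical) sessionid parameter,
# whether it appears first or later in the query string
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

With the URLs blocked, the removal tool should then accept them, since Google's removal requests generally require the target to be blocked, noindexed, or returning a 404.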
Intermediate & Advanced SEO | KeriMorgret
- Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I always advise people NOT to use robots.txt to block off pages; it isn't the best way to handle things. In your case, there are two options you can consider (both sketched below):
1. For variant pages (multiple parameter versions of the same page), use rel=canonical to consolidate strength on the original page and keep the variants out of the index.
2. A controversial one, and many may disagree, but it depends on the situation: allow crawling of the page but not indexing, via a "noindex, follow" meta tag. That still passes any link juice but keeps pages you don't want in the SERPs out of the index. I normally do this for search result pages that get indexed...
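To make both options concrete, hypothetical head-section snippets; the URLs and page types are placeholders, not taken from the original question:

```html
<!-- Option 1: on a variant page such as /shoes?color=red&sort=price,
     point the canonical at the original page -->
<link rel="canonical" href="https://www.example.com/shoes" />

<!-- Option 2: on an internal search result page, allow crawling but
     keep the page out of the index while still passing link equity -->
<meta name="robots" content="noindex, follow" />
```

Note that option 2 only works if the page stays crawlable: if robots.txt blocks it, Google never sees the noindex tag.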
Intermediate & Advanced SEO | rishil