Meta Noindex and Robots.txt - Optimizing Crawl Budget
-
Hi,
Some time ago, a few thousand pages got into Google's index - they were "product pop-up" pages, exact duplicates of the actual product pages rendered as a "quick view" overlay.
So I removed them via GWT and also added a meta noindex tag to these pop-up overlays to stop them from being indexed and causing duplicate content issues.
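For reference, the tag I added to the head of each pop-up page looks roughly like this (a standard robots meta tag - the exact markup on my site may differ slightly):

```
<!-- Tells crawlers not to index this page -->
<meta name="robots" content="noindex">
```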
As far as I can tell they are no longer in the index - a site:www.mydomain.com/ajax search returns nothing.
So can I block these off now with robots.txt to optimize my crawl budget?
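For context, the robots.txt rule I have in mind would look roughly like this (assuming all the pop-up pages live under the /ajax path used in the site: search above):

```
User-agent: *
Disallow: /ajax
```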
Thanks
-
Are you still linking to those pages? If so, I would just keep the noindex,follow meta tag in the header. That way you still benefit from the link juice flowing to those pages - link juice that would be lost if you simply blocked them from being crawled via robots.txt.