Implications of Disallowing A LOT of Pages
-
Hey everyone,
I just started working on a website and there are A LOT of pages that should not be crawled - probably in the thousands. Are there any SEO risks of disallowing them all at once, or should I go through systematically and take a few dozen down at a time?
-
Hello Rachel,
If you do it correctly, meaning you are not disallowing useful pages, there is no risk. Go ahead.
Best of luck.
GR.
-
LOL now I'm worried about what you mean by "correctly." Do you mean just making sure that I'm not disallowing valuable pages that should be crawled?
-
Exactly.
Also, remember that disallowing pages via robots.txt will not remove them from Google's index.
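For what it's worth, if the pages you want blocked share common path prefixes, the robots.txt can stay short even with thousands of URLs. A minimal sketch (the /archive/ and /tmp/ paths are hypothetical placeholders for whatever sections you actually need to block):

```text
# robots.txt, served at https://example.com/robots.txt
# Rules below apply to all crawlers
User-agent: *

# Hypothetical sections to keep out of crawling
Disallow: /archive/
Disallow: /tmp/

# Everything not matched above remains crawlable
```

A `Disallow` rule matches by URL prefix, so one line can cover an entire directory of pages rather than listing them individually.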
-
That would happen if I used noindex, correct?
-
Correct.
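For reference, a noindex directive can be set in the HTML of each page you want dropped from the index, using the standard robots meta tag:

```html
<!-- In the <head> of each page that should be removed from the index -->
<meta name="robots" content="noindex">
```

One caveat: a crawler can only see the noindex directive on pages it is allowed to fetch, so don't combine noindex with a robots.txt Disallow for the same URLs until they have actually been dropped from the index.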
Also (I should have said this earlier), there is no fully effective way to stop robots from crawling the web. The point here, and what I'm answering, is how to keep certain pages out of Google's search results (and/or remove them if they are already in the index).
-
Perfect, that's my intent. Thanks so much for your help!! I really appreciate it.