NoIndex user generated pages?
-
Hi,
I have a site, downorisitjustme (dot) com
It has over 30,000 pages indexed in Google. They were generated by people searching to check whether a specific site is working, and then possibly posting a deep link to the results page on a message board or somewhere similar, which is why the pages have been picked up.
Am I best to noindex the res.php page, where all the auto-generated content shows up, and have the main static pages as the only ones available to be indexed?
-
I would also exclude them via robots.txt and then submit a sitemap with your static content to "nudge" Google to recrawl it (and hopefully drop the other pages off fairly quickly over time).
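As a sketch, a minimal sitemap.xml listing only the static pages might look like the following (the URLs here are placeholders, not the real site's pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the static pages you want indexed; leave out
       any auto-generated res.php result URLs -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about.php</loc>
  </url>
</urlset>
```

Submitting this through Webmaster Tools tells Google which URLs you consider canonical, which can help the stray result pages drop out sooner.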
-
I'd noindex the page, block it in robots.txt, make sure your sitemap.xml is not including these URLs if it's auto-generated, and if all the user-generated pages live under one main folder, request removal of that content in Google WMT.
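For the noindex part, assuming res.php renders a standard HTML page, a sketch of the tag would be:

```html
<!-- In the <head> of res.php: ask crawlers not to index this page,
     while still following any links on it -->
<meta name="robots" content="noindex, follow">
```

One caveat worth knowing: if the URL is also blocked in robots.txt, crawlers may never fetch the page to see this tag, so the two measures can work against each other.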
-
Thanks for the replies Gerg & Irving.
For the robots.txt block/exclude, I take it I can just apply that to the res.php page and don't have to do it individually for the 30k dynamic pages generated off it (probably a silly question, I know, but I wanted to double-check).
-
Yes, add it to the robots.txt (use a Disallow and a Noindex statement). I have found that Bing, for example, has not always honoured robots.txt reliably in the past (especially in the case where you have an explicit "index" tag on the page and a noindex for a URL path).
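A minimal robots.txt along those lines, assuming res.php sits in the site root (note that the Noindex directive was only ever unofficially honoured by Google, so don't rely on it alone):

```txt
User-agent: *
Disallow: /res.php
Noindex: /res.php
```

Disallow matches by URL prefix, so a single /res.php rule also covers every query-string variant like /res.php?site=whatever; you don't need one rule per generated page.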