Sitemaps and "noindex" pages
-
I'm experimenting a bit to recover from Panda and have added a "noindex" tag to quite a few pages. Obviously we now need Google to re-crawl and de-index them ASAP. Should we leave these pages in the sitemaps (with an updated "lastmod") to encourage that, or just wait patiently?
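For reference, a sitemap entry with a bumped "lastmod" would look something like this — the URL and date are placeholders, not values from this thread:

```xml
<!-- Hypothetical sitemap entry: refreshing <lastmod> signals the page has changed,
     which may encourage a re-crawl of the newly noindexed URL -->
<url>
  <loc>http://www.example.com/catalog/some-page/</loc>
  <lastmod>2013-01-15</lastmod>
</url>
```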
What's the common/best way to handle this? -
If you have identified the pages that are causing you growing pains, and they are not essential to users, why not just remove them and 301 the old URLs?
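If you do go the removal route, the 301 can be a one-line rule — this is a generic Apache sketch with placeholder paths, not something specified in the thread:

```apache
# .htaccess sketch: permanently redirect a retired page (paths are placeholders)
Redirect 301 /old-catalog-page/ http://www.example.com/new-destination/
```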
You can change the crawl rate of Google in Google Webmaster Tools:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=48620
Or you could go a little grey-hat and ping the pages you want re-crawled by posting the links on a high-DA (domain authority) site like Twitter. The higher the domain authority, the faster Google crawls the site.
-
These aren't bad pages as such; they're catalog pages that need to stay live with "noindex, follow" tags.
Going grey-hat isn't an option, unfortunately: there are ~50,000 of them.