Questions
Google indexes dynamic webpages after block in robots.txt...
Unfortunately, robots.txt is a poor choice for content that may have already been indexed, including dynamic content. It's good for blocking specific pages and folders (especially before Google crawls them), but it tends to be unreliable in these situations. Pagination is a tricky topic, and the "best" solution varies a lot with the situation, but the basic options are:

(1) Use rel="prev" and rel="next", which helps Google handle the paginated series properly but still allows it to rank.

(2) Use META NOINDEX, FOLLOW on pages 2+ of search results (this was probably the most popular method before rel=prev/next).

(3) Use rel=canonical to point all paginated results to a "View All" page. This page should be available to users and not be too large. It's a decent option if you have a few dozen results, but not 100s or 1000s.

(4) Use Google Webmaster Tools parameter handling on the "page=" parameter. It seems to work, but since it's Google-specific, it's not the go-to option for most SEOs.
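For reference, the <head> markup for options 1-3 looks roughly like this (the example.com URLs and "page=" parameter are placeholders; adapt them to your own URL structure):

```
<!-- Option 1: rel="prev"/"next" on page 2 of a paginated series -->
<link rel="prev" href="https://www.example.com/results?page=1">
<link rel="next" href="https://www.example.com/results?page=3">

<!-- Option 2: META NOINDEX, FOLLOW on pages 2+ of search results -->
<meta name="robots" content="noindex, follow">

<!-- Option 3: rel=canonical pointing paginated pages to a "View All" page -->
<link rel="canonical" href="https://www.example.com/results/view-all">
```

These are mutually exclusive approaches on any given page; mixing them (e.g., canonical to "View All" plus NOINDEX) sends Google conflicting signals.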
Technical SEO Issues | Dr-Pete