How to 100% stop google indexing static and dynamic URLs
-
I would block it in the robots.txt file and also add the meta tag to the page, but robots.txt would be my first choice.
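As a minimal sketch of both approaches (the `/booking/` path here is hypothetical — substitute your actual URLs):

```text
# robots.txt — placed at the site root, blocks all crawlers from the booking pages
User-agent: *
Disallow: /booking/
```

And the meta tag version, added to the page itself:

```html
<!-- placed in the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex">
```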
Another option you could look at is adding canonical tags to pages if you have duplicate content issues; great information on these tags can be found here:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
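To illustrate the canonical approach: each duplicate page points at the preferred version of itself (the URL below is a hypothetical example, not from the original post):

```html
<!-- in the <head> of each duplicate page, pointing to the preferred URL -->
<link rel="canonical" href="http://www.example.com/cottages/">
```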
-
Hi Matt.
Robots.txt works when you are dealing with a shopping cart or CMS that does not offer the granularity of SEO options we want. Otherwise, the best robots.txt file is a blank one.
If you decide to use the "noindex" option, that is fine, but I would recommend against using the "nofollow" tag, as it will stop your PR from flowing naturally within your own site.
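In other words, the meta tag can keep a page out of the index while still letting PageRank flow through its links — a sketch of that directive:

```html
<!-- "noindex, follow": the page is excluded from the index,
     but crawlers still follow its links, so PR keeps flowing -->
<meta name="robots" content="noindex, follow">
```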
From looking at your site, it seems the URLs you offered point to a generic booking page with no content, which exists only to let users reserve cottages for specific dates. I don't see any value in adding this page to a search index. If you agree, then the noindex tag is your solution.
Two additional tidbits: Lindsay wrote a nice article related to this topic that I will recommend to you: http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not
Google Webmaster Tools (and Bing) offers an option to change how parameters in your URLs are treated. From your dashboard, go to Site Configuration > URL Parameters, click "Add parameter", and enter "cottage". This is for future reference and is not necessary in this case if you properly noindex the pages.
-
I would personally advise against placing the noindex in two places. Either one of them works, and adding it in two places might just confuse future developers about the reason behind the double noindex.
-
That's why I said the robots.txt block would be my first choice.