Questions
Is it possible to have the crawler exclude URLs with specific arguments?
You will need to add a block to your robots.txt that matches the parameter: a `User-agent: *` line followed by a `Disallow` rule containing the parameter name, e.g. `Disallow: /*?variable=` (Google's crawler supports the `*` wildcard). I think that would work. The other approach is to have your page code emit a `noindex` robots meta tag whenever that argument is present, which keeps those URLs out of the index. This link should help you, by the way: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
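A minimal sketch of the robots.txt approach, assuming the query parameter is named `variable` (substitute your own parameter name):

```
# Block any URL whose query string contains the parameter.
# The * wildcard is supported by Googlebot, but it is not part of the
# original robots.txt standard, so other crawlers may ignore it.
User-agent: *
Disallow: /*?variable=
Disallow: /*&variable=
```

For the per-page alternative, the page would emit `<meta name="robots" content="noindex">` in its `<head>` when the argument is present; note that blocking a URL in robots.txt prevents crawling, while `noindex` requires the page to be crawled so the tag can be seen, so the two methods should not be combined on the same URLs.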
On-Page / Site Optimization