Is it possible to have the crawler exclude URLs with specific arguments?
-
Is it possible to exclude specific URLs that contain certain arguments from the crawl, like you can in Google Webmaster Tools?
-
You will need to add a pattern-matched block to your robots.txt:

User-agent: *
Disallow: /*variable=

I think the above would work (note that the `*` wildcard in a `Disallow` rule is supported by Googlebot, though it is not part of the original robots.txt standard).
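For example, to block any URL whose query string contains a parameter (here a hypothetical one named `sessionid`), the rules might look like this; it's worth testing in Google's robots.txt tester before relying on it:

```
User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

The two rules catch the parameter whether it appears first in the query string or after another parameter.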
The other approach would be to add a line to your page code itself that emits a 'noindex' robots meta tag whenever that argument is present in the request.
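As a rough sketch of that second approach (assuming a Python-rendered page and a hypothetical query-string parameter named `variable`), the template logic might look like:

```python
from urllib.parse import urlparse, parse_qs

def robots_meta_tag(url, blocked_param="variable"):
    """Return a noindex/nofollow robots meta tag when the blocked
    query-string parameter is present in the URL, else an empty string."""
    params = parse_qs(urlparse(url).query)
    if blocked_param in params:
        return '<meta name="robots" content="noindex, nofollow">'
    return ''

# Emits the tag only for URLs carrying the parameter:
print(robots_meta_tag("http://example.com/page?variable=1"))
# → <meta name="robots" content="noindex, nofollow">
print(robots_meta_tag("http://example.com/page"))
# → (empty string; the page stays indexable)
```

You would drop the returned string into the page's `<head>`; the function and parameter names here are placeholders, not part of any particular framework.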
This link should help you, by the way:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449