Exclude parameters from RogerBot
-
I'm getting a ton of duplicate content errors because we use some tracking parameters for navigation tracking. Is there a way to exclude these from RogerBot? If not, any suggestions on how to keep these errors from showing up in my crawl report?
-
Do your tracking parameters all begin with the same code (e.g. `?utm`)? You could throw something like this in your robots.txt file:
User-agent: RogerBot
Disallow: /*?utm

Replace `utm` with whatever prefix your tracking parameters use. Note that robots.txt rules aren't true regular expressions; the `*` wildcard (which matches any sequence of characters) is the main pattern tool crawlers support, so double-check the rule against your actual URLs before relying on it.
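Since wildcard rules are easy to get wrong, it can help to sanity-check one against your real URLs before deploying. Here's a minimal sketch of the `*`/`$` matching that major crawlers such as Googlebot apply to robots.txt rules (the `rule_matches` helper is my own, not part of any library):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Check whether a robots.txt Disallow rule matches a URL path+query.

    Supports the two wildcards major crawlers honor:
    * matches any sequence of characters, $ anchors the end of the URL.
    """
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape regex metacharacters in the rule, then turn the escaped
    # robots.txt * back into "match anything".
    pattern = "^" + re.escape(rule).replace(r"\*", ".*")
    if anchored:
        pattern += "$"
    return re.match(pattern, path) is not None

# The rule suggested above blocks any URL with a query starting in "utm"
print(rule_matches("/*?utm", "/page?utm_source=newsletter"))  # True
print(rule_matches("/*?utm", "/page"))                        # False
```

If your tracking parameter doesn't always appear first in the query string (e.g. `/page?id=3&utm_source=x`), a broader rule like `/*utm` would catch it, at the cost of also matching any path that happens to contain "utm".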
-
Hey Shawn,
I would use robots.txt to target "rogerbot" to not crawl the specific parameters you're concerned about.
Here are some links that might help you out:
Hope that answers your question! -- Andrew