Crawl Diagnostics returning duplicate content based on session id
-
I'm just starting to dig into crawl diagnostics and it is returning quite a few errors. Primarily, the crawl is flagging duplicate content (page titles, meta tags, etc.) because of a session ID in the URL.
I have set up a URL parameter in Google Webmaster Tools to help Google recognize this session ID. Is there any way to tell the SEOMoz spider the same thing? I'd like to get rid of these errors, since I've already handled them for the most part.
-
Hi Cody,
The best way is to block Rogerbot in your robots.txt from crawling specific pages of your site. In your case, that means keeping Rogerbot away from any pages with a session ID in the URL.
More information on Rogerbot can be found here. Be cautious and test it out, but the lines you would have to add to your robots.txt are probably:
User-agent: rogerbot
Disallow: /*sessionid
Hope this helps!
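To see why that rule works: crawlers that honor the common wildcard extension treat * in a Disallow line as "match any sequence of characters", and the rule is a prefix match against the URL path. Here is a minimal sketch of that matching logic in Python, using hypothetical URLs and assuming the session parameter is literally named sessionid (check your own URLs for the real name):

```python
import re

def rule_matches(disallow_pattern: str, url_path: str) -> bool:
    """Check whether a robots.txt Disallow pattern with * wildcards
    matches a URL path (prefix match, per the common wildcard
    extension; the $ end-anchor is omitted here for simplicity)."""
    # Escape regex metacharacters, then turn the escaped * back into .*
    regex = re.escape(disallow_pattern).replace(r"\*", ".*")
    # Prefix match: the pattern only needs to match from the start
    return re.match(regex, url_path) is not None

# Hypothetical URLs; 'sessionid' is an assumed parameter name
print(rule_matches("/*sessionid", "/products?sessionid=abc123"))  # True
print(rule_matches("/*sessionid", "/products"))                   # False
```

So any URL whose path-plus-query contains "sessionid" is blocked, while the clean versions of the same pages stay crawlable. Note that Python's built-in urllib.robotparser does not support the * wildcard, which is why the check is sketched by hand here.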
-
You the man! Thanks!