Duplicate content due to "Email a Friend" and "PhotoGallery"
-
Crawl Diagnostics gives me 1650 duplicate page content errors. 800+ are for the PhotoGallery page (the link where my large image is shown), and 800+ are for the Email a Friend form page.
Presumably SEOmoz's tool is getting there by following the links inside each product page, which, as you can see, differ only by the ProductCode of each product:
www.completemobilehomesupply.com/PhotoGallery.asp?ProductCode=anchor101
www.completemobilehomesupply.com/EmailaFriend.asp?ProductCode=Shutter0011

1 - Is it critical to fix? I am assuming yes, but if for some reason the answer is no, please share.
2 - Any idea how to fix it? The site uses Volusion, FYI, so it may be a limitation of the platform.

Thank you for your time.
Also if this topic has been previously covered, please link and I'll read there instead. -
If possible, disallow the pages that cause problems in your robots.txt.
-
I have similar problems with Volusion. Curious to read about a solution....
-
Yes, use the robots.txt file.
I had my Email a Friend links in the robots file. Then I saw a post here suggesting that robots.txt was bad practice and that a META noindex,nofollow was better, so I switched to that, and it was a disaster.
Google started fetching and indexing those pages. They were even ranking above our content.
And they were forcing our content into supplementary results.
I changed it back to robots, and that fixed the problem, but I think it took a week to undo the damage.
You can use a wildcard asterisk in the robots file, but I don't recommend using more than one wildcard per line.
Disallow: /emailafriend/
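For what it's worth, wildcard matching is an extension honored by major crawlers like Googlebot rather than part of the original robots.txt standard, so it's worth verifying the exact behavior against your target crawler's documentation. A single-wildcard rule that would block every ProductCode variant might look like:

User-agent: *
Disallow: /*?ProductCode=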
-
The rule above will disallow a directory called emailafriend. Use the following to disallow the two .asp files instead:
User-agent: *
Disallow: /EmailaFriend.asp
Disallow: /PhotoGallery.asp
Read more about Robots.txt here --> http://www.robotstxt.org/orig.html
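As a sanity check before deploying, you can test rules like these locally with Python's standard-library robots.txt parser. A minimal sketch, using the example URLs from the question; because disallow matching is by path prefix, every ProductCode variant of the two pages is blocked:

```python
from urllib import robotparser

# The proposed robots.txt rules for the two Volusion pages.
rules = """
User-agent: *
Disallow: /EmailaFriend.asp
Disallow: /PhotoGallery.asp
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

base = "http://www.completemobilehomesupply.com"

# Every ProductCode variant is blocked (path-prefix matching).
print(rp.can_fetch("*", base + "/EmailaFriend.asp?ProductCode=Shutter0011"))  # False
print(rp.can_fetch("*", base + "/PhotoGallery.asp?ProductCode=anchor101"))    # False

# Regular pages remain crawlable.
print(rp.can_fetch("*", base + "/"))  # True
```

This only checks how a standards-following parser reads the file; individual crawlers may differ in edge cases.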
-
Great, thanks guys, will disallow those two pages in robots.txt.