Duplicate Page Content on pages that appear to be different?
-
Hi Everyone! My name's Ross, and I work at CHARGED.fm. I worked with Luke, who has asked quite a few questions here, but he has since moved on to a new adventure, so I am trying to step into his role. I am very much a beginner in SEO and am trying to learn a lot of this on the fly, so bear with me if this is something simple.
In our latest Moz crawl, over 28K high-priority issues were detected, and they are all Duplicate Page Content issues. However, when looking at the issues laid out, the examples it gives for "Duplicate URLs" under each individual issue appear to be completely different pages. They have different page titles, different descriptions, etc. Here's an example.
For "LPGA Tickets", it is giving 19 Duplicate URLs. Here are a couple it lists when you expand those:
http://www.charged.fm/one-thousand-one-nights-tickets
http://www.charged.fm/trash-inferno-tickets
http://www.charged.fm/mylan-wtt-smash-hits-tickets
http://www.charged.fm/mickey-thomas-tickets
Internally, one reason we thought this might be happening is that even though the pages themselves are different, the structure is nearly identical, especially if there are no events listed or there isn't any content in the News/About sections. We are going to try to noindex pages that don't have events/new content on them as a temporary fix, but is there possibly a different underlying issue somewhere that would cause all of these duplicate page content issues to begin appearing?
Any help would be greatly appreciated!
-
Hey Ross!
Those pages are not "different" when it comes to search engines. Or maybe I should say, not different enough. The content is extremely thin and only switching out a word or two will absolutely make them come up as duplicate content. I would strongly suggest optimizing the page content and meta descriptions to be unique.
-
Hey Adam!
Thanks for the response, that kind of confirms what we were thinking. So we are planning to put a noindex, follow on those pages while we work on adjusting the content/descriptions. Is that a good fix while we work on the pages, or is there something else we should be doing?
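For anyone following along, a noindex, follow directive like the one described above is typically added as a robots meta tag in each page's head (this is a generic sketch, not CHARGED.fm's actual markup):

```html
<head>
  <!-- Ask search engines not to index this page,
       but still to follow (crawl) the links on it -->
  <meta name="robots" content="noindex, follow">
</head>
```

Once the content is fleshed out, removing the tag lets the page be indexed again on the next crawl.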
-
Well, if it were one of my clients' sites… I wouldn't do that. While I understand your logic with a noindex, I wouldn't want to create a situation where the pages could not be found at all in search engines. Although it will drop your duplicate content numbers here on Moz, it's only a temporary fix. A good question to explore is how long you will need to keep them noindexed versus how long it would take to fix the content issues.
-
Nothing will positively affect this issue more than updating the content and giving searchers solid, informative, unique content to read.
One way to do that might be to aggregate some reviews for these individual shows, give a short, unique bio of the performers, or rate the venues. 500-800 words of unique content will go a long way in this case.
Something else to work on would be the number of internal links back and forth. When links are all the robot sees, that becomes part of your duplicate content issue too. There isn't much you can do about that in this case, since most of the links come from the nav bars, so the way to counter it would be, again, adding great content.