Does duplicate content harm individual pages or the whole site?
-
Hi,
One section of my site is a selection of Art and Design books. I have about 200 individual posts, each with a book image and a description retrieved from Amazon (using their API).
For reasons not worth mentioning, I decided to use the Amazon description. I don't mind whether those pages rank well or not, but I need them as additional content for my visitors as they browse my site. The value lies in the selection of books.
My question is whether the duplicate content taken from Amazon harms only each book page or the whole site.
The rest of the site has unique content.
Thanks!
Enrique
-
In a blog post titled "Beating Google's Panda Update - 5 Deadly Content Sins", Cyrus says:
"Remember, Panda is a site-wide penalty, not a page penalty. So if a certain percentage of your pages fall below Panda’s quality algorithm, then the whole site suffers. Fix enough of these pages and you may recover."
-
A duplicate-content penalty applies to your website as a whole, not just to the offending pages.
That said, if each post is original and you only have a picture of the book with a sentence or two of its description, I do not think that would harm your site or trigger a filter.
Google usually goes after "substantive blocks of content within or across domains that either completely match other content or are appreciably similar".
-
Thanks!
That sounds bad indeed. I need those pages on my site, so should I block them from bots, add a nofollow, or something else?
Enrique
-
We had a site that republished government documents.
After the Panda update we decided that we wanted to keep some of those pages on the site for visitors but remove them from the search index. So we added this to the <head> of each page:
<meta name="robots" content="noindex, follow" />
There were other pages that we decided to delete outright. We removed those files from the server and added 301 redirects in an .htaccess file in the folder where they were located.
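For anyone unfamiliar with that second step, a minimal .htaccess sketch might look like the following. The file names and target URLs are hypothetical placeholders, not the actual setup described above, and this assumes an Apache server with mod_alias enabled.

```
# .htaccess in the folder containing the deleted pages (hypothetical paths)

# Permanently redirect each removed page to the most relevant live page
Redirect 301 /books/old-page-1.html /books/
Redirect 301 /books/old-page-2.html /books/related-page.html
```

Redirecting to a closely related page (rather than the homepage) preserves the most relevance for both visitors and search engines.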
-
Thanks Egol! I guess I will do that.
Enrique