SEOmoz duplicate content
-
Hi all,
There's a lot of controversy around duplicate content.
If you google "The New SEO Process (Quit Being Kanye)" you'll get tons of results with content copied from the original article.
How does Google manage this? Does it treat those copies as duplicate content or not?
Is this good for the original content/website?
Thank you
Cornel
-
I think if you are the first website where Google crawls and finds the content, then it will look to Google like you published the original content.
If you put the same articles on some article directories, it can happen that they get crawled faster, so you end up with content that has no value (duplicate content) because Google found it first on a different page.
Harry

-
Thank you Harry
-
We do get a lot of scrapers
Google doesn't manage it all that well, unfortunately. There are a few cues:
(1) As Harald said, Google does try to determine which version came first. This can be tough, because auto-scrapers can actually get indexed before the source in some cases. Having a solid crawl structure, XML sitemaps, pinging relevant sites, etc. can help.
(2) If the scraping sites link back to you (on purpose, or accidentally by including links you put in the content), it's a signal to Google that you're the source.
(3) If you're a high-authority site, you've generally got an edge. Most of our scrapers are pretty weak sites, so we're not in much danger. Unfortunately, I've seen cases where scrapers outranked the original content.
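On the sitemap point, for anyone who hasn't set one up: a minimal XML sitemap looks like the sketch below (the URL and date are placeholders, not real entries). Submitting it through Webmaster Tools helps Google discover new posts quickly, which improves the odds it crawls your copy before the scrapers' copies.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- placeholder URL; point this at your canonical article page -->
    <loc>http://www.example.com/new-seo-process-article</loc>
    <!-- date the page was last modified -->
    <lastmod>2011-03-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```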
Proper syndication, with back-links or other signals (like syndication-source or cross-domain canonical) can definitely be good for sites. Having a ton of scrapers on a site that's relatively new/weak can be very negative, unfortunately. Then again, most new sites don't have a ton of scrapers, so it does balance out a little.
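To make those two signals concrete, here's roughly what they look like in the head of the *syndicated* copy of a page (the URL is a placeholder, and this only works if the syndicating partner agrees to include the tags):

```html
<head>
  <!-- cross-domain canonical: tells Google the original lives on another domain -->
  <link rel="canonical" href="http://www.originalsite.com/original-article" />
  <!-- syndication-source: a hint (mainly for Google News) pointing at the source article -->
  <meta name="syndication-source" content="http://www.originalsite.com/original-article" />
</head>
```

If you can't get the partner to add either tag, a plain in-content link back to the original article is the fallback signal.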