Cross-domain rel canonical tags vs. rel canonical tags for internal webpages
-
Today I noticed that one of my colleagues was pointing rel canonical tags to a third-party domain on a few specific pages of a client's website. It was a standard rel canonical tag.
Up to this point I hadn't seen many webmasters point a rel canonical to a third-party domain. However, after doing some reading on the Google Webmaster Tools blog, I realized that cross-domain rel canonicals are indeed a viable strategy for avoiding duplicate content.
My question is this: should rel canonical tags be written the same way for internal duplicate content as for external duplicate content? Would a rel=author tag be more appropriate when addressing duplicate content on a third-party website?
Any feedback would be appreciated.
-
No, rel="canonical" works the same way internally and cross-domain. You are just telling Google which copy of that content to serve, wherever it lives. (And rel="author" is not an option here: Google no longer uses it to show authorship in results, and no longer tracks data from that markup. See http://searchengineland.com/goodbye-google-authorship-201975 and https://plus.google.com/u/0/+JohnMueller/posts/HZf3KDP1Dm8 )
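To illustrate, a cross-domain canonical is written exactly like an internal one; only the URL changes. A minimal sketch, using hypothetical domains:

```html
<!-- In the <head> of the duplicate page on site-b.example -->
<!-- Internal canonical: points to another URL on the same site -->
<link rel="canonical" href="https://site-b.example/preferred-version" />

<!-- Cross-domain canonical: points to the original copy on a different site -->
<link rel="canonical" href="https://site-a.example/original-article" />
```

A page should carry only one canonical link element; the two lines above are shown side by side only to make the point that the syntax is identical.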
-
I have used rel=canonical on a few pages of content that were published on two of my websites.
-
Rel=canonical pointing to a different domain essentially tells Google "here's the original copy of this article".
That's fine if you choose to reprint just the occasional bit of content from somewhere else.
It's also a fine strategy to use in a white-label system, where you might have the same content published across a number of sites, all branded differently.
But you want to use this sparingly. If you've got a site with 1,000 pages, and 750 of those pages are rel=canonicalled back to another domain, you're essentially telling Google that most of your website is republished material that somebody else wrote. That's not going to be a good signal to Google about the likely quality of the site in general.
If you're in a situation where you really do need to publish a lot of pages on multiple sites, and all of the sites do need to be found in search for SOME terms, then for those duplicated pages, I'd noindex them on the "copy" sites, so that in the example above, Google would only see and index 250 pages, all of which would be original content.
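A minimal sketch of the noindex approach described above, assuming the standard robots meta tag; the domain is hypothetical:

```html
<!-- In the <head> of each duplicated page on a "copy" site, e.g. site-b.example -->
<!-- noindex keeps the page out of the index; follow still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

With this in place on the 750 duplicated pages, only the 250 original pages on the copy site remain indexable.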
-
Excellent response. Thanks Michael.