Is this going to be seen by google as duplicate content
-
Hi All,
Thanks in advance for any help you can offer with this. I have been conducting a bit of analysis of our server access log to see what Googlebot is doing, where it is going, etc. Firstly, I am not an SEO, but I have an interest. What I am seeing a lot of is URLs with a query parameter that sets the currency displayed on the products, so that we can run AdWords campaigns in other countries. These show as follows: feedurl=AUD, feedurl=USD, feedurl=EUR, etc.
What I can see is that Googlebot is hitting a URL such as /some_product, then /some_product?feedurl=USD, then /some_product?feedurl=EUR, and then /some_product?feedurl=AUD, one after the other. This is the same product page, just with the price shown slightly differently on each. Would this count as a duplicate content issue? Should I disavow feedurl?
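To give a sense of the scale, here is roughly how I am grouping the log hits: counting Googlebot requests per query-stripped path, so each parameter variant rolls up to its base page (the paths below are illustrative, not our real log):

```python
from urllib.parse import urlsplit
from collections import Counter

# Hypothetical sample of request paths pulled from the access log
hits = [
    "/some_product",
    "/some_product?feedurl=USD",
    "/some_product?feedurl=EUR",
    "/some_product?feedurl=AUD",
    "/other_product?feedurl=USD",
]

# Count hits per query-stripped path, so all feedurl variants
# of a page are grouped under the one base URL
counts = Counter(urlsplit(h).path for h in hits)
for path, n in counts.most_common():
    print(path, n)
```

Run against the sample above, /some_product shows four crawled variants of the one page, which is exactly the pattern I am worried about.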
Any assistance that you can offer would be greatly appreciated.
Thanks,
Tim
-
Highly likely the answer is yes; the best way to check is to see whether the different URLs are actually indexed.
Personally I would be slow to disavow. There are plenty of options, including a rel=canonical tag (the best option) or changing your URL structure.
Interesting article here https://moz.com/learn/seo/duplicate-content
-
I would not disavow the link at all. If you want the /some_product page to be seen as one standalone product, simply implement a canonical link tag on the page, which in effect tells Google which page is the original source:
<link rel="canonical" href="/some_product" />
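In case it helps, the tag goes in the head of every variant of the page (including the feedurl ones), and an absolute URL is generally recommended over a relative one. Something like this, where example.com is just a placeholder for your domain:

```html
<head>
  <!-- Each feedurl variant points back to the one canonical page -->
  <link rel="canonical" href="https://www.example.com/some_product" />
</head>
```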
Hope that helps.
-
Thanks guys, I just realised I said disavow; I meant to say update robots.txt to disallow feedurl.
Would that be worth implementing prior to the rel=canonical on these pages? With our current dev timeline, the canonical tags would not be done until the new year at this point.
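For what it's worth, if we did go the robots.txt route in the interim, I assume the rule would be a wildcard pattern on the parameter, something like this (untested sketch; Googlebot supports * and $ patterns, but other crawlers may not):

```
User-agent: *
Disallow: /*?feedurl=
```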
-
I would go with rel=canonical straight away; robots.txt is a bit harsh for that sort of thing. You could end up delisting yourself.
