Strange strategy from a competitor. Is this "Google Friendly"?
-
Hi all,

We have a client in a very competitive industry (car insurance) who ranks first for almost every important and relevant keyword related to car insurance. But there is always room to do a better job. A few days ago I found this: http://logo.force.com/
The competitor website is: http://www.logo.pt/
The competitor name is: Logo
What I found strange is that both websites are identical, except that the first sits on a subdomain and has important links pointing to the original website (www.logo.pt).
So my question is: is this a "Google-friendly" (and fair) technique? Why does this competitor get such good results?
Thanks in advance!!
I look forward to hearing from you guys
-
Is there a chance you could have found their dev site? Look at the source and at robots.txt: is the page set to noindex, and does robots.txt disallow crawling?
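Both of those checks can be scripted with nothing but the Python standard library. This is a minimal sketch; the robots.txt content and the page `<head>` below are hypothetical stand-ins for what a dev/staging copy would typically contain, and a real check would fetch them from http://logo.force.com/ first:

```python
from urllib.robotparser import RobotFileParser
from html.parser import HTMLParser

# Hypothetical robots.txt for the suspected dev site; a real check would
# fetch http://logo.force.com/robots.txt and feed its lines in instead.
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
# If no agent may fetch the homepage, crawling is fully disallowed.
blocked = not parser.can_fetch("*", "http://logo.force.com/")
print("robots.txt disallows crawling:", blocked)

# Hypothetical page <head>; a dev site often carries a meta robots noindex.
page_head = '<head><meta name="robots" content="noindex, nofollow"></head>'

class NoindexFinder(HTMLParser):
    """Flags a <meta name="robots"> tag whose content includes 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.noindex = self.noindex or "noindex" in a.get("content", "").lower()

finder = NoindexFinder()
finder.feed(page_head)
print("meta robots noindex:", finder.noindex)
```

If either check comes back True for the subdomain, it is almost certainly a blocked dev copy rather than a ranking play.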
Edit: Actually, in looking it up, it is something that Salesforce is doing. I think it would be considered bad, since it's duplicated content. Another one that is hosted on the same server is
which is also
It looks like Salesforce is copying the websites for some reason.
-
Hi Lesley!
Thanks for your response.
The robots.txt file is exactly the same as on the "original" website.
I thought the strategy might be to gain some benefit from sitting under a strong Domain Authority (79).
And I still find it strange that it always ranks first for the most important keywords in this (very competitive) industry, but maybe it has something to do with the backlinks.
Thanks again!
-
Be very careful about making assumptions regarding competitors.
Just because you see one thing does not mean that one thing is helping or hurting a site. SEO is a vast, complex environment. If a site has enough very strong signals across many areas, one or even a few very poorly executed things may not hurt it. Or they may not hurt it "until Google catches up with it".
Duplicate content, regardless of method (within a single site, across multiple domains, or across a mix of domains and subdomains), is never a true best practice. Ever. It's artificial, and Google most definitely takes the position that if you are attempting to "artificially" (in their view) manipulate rankings in a non-best-practices manner, that's a violation of their guidelines, policies, or TOS.
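For what it's worth, when a duplicate copy has to exist, Google's documented mechanism for consolidating it is a cross-domain rel="canonical" in the duplicate's `<head>` pointing at the original URL. A minimal sketch (the `page_head` markup is a hypothetical example of what the logo.force.com pages would need) that extracts that tag with the standard-library HTML parser:

```python
from html.parser import HTMLParser

# Hypothetical <head> of the duplicate page on logo.force.com,
# pointing search engines back to the original on www.logo.pt.
page_head = """
<head>
  <link rel="canonical" href="http://www.logo.pt/" />
</head>
"""

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = self.canonical or a.get("href")

finder = CanonicalFinder()
finder.feed(page_head)
print("Canonical target:", finder.canonical)
```

If the Salesforce copies carried a tag like that, the duplication question would largely go away; without it, the subdomain is competing against the original with the same content.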