Duplicate Sub-domains Being Indexed
-
Hi all,
I have a site with a sub-domain that is meant to be a "support" section for clients. Some sort of FAQ pages, if you will. A lot of them are dynamic URLs, so the titles and most of the content end up duplicated.
Crawl Diagnostics found 52 duplicate content errors, 138 duplicate title errors, and a lot of others.
My question is: what would be the best practice to fix this issue? Should I noindex and nofollow all of its subdomains? Thanks in advance.
-
How are all of these subdomains being created? Do you have just one subdomain you are worried about or multiple subdomains?
If you can do a robots.txt disallow on certain duplicate folders/files that you don't want indexed, that might help.
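As a rough sketch, a robots.txt disallow on specific paths might look like the below. The folder names here are hypothetical placeholders, not taken from your site:

```
# robots.txt at the root of the (sub)domain
User-agent: *
# Hypothetical duplicate sections you don't want crawled:
Disallow: /print/
Disallow: /archive/
```

Keep in mind a robots.txt disallow only stops crawling; pages that are already indexed can remain in the index until they are removed or drop out.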
Let me know some more info first.
Scott.
-
The site has a few subdomains. For example (the examples below are not my site):
- https://support.medialayer.com/index.php?_m=knowledgebase&_a=view
- https://support.medialayer.com/index.php?_m=knowledgebase&_a=viewarticle&kbarticleid=9&nav=0
So basically, the pages got indexed with the same title and partially or wholly duplicated content.
So, simply making some changes to the robots.txt will fix this issue? Thanks.
-
If you don't want these indexed, first put a noindex tag on all the pages. Leave the follow alone, as the engines still need to crawl the pages to pick up the changed index status.
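For reference, the noindex directive goes in the head of each page like this (keeping follow, as described above):

```html
<meta name="robots" content="noindex, follow">
```

If you can't edit the page templates, the same directive can usually be sent as an HTTP header (`X-Robots-Tag: noindex`) from the server config instead.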
Add the domain to GWMT, then request removal of all the pages.
Once that has taken effect, add a robots.txt disallow for the entire sub-domain.
Your sub-domain will then be cleaned from the index and the duplication won't be an issue.
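For the final step, blocking the entire sub-domain is a two-line robots.txt served from the sub-domain's own root (using a placeholder host, not your actual site):

```
# https://support.example.com/robots.txt
User-agent: *
Disallow: /
```

Note the order matters: if you add this disallow before the pages are deindexed, the crawler can no longer see the noindex tags and the stale entries may linger in the index.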
-
Thanks a lot! I will definitely try that.