Can Robots.txt on Root Domain override a Robots.txt on a Sub Domain?
-
We currently have beta sites on sub-domains of our own domain. We have had issues where people forget to change the robots.txt, and these non-relevant beta sites get indexed by search engines (a nightmare).
We are going to move all of these beta sites to a new domain, and disallow everything in the robots.txt at the root of that domain.
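For reference, the blanket block we have in mind at the root of the new beta domain is just the two-line disallow-all file:

```text
User-agent: *
Disallow: /
```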
If we put fully configured robots.txt files on these sub-domains (ready to go live and open for crawling by the search engines), is there any way for the robots.txt in the root domain to override the robots.txt files on the sub-domains?
Apologies if this is unclear. I know we can handle this relatively easily by changing the robots.txt on each sub-domain when it goes live, but because people have forgotten to do this a few times, I want to reduce the chance of human error!
Cheers,
Dave.
-
Dave, I had exactly the same issue a month ago with beta sub-domains being indexed, and only caught it quickly enough to avoid real damage. Unfortunately, the short answer is no: your main root robots.txt cannot override the sub-domains'. Crawlers treat each hostname as a separate site and only look for robots.txt at the root of that hostname, so the file at http://beta.example.com/robots.txt is the only one that applies to beta.example.com, and a Disallow line in the main domain's file pointing at a sub-domain URL has no effect. Removing the sub-domain's robots.txt altogether wouldn't help either: with no robots.txt, crawlers assume everything is allowed. To take human error out of it, serve the same blanket disallow-all robots.txt for every beta host at the web-server level until launch, or put the beta sites behind HTTP authentication so crawlers can't reach them at all.
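A quick way to sanity-check the per-host behaviour is Python's built-in robots.txt parser: it only knows about the one file you hand it for a given host, and a disallow-all file blocks every path on that host. The beta hostname below is just a placeholder, not anything from Dave's setup:

```python
from urllib.robotparser import RobotFileParser

# A blanket "block everything" robots.txt, as served on a beta host.
BETA_ROBOTS = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
# Hypothetical beta hostname; this parser only ever applies to this one host.
parser.set_url("http://beta.example.com/robots.txt")
parser.parse(BETA_ROBOTS.splitlines())

# Every path on the beta host is blocked for every crawler.
print(parser.can_fetch("Googlebot", "http://beta.example.com/"))       # False
print(parser.can_fetch("*", "http://beta.example.com/any/page.html"))  # False
```

Nothing in the root domain's robots.txt ever enters this picture, which is exactly why the override you're hoping for doesn't exist.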
-
Hi Dave. A workflow checklist should really help with this as well. There are probably a few other items you'll catch by meeting with the others involved and getting everyone on the same page. Cheers!