How to block Google robots from a subdomain
-
I have a subdomain that lets me preview the changes I make to my site.
The live site URL is www.site.com; the working preview version is www.site.edit.com.
The contents of both are almost identical.
I want to block the preview version (www.site.edit.com) from Google's robots, so that they don't penalize me for duplicated content.
Is this the right way to do it?
User-Agent: *
Disallow: .edit.com/*
-
Hi,
Google's robots look for a robots.txt in each individual root, so you need the robots.txt in the root of the subdomain, not just in the root of the main domain. That's also why it's possible to use a complete disallow there, rather than something like .edit.com/* .
Example:
User-agent: *
Disallow: /

Hope this helps!
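To sanity-check that rule before relying on it, here is a minimal sketch using Python's standard urllib.robotparser; the preview hostname is the one from the question, and the rules string is the two-line robots.txt above.

```python
# Verify that a "User-agent: * / Disallow: /" robots.txt blocks all
# crawlers from every path, using the standard-library robots parser.
from urllib.robotparser import RobotFileParser

# The exact contents proposed for the preview site's robots.txt:
rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every user agent, including Googlebot, is blocked from every URL:
print(parser.can_fetch("Googlebot", "http://site.edit.com/"))          # False
print(parser.can_fetch("Googlebot", "http://site.edit.com/any/page"))  # False
```

Note that robots.txt rules are per-host, so this file only affects the preview subdomain it is served from; the live site's own robots.txt is untouched.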
-
Thanks a lot for your answer, Martijn!
So just to make sure I got it correctly - this robots.txt file's URL should be:
www.site.edit.com/robots.txt
?
-
Hi,
Probably without the www, so: site.edit.com/robots.txt - otherwise you would have a subdomain of a subdomain ;-). But the rest is perfect!
-
Thanks so much for your help!