Subdomains and robots.txt files
-
This is going to seem like a stupid question, and perhaps it is, but I am pulling out what little hair I have left.
I have a subdomain on which a website sits. The main domain has a robots.txt file that disallows all robots. It has been two weeks since I submitted the sitemap through Webmaster Tools, and Google still has not indexed the subdomain website. My question is: could the robots.txt file on the main domain be affecting the crawlability of the website on the subdomain? I wouldn't have thought so, but I can find nothing else.
Thanks in advance.
-
The way that Google finds robots.txt files is by taking the host portion of your URL and appending /robots.txt to it, so each host (including each subdomain) gets its own file. A good way to see whether a robots.txt file is affecting your subdomain is to go to subdomain.domain.com/robots.txt. If a file exists there, its rules apply to your subdomain; if it returns a 404, then the robots.txt on your main domain has no effect on the subdomain.
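The per-host behavior described above can be sketched with Python's standard-library robots.txt parser. The hostnames and rules below are hypothetical, chosen only to mirror the situation in the question: a main domain that disallows all robots, and a subdomain with no robots.txt of its own.

```python
# Sketch: robots.txt rules apply per host, so a Disallow on the main
# domain does not govern a subdomain. Hostnames/rules are hypothetical.
from urllib import robotparser

# Rules as they might appear at https://www.example.com/robots.txt
main_rules = """User-agent: *
Disallow: /
"""

# The subdomain is governed only by https://sub.example.com/robots.txt.
# If that file doesn't exist (a 404), crawlers treat all URLs as allowed.
sub_rules = ""  # no robots.txt on the subdomain

def is_allowed(rules_text, url, agent="Googlebot"):
    """Return True if the given agent may fetch url under these rules."""
    parser = robotparser.RobotFileParser()
    parser.parse(rules_text.splitlines())
    return parser.can_fetch(agent, url)

print(is_allowed(main_rules, "https://www.example.com/page"))  # False
print(is_allowed(sub_rules, "https://sub.example.com/page"))   # True
```

In other words, the main domain's blanket Disallow blocks crawling of the main domain only; the subdomain, with no robots.txt, remains fully crawlable.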
Getting indexed is a function of having unique content and PageRank, so make sure your subdomain has unique content and inbound links if you're having trouble getting it indexed. Submitting a sitemap is no guarantee that Google will index your site.
-
Thank you, Mr. Young. I believed this to be the case (that it wasn't the robots.txt file), but I could think of nothing else. The site has since been indexed.