Using a folder blocked by robots.txt to stage content before uploading it to an indexed folder - is that OK?
-
I have a folder called "testing" on my domain that is blocked in robots.txt. My web developers use this "testing" folder to stage new content before it goes to an indexed folder. So content is first uploaded to the "testing" folder (blocked by robots.txt) and later copied to an indexed folder, while permanently remaining in the "testing" folder as well. In fact, my entire website's content exists within "testing" - the same URL structure as the indexed pages, except every URL starts with the "testing/" folder.
Question: even though the "testing" folder will not be indexed by search engines, is there a chance that search engines notice the content appears in the "testing" folder first, so the indexed folder is not guaranteed to get credit for the content? In other words, despite the "testing" folder being blocked by robots.txt, might search engines still see the content there? Would it be better to password-protect the "testing" folder?
Thx
-
As long as the correct robots.txt rule has been applied to the /testing folder, you do not have anything to worry about.
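For reference, a rule like the following is the usual way to block a staging folder (the folder name /testing/ is taken from your description; adjust if your path differs):

```txt
# robots.txt at the domain root
User-agent: *
Disallow: /testing/
```

Note that this only asks compliant crawlers not to crawl the folder; it does not stop a human (or a non-compliant bot) from visiting those URLs directly.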
That said, since it is a staging environment, I would recommend securing it with a password as well, just to keep your non-production content safe.
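If the site runs on Apache, one common way to password-protect the folder is HTTP Basic Auth via an .htaccess file - a minimal sketch, assuming an Apache server and a hypothetical .htpasswd location (your setup may differ, e.g. Nginx uses auth_basic instead):

```apache
# .htaccess inside the /testing/ folder
AuthType Basic
AuthName "Staging area"
# Path to the password file - placeholder, keep it outside the web root
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The password file itself can be created with the htpasswd utility, e.g. `htpasswd -c /home/example/.htpasswd someuser`.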
-
Yep, just to jump in on the above: if a competitor is paying attention to your robots.txt file, they might notice a sweet stash of content under the /testing folder that they can nab. I have actually seen something similar happen in a competitive SEO niche, so it's something to bear in mind.
-
Good observation.