How to prevent directory from being accessed by search engines?
-
Pretty much as the question says, is there any way to stop search engines from crawling a directory? I am working on a Wordpress installation for my site but don't want it to be listed in search engines until it's ready to be shown to the world. I know the simplest way is to password-protect the directory but I had some issues when I tried to implement that so I'd like to see if there's a way to do it without passwords. Thanks in advance.
-
I'm assuming you want all search engines blocked from this directory. If so, add the following to your robots.txt file. This will block all compliant bots from accessing that folder/directory on your site:
User-agent: *
Disallow: /directoryname
-
I don't have a robots.txt file in my root. Do I just create a text file, put the above lines into it, and upload it to my root after changing the name?
-
Shouldn't you put a slash at the end of the directory in the robots file?
You can also create the robots.txt file through Google Webmaster Tools.
-
If you don't have a robots.txt file, you need to include some important stuff first.
First, do you have a sitemap.xml for your website? If not, it's very important and you can generate one at: http://www.xml-sitemaps.com/
Create a robots.txt file and include the following:
User-agent: *
Allow: /
Disallow: /directoryname
Sitemap: http://www.yousite.com/sitemap.xml
With this you will inform all robots where your sitemap is. You should read more about robots.txt in this great post: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
-
As it is, my site is just phpBB3 forums (www.bearsfansonline.com); would a sitemap really help that much?
-
It's not required to have the ending slash. At least, it works for us without it.
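For what it's worth, you can check how a parser treats the two forms yourself with Python's standard-library robots.txt parser. This is just a quick sketch (`/directoryname` and the URLs are placeholders, and the stdlib uses simple prefix matching, so a given crawler may behave slightly differently), but it shows the one practical difference the slash makes:

```python
from urllib.robotparser import RobotFileParser

def blocked(disallow_path, url):
    """Return True if `url` is blocked by a one-rule robots.txt."""
    rp = RobotFileParser()
    rp.parse(["User-agent: *", "Disallow: %s" % disallow_path])
    return not rp.can_fetch("*", url)

# Both forms block pages inside the directory:
print(blocked("/directoryname", "/directoryname/page.html"))   # True
print(blocked("/directoryname/", "/directoryname/page.html"))  # True

# But without the slash, the rule also matches sibling paths
# that merely share the prefix:
print(blocked("/directoryname", "/directoryname-old/page"))    # True
print(blocked("/directoryname/", "/directoryname-old/page"))   # False
```

So leaving the slash off "works" for the directory itself; the slash just makes the rule stricter about what else it matches.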
-
Xee,
It always helps, and it is very easy to implement. The Sitemap directive that shows the path to your sitemap is also very good to include.
-
The robots.txt file does not guarantee that your pages will not show up in search results! Your best bet after password protection is adding a noindex meta tag to your page headers.
Google has openly said that it obeys this tag (Matt Cutts).
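For anyone finding this later, the tag goes in the <head> of each page you want kept out of the index; something like:

```html
<!-- Ask compliant crawlers not to include this page in their index -->
<meta name="robots" content="noindex">
```

In a WordPress install that usually means adding it to the theme's header template so it lands on every page.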
-
You're absolutely right! I left that part out. Thanks
-
But don't forget to remove that Disallow from robots.txt when you go live - if you want those pages to be indexed (and also the meta robots noindex/nofollow tag).
Otherwise you might be pulling your hair out trying to figure out why none of your pages are getting indexed in the SERPs.