Access denied errors in webmaster tools
-
I noticed today that I have 2 "access denied" errors. I checked the help, which says:
Googlebot couldn’t access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for user-agent Googlebot.)
Therefore I think it may be because I have added a login page for users and Googlebot can't access it. I'm using WordPress and presume I need to amend robots.txt to remove the requirement for Google to log in, but how do I do that?
Unless I'm misunderstanding the problem altogether!
-
Personally I wouldn't remove the login requirement for Googlebot if you want to protect that content. Once you do, that content is effectively public, for two reasons:
1. Any nefarious user can instantly change their user agent to Googlebot's and view your content
2. It will appear in the public search results
Much better would be to consider what is and isn't protected by a login. If you want it indexed then it should be seen as public and not behind a login. If it should be private, then protect it.
If you want it to appear in search results to attract people to the site and encourage them to register, consider having cut-down versions that are public, with a strong call-to-action on those pages to get people to register.
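For pages that should stay private, one common complement to authentication is keeping crawlers away via robots.txt. A minimal sketch, assuming WordPress's default login path and a hypothetical `/members/` section for protected content:

```
User-agent: *
Disallow: /wp-login.php
Disallow: /members/
```

Note that robots.txt only discourages crawling; a URL that is linked from elsewhere can still appear in results. For content that must never be indexed, keep it behind authentication (or a noindex directive) rather than relying on robots.txt alone.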
-
Samuel
Mat's right - you need to separate the "access denied" pages into two buckets:
-
those which you want indexed
-
those which you don't want indexed.
For those you want indexed, look at them on a page-by-page basis to figure out why access is being denied. They might not all be because of password protection.
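When triaging page by page, the HTTP status code the server returned is usually the quickest clue. A small sketch of that sorting logic (the helper name and wording are my own, not anything Search Console provides):

```python
def diagnose(status_code: int) -> str:
    """Map an HTTP status code to a likely cause of an 'access denied' report."""
    if status_code in (401, 407):
        # Authentication challenge: the classic login-wall case from the question
        return "requires login - remove the login wall only if the page should be public"
    if status_code == 403:
        # Forbidden without an auth challenge: often server config, not a login page
        return "server forbids the request - check file permissions or IP/user-agent blocking"
    if status_code == 200:
        return "accessible now - the denial may have been intermittent or already fixed"
    return f"unexpected status {status_code} - inspect the server logs for this URL"
```

You can get the status code for a given URL with, for example, `curl -s -o /dev/null -w "%{http_code}" https://example.com/some-page` (example.com standing in for your own site), then feed it to a check like the one above.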
-Dan
-