Website blocked by Robots.txt in OSE
-
When viewing my client's website in OSE under the Top Pages tab, it shows that ALL pages are blocked by robots.txt. This is extremely concerning, because Google Webmaster Tools shows that all pages are indexed and OK: no crawl errors, no messages, nothing. I did a "site:website.com" search in Google and all of the website's pages were returned.
Any thoughts? Where is OSE picking up this signal? I cannot find a blocked robots tag in the code or anything.
-
Have you looked at your robots.txt file to see if you are blocking specific bots? Visit yoursite.com/robots.txt and check whether you have something like this:
User-agent: [example]
Disallow: /

But you may also have a separate rule specifying that Googlebot is allowed to crawl the site:

User-agent: googlebot
Allow: /
-
Thanks for responding - I did, and I noticed that we are blocking a bunch of other spiders, including the one that crawls for OSE. That explains why it cannot retrieve the data.
Again, thanks.
-
No worries - glad to help!
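For anyone who hits the same discrepancy, the per-bot rules discussed above can be checked programmatically with Python's standard-library `urllib.robotparser`. A minimal sketch follows; the robots.txt content is illustrative, and it assumes OSE's crawler identifies itself as rogerbot (Moz's crawler user-agent):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: blocks rogerbot (assumed here to be the crawler
# OSE/Moz uses) while explicitly allowing Googlebot.
robots_txt = """\
User-agent: rogerbot
Disallow: /

User-agent: googlebot
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# rogerbot is disallowed everywhere, so OSE would see every page as blocked...
print(parser.can_fetch("rogerbot", "https://example.com/any-page"))   # False
# ...while Googlebot is unaffected, matching what Webmaster Tools reports.
print(parser.can_fetch("googlebot", "https://example.com/any-page"))  # True
```

Swapping in your real robots.txt (fetched from yoursite.com/robots.txt) shows at a glance which crawlers are shut out, which is exactly the situation described in this thread: Googlebot allowed, other spiders blocked.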