I'm currently having trouble with what appears to be a cached version of robots.txt. Errors in my Google sitemap account are telling me that I'm denying Googlebot access to the entire site. I uploaded a clean, allow-everything robots.txt yesterday, but I'm still receiving the same error.
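For reference, the file I uploaded is essentially the standard allow-everything robots.txt, something like this (exact comments aside):

    User-agent: *
    Disallow:

so there should be nothing in it that blocks Googlebot from any path.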
I've tried "Fetch as Googlebot" on the index page and on other pages, but I still get the error. Here is the latest result:
    Denied by robots.txt
    11/9/11 10:56 AM
As I said, there has been nothing blocking in the robots.txt for 24 hours now.
HELP!