Impact of "restricted by robots" crawler error in WT
-
I have been wondering about this for a while with regard to several of my sites. I am getting a list of pages that I have blocked in the robots.txt file. If I restrict Google from crawling them, how can they consider their existence an error? In one case, I have even removed the URLs from the index.
Also, do you have any idea of the negative impact associated with these errors?
And how do you suggest I remedy the situation?
Thanks for the help
-
Hi Patrick,
That section is simply there to advise you of any URLs that Google thinks might be wrongly excluded by your robots.txt.
If the URLs are not wrongly excluded, don't worry about them showing up in WMT - it's there purely as an advisory.
Good luck!
-
Google is just showing you a warning that hey, these are excluded, make sure that you want them excluded. They're not passing judgement on whether or not they should be excluded. So, as long as they're excluded on purpose, no worries.
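If you want to double-check that your robots.txt blocks exactly the URLs WMT is reporting (and nothing else), here's a quick sketch using Python's standard `urllib.robotparser`. The rules and URLs below are made-up examples, not taken from your site; in practice you'd point the parser at your live robots.txt with `set_url()` and `read()`.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration. For a live site you would
# instead do: parser.set_url("https://example.com/robots.txt"); parser.read()
rules = """User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check the URLs WMT flagged (example URLs here) against the rules:
for url in ["https://example.com/private/page.html",
            "https://example.com/public/page.html"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "is", "allowed" if allowed else "blocked by robots.txt")
```

If every URL in the WMT report comes back "blocked" and that matches your intent, the warnings are working as designed and can be ignored.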