How to get rid of the message "Search Engine blocked by robots.txt"
-
During the Crawl Diagnostics of my website, I got the message "Search Engine blocked by robots.txt" under Most Common Errors & Warnings. Please let me know the procedure by which the SEOmoz PRO crawler can completely crawl my website. Awaiting your reply at the earliest.
Regards,
Prashakth Kamath
-
That error is straightforward: it indicates you have a robots.txt file which is blocking the crawler from accessing your site. You can read the robots.txt file by going to your site URL and appending /robots.txt, e.g. www.mysite.com/robots.txt.
The file lives in the root directory of your site's web server. Remove or edit the file to allow search engines to crawl your site. More info can be found at http://www.robotstxt.org/
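If you want to verify what a given robots.txt actually permits, Python's standard `urllib.robotparser` can check a user agent against a set of rules. A minimal sketch (the site URL and the rules below are hypothetical, chosen to match the allow-rogerbot / block-everyone-else setup discussed in this thread):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: allow rogerbot, block all other crawlers.
rules = """
User-agent: rogerbot
Allow: /

User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# rogerbot may fetch any page; other crawlers fall through to the
# wildcard group and are blocked.
print(parser.can_fetch("rogerbot", "http://www.mysite.com/page.html"))
print(parser.can_fetch("SomeOtherBot", "http://www.mysite.com/page.html"))
```

In a real check you would point `RobotFileParser.set_url()` at your live robots.txt and call `read()` instead of parsing an inline string.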
-
Thanks Ryan for your immediate reply.
Can you please provide the name and the code for the SEOmoz crawler that I need to enter in my robots.txt file so that SEOmoz crawls all the pages of my website? Apart from the SEOmoz crawler, I don't want any other crawler to crawl my website. Please help. Awaiting your reply.
Regards,
Prashakth Kamath
-
The SEOmoz user agent is named rogerbot. You can read more about the SEOmoz crawl process here: http://seomoz.zendesk.com/entries/20034082-lesson-5-crawl-diagnostics
To allow rogerbot while blocking all other crawlers, your robots.txt would look like this (keep in mind that well-behaved crawlers honor these rules voluntarily; robots.txt is not an enforcement mechanism):
<code>User-agent: rogerbot
Allow: /

User-agent: *
Disallow: /</code>
-
Hi Prashakth
That was a good reply from Ryan.
Check out http://www.seomoz.org/dp/rogerbot
rogerbot is the name of the SEOmoz crawler bot; the page above has all the info you require.
Regards
Simon
-
Thanks, Ryan, for the info. Will check and get back to you if there are any issues.
Regards,
Prashakth Kamath
-
Thanks, Simon, for the info. Will check and get back to you if there are any issues.
Regards,
Prashakth Kamath