Why is my site not being crawled?
-
The error in my dashboard:
**Moz was unable to crawl your site on Jul 23, 2020.** Our crawler was banned by a page on your site, either through your robots.txt, the X-Robots-Tag HTTP header, or the meta robots tag. Update these tags to allow your page and the rest of your site to be crawled. If this error is found on any page on your site, it prevents our crawler (and some search engines) from crawling the rest of your site. Typically errors like this should be investigated and fixed by the site webmaster.
I think I need to edit robots.txt.
How do I fix this?
-
You're blocking a crawler from accessing your page. Remove the block.
-
Thanks.
With the code below?
User-agent: rogerbot
Disallow:
-
You could just put the sitemap location in your robots.txt and call it a day. Blocking robots isn't really a good thing, to be honest, unless you really have to.
-
I added the code below to robots.txt:
User-agent: rogerbot
Disallow:
but it's not fixed, and the error is still in my dashboard.
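If you want to sanity-check what that robots.txt actually permits, Python's standard-library parser can show whether rogerbot is allowed to fetch a page. A quick sketch, with example.com standing in for your own site:

```python
# Quick local check of what a robots.txt allows Moz's crawler (rogerbot)
# to fetch, using Python's standard-library parser.
# example.com is a placeholder for your own site.
from urllib import robotparser

# The rule posted above: an empty Disallow allows everything for that agent.
allow_all = robotparser.RobotFileParser()
allow_all.parse("User-agent: rogerbot\nDisallow:".splitlines())
print(allow_all.can_fetch("rogerbot", "https://www.example.com/any-page"))  # True

# For contrast, "Disallow: /" blocks the whole site and would trigger this error.
block_all = robotparser.RobotFileParser()
block_all.parse("User-agent: rogerbot\nDisallow: /".splitlines())
print(block_all.can_fetch("rogerbot", "https://www.example.com/any-page"))  # False
```

Since an empty `Disallow:` under `User-agent: rogerbot` is already fully permissive, if the error persists after this change, the block likely comes from the X-Robots-Tag header or a meta robots tag instead.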
-
Well, do you have pages with noindex, nofollow? If so, please check and verify. Second: you can leave your robots.txt file pretty much empty except for a link to your sitemap, e.g.
Sitemap: https://www.somesite.xyz/sitemap.xml
There's really no need to block bots unless specific pages need to be blocked. Putting that in your robots.txt won't stop bad crawlers either.
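Since the Moz error also names the meta robots tag, one way to spot a blocking tag in a page's HTML is a simple scan like this. The sample HTML and the regex are illustrative only, not how Moz itself checks; in practice you'd fetch your own page's source:

```python
# Scan a page's HTML for a meta robots tag that would block crawlers.
# sample_html is a placeholder; substitute your own page's source.
import re

sample_html = """\
<html><head>
<meta name="robots" content="noindex, nofollow">
</head><body>...</body></html>
"""

match = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    sample_html,
    re.IGNORECASE,
)
directives = match.group(1) if match else ""
blocked = "noindex" in directives or "nofollow" in directives
print("meta robots directives:", directives or "(none)")
print("page blocks crawlers:", blocked)
```

The X-Robots-Tag variant lives in the HTTP response headers rather than the HTML, so it won't show up in a scan like this; check it separately (e.g. by inspecting the response headers in your browser's dev tools).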
-
Hey - Thanks for posting.
One thing to note here is that Moz doesn't read sitemaps. We do check robots.txt files for directives, but then crawl starting at the campaign seed URL, and work our way down in depth.
If your site is still not being crawled, I would suggest reaching out to help@moz.com with the campaign in question so we can take a look.
Thanks!