5xx Crawl Issues might not be issues at all. Help
-
Hi,
I ran a crawl test on our website and it came back with 900 potential 5xx errors. When I started opening these links one by one, I could see they were actually working. So I exported the full list of 900, went to https://httpstatus.io/, and pasted the links in batches of 100. They came back with status codes of 301 / 301 / 200, which I believe means they are okay.
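For re-checking a large export like this without pasting batches into a website, a small script can fetch each URL and report the final status after redirects are followed. This is a minimal sketch, assuming the exported URLs are loaded into a hypothetical `urls` list; it is not affiliated with httpstatus.io and checks from your own machine, so results may differ from what a crawler sees.

```python
# Sketch: report the final HTTP status for each URL after redirects.
# The list of URLs is a placeholder; substitute the exported 5xx URLs.
import urllib.request
import urllib.error

def final_status(url, timeout=10):
    """Return the HTTP status code seen after redirects are followed."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses raise HTTPError; the code is still useful.
        return e.code

def check_urls(urls):
    """Map each URL to its final status code."""
    return {url: final_status(url) for url in urls}
```

Note that `urllib.request.urlopen` follows redirects automatically, so a 301 → 200 chain reports 200, matching the "okay" result described above.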
From what I've read, my programmer may need to check whether we are blocking the Moz bot, or slow the Moz bot down. What I'm wondering is: if this isn't done, is the site actually returning these 5xx errors when Google crawls it, or is it just showing 900 errors because of the Moz bot while things are actually okay?
I know the simple answer is to have the programmer fix the Moz bot issue to know for sure, but getting programmers to do things takes a lot of time, so I'm trying to get a better idea here.
Thanks for your input.
-
Hi there!
Thanks so much for the great question! I'm so sorry to hear you're having this trouble with the 5xx errors. To resolve this we'd recommend adding a crawl delay for rogerbot to your robots.txt file. That crawl delay would look something like this:
User-agent: rogerbot
Crawl-delay: 10

This will tell our crawler to slow down when it's crawling. We do not recommend using a crawl delay of longer than 10, as this can keep the crawl from completing.
As far as whether this is impacting Google's ability to crawl, I'm really not able to help identify that. I'm so sorry about that! The best suggestion I can make would be to check the server logs for your site to see how it is responding to other crawlers you may be concerned about.
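Checking server logs along those lines can be scripted. The sketch below tallies response codes per crawler from combined-format (Apache/Nginx-style) access log lines; the user-agent substrings and the sample log format are assumptions, so adjust them to match your server's actual configuration.

```python
# Sketch: count status codes per crawler in combined-format access logs.
# Assumes the standard combined log format:
#   ... "METHOD path HTTP/x.x" STATUS SIZE "referer" "user-agent"
import re
from collections import Counter

LOG_RE = re.compile(
    r'"[A-Z]+ [^"]+" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def status_by_agent(lines, agents=("rogerbot", "Googlebot")):
    """Tally response status codes for each crawler of interest."""
    counts = {agent: Counter() for agent in agents}
    for line in lines:
        match = LOG_RE.search(line)
        if not match:
            continue  # skip lines that don't parse
        for agent in agents:
            if agent.lower() in match.group("agent").lower():
                counts[agent][match.group("status")] += 1
    return counts
```

If the tally shows 5xx codes only for rogerbot while Googlebot receives 200s, that would point to rate-limiting or blocking of the Moz crawler rather than a genuine server problem.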
If you have any other questions about rogerbot or our tools, please feel free to send an email on over to help@moz.com.
