Moz unable to crawl site
-
I've been using Moz for a long time and haven't had an issue with the Moz crawl until just recently. I now get the error "We were unable to access your site due to a page timeout on your robots.txt" whenever I try to crawl the site. The robots.txt file hasn't changed, and I can view it just fine at https://www.sonomacarpetcleaning.com/robots.txt. What could be causing this?
-
Hey Garrett!
While the robots.txt file does load fine in a browser, our crawler is receiving a timeout when trying to access it. I tried a curl request with our user-agent and hit the same timeout.
https://www.screencast.com/t/PDXVg4yf5Y
You may want to reach out to your hosting admin to see if they have changed a setting or added any server-level bot security. You may also request that they whitelist our user-agent, "rogerbot". If you need further assistance on this, you can always shoot us an email at help@moz.com.
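If you want to reproduce the check yourself, here is a minimal sketch of fetching robots.txt with a bot-style User-Agent and a hard timeout, using only Python's standard library. The User-Agent string "rogerbot" is a simplification; the crawler's full UA string may differ, and the timeout value is just an example.

```python
# Sketch: request robots.txt the way a crawler would, with a custom
# User-Agent and a timeout, so a UA-based block or server slowdown
# shows up as an error instead of a silent hang.
from urllib.request import Request, urlopen
from urllib.error import URLError

def fetch_robots(url, user_agent="rogerbot", timeout=10):
    """Return (status_code, body) or (None, error_message) on failure."""
    req = Request(url, headers={"User-Agent": user_agent})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status, resp.read().decode("utf-8", "replace")
    except URLError as err:
        # A timeout or connection reset here, when a browser loads the
        # same URL fine, points at server-side bot filtering.
        return None, str(err)

status, body = fetch_robots("https://www.sonomacarpetcleaning.com/robots.txt")
print(status, body[:80])
```

If the browser gets a 200 but this script times out, the server is most likely treating the bot user-agent differently from a normal browser.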
-
I'm having the same issue with https://www.nxt-chptr.com . I fixed a redirect issue, but that hasn't helped. I've tested rogerbot through other robots.txt testing tools and it came back as "Allowed," so I'm not sure what the issue is. Should I just delete the campaign and start a new one?
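The "Allowed" check described above can be reproduced locally with Python's standard `urllib.robotparser`. The robots.txt contents below are a hypothetical example, not the live file, and "rogerbot" stands in for the crawler's full user-agent string; note that this only tests the rules, not whether the server actually answers the bot (a timeout is a server issue, not a robots.txt issue).

```python
# Sketch: check whether "rogerbot" is allowed by a set of robots.txt
# rules, parsed from a string rather than fetched over the network.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for illustration only.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("rogerbot", "https://www.nxt-chptr.com/"))           # True
print(rp.can_fetch("rogerbot", "https://www.nxt-chptr.com/private/x"))  # False
```

If the rules say "Allowed" but the crawl still reports a timeout on robots.txt, the problem is the server's response to the bot (blocking, throttling, firewall rules), so recreating the campaign is unlikely to change anything.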