'Domain Does Not Respond To Web Requests'
-
Hi everyone,
This seems to be a fairly common query on the Q&A section, but I haven't been able to find a solution by reading through previous threads.
When I try to set up an SEOmoz campaign for spryz.co.nz, I get that ol' favourite error message:
'We have detected that the domain spryz.co.nz does not respond to web requests. Using this domain, we will be unable to crawl your site or present accurate SERP information.'
The problem isn't being caused by a robots.txt file, and the site has experienced 99% uptime since it was launched. Traffic stats show that visits are coming through to the site via the search engines, which suggests that not all crawlers are failing to access it.
I've tried to set up this campaign several times throughout the day, since I've read that sometimes Roger goes on the blink, but I've still not been successful.
Any suggestions as to why Roger might be unable to crawl my website would be great.
-
Hi there,
Thanks for reaching out! That is interesting indeed that you aren't able to create a campaign for that particular domain. I do notice that I had trouble at first, but I tried again immediately and was able to go past the no response screen and create a campaign. http://screencast.com/t/H2apiZb0tvB. I have checked with our team and we are not seeing an issue with the crawler so far, however in the past our crawler have had trouble with pages that is missing a Robots.txt (404 when I checked it last) http://screencast.com/t/oacVSiHM3Nyf. However as I have shown, our crawler will now crawl sites in the absence of a robot.txt.
There are a few reasons why the domain might not be responding to our web request:
1. The uptime was sporadic when we made the attempt.
2. Something at the server level is blocking our web service, which is provided by Amazon.
Unfortunately the problem is too sporadic to diagnose right away, so I will start a ticket for you on my end; let's see how my campaign comes back before we decide anything. Hope that helps!
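If you'd like to rule out reason 1 yourself, you can check whether the host answers any HTTP request at all (even an error status counts as a response). A minimal sketch using only Python's standard library; the throwaway local server is just a stand-in so the snippet runs anywhere, and in practice you'd pass the real domain, e.g. `responds("http://spryz.co.nz/")`:

```python
import http.server
import threading
import urllib.error
import urllib.request

def responds(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers any HTTP response, even 4xx/5xx."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, just with an error status
    except (urllib.error.URLError, OSError):
        return False  # no response at all: DNS failure, timeout, refused

# Stand-in server on localhost, port chosen by the OS:
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

print(responds(f"http://127.0.0.1:{server.server_address[1]}/"))  # True
server.shutdown()
```

If this returns True consistently from a few different networks while the campaign setup still fails, that points at something filtering our crawler specifically rather than general downtime.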
Best,
Peter
SEOmoz Help Team
-
Having the same issue - would prefer not to post the domain name though. It's my first time using SEOmoz, new PRO user, and I'm literally blocked from my first campaign. The site is responsive, hosted on a great server, there's an intact robots.txt that only disallows access to one particular directory, and the site is currently indexed in Google (I can see the pages in the index).
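For what it's worth, a robots.txt that only disallows a single directory shouldn't block a crawler from the rest of the site, and that's easy to verify locally. A minimal sketch with Python's standard-library robotparser (the sample rules below are illustrative, not my actual file; "rogerbot" is the user-agent the SEOmoz crawler identifies as):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: everything open except one directory.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The blanket Disallow only covers /private/; the rest stays crawlable.
print(parser.can_fetch("rogerbot", "http://example.com/"))           # True
print(parser.can_fetch("rogerbot", "http://example.com/private/x"))  # False
```

So if robotparser says the homepage is fetchable, the "does not respond" error is unlikely to be a robots.txt problem.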
Methinks there's something wrong on your end - this question has 120 views, which suggests it's more than a few isolated cases. Should I just go ahead and set up the campaign anyway, ignoring the error?