Robots.txt Disallow: / in Search Console
-
Two days ago I found out through search console that my website's Robots.txt has changed to
User-agent: *
Disallow: /

When I check the robots.txt on the website itself, it looks fine - it only shows as blocked in Search Console (in the robots.txt Tester).
When I try to Fetch as Google on the homepage, I see it's blocked. Any ideas why robots.txt would block my website? It was fine until the weekend.
- Before that, over the last 3 months, I saw blocked resources on the website and brought pages back with Fetch as Google.
Any ideas?
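To see why those reported rules block everything, you can feed them into Python's standard urllib.robotparser; this is just an illustration, with example.com standing in for the real domain:

```python
import urllib.robotparser

# The rules Search Console reported; "Disallow: /" blocks every
# path for every user agent.
reported_rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(reported_rules)

# example.com is a placeholder for the real domain.
print(rp.can_fetch("Googlebot", "https://example.com/"))  # False
```

Any URL checked against this file comes back blocked, which matches what the robots.txt Tester and Fetch as Google are showing.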
-
Hello Ran,
Just to clarify: in Search Console, when you go to Crawl -> robots.txt Tester and click "See live robots.txt" in the middle right, does it not show the correct file?
It could be that Google isn't recrawling the new robots.txt.