URL blocked by robots.txt errors - Search Console
-
Hi,
I'm having an ongoing issue submitting sitemaps to Google Search Console.
I've created the sitemaps using the Google XML plugin, as usual.
When I test a sitemap, I get the error "URL blocked by robots.txt", followed by this note: "Sitemap contains URLs which are blocked by robots.txt."
I ran the robots.txt tester and it passed. I checked the pages to confirm they are set to be indexed, and a clean, open robots.txt file has been uploaded and verified in Search Console.
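For context, by "clean open" I mean the standard allow-everything robots.txt, roughly like this (example.com and the sitemap URL are placeholders for my own):

    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml

The empty Disallow line permits crawling of everything, and the Sitemap line tells crawlers where the map lives.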
I've done this process many times before without a problem, but I'm at a loss as to why it's happening this time.
Really appreciate any assistance here.
Thanks.
-
Sometimes you get that error if a URL in the sitemap redirects through a URL that robots.txt blocks - could that be it?
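If you want to check for that, here's a rough sketch in Python 3 (assumes the requests package is installed; the sitemap URL is a placeholder) that follows the redirect chain of every URL in a sitemap and flags any hop that robots.txt disallows:

    import xml.etree.ElementTree as ET
    from urllib.parse import urlparse
    from urllib.robotparser import RobotFileParser
    import requests

    SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder

    # Load and parse the site's live robots.txt.
    parts = urlparse(SITEMAP_URL)
    robots = RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()

    # Pull every <loc> out of the sitemap (namespace-agnostic tag match).
    xml = requests.get(SITEMAP_URL, timeout=10).text
    locs = [el.text.strip() for el in ET.fromstring(xml).iter()
            if el.tag.endswith("loc") and el.text]

    for url in locs:
        resp = requests.get(url, timeout=10)  # follows redirects by default
        # resp.history holds every intermediate redirect response.
        chain = [r.url for r in resp.history] + [resp.url]
        for hop in chain:
            if not robots.can_fetch("Googlebot", hop):
                print(f"blocked hop in chain for {url}: {hop}")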
-
Hi,
Thanks for this. I actually figured it out: there was a broken link hidden on the page. I inspected the page to find it and removed it.
Indexing now. Cheers!
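For anyone else hunting a hidden broken link, a quick sketch like this (Python 3 again; assumes requests and beautifulsoup4 are installed, and the page URL is a placeholder) lists every link on a page and flags the ones that don't return 200:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    PAGE_URL = "https://example.com/some-page/"  # placeholder

    html = requests.get(PAGE_URL, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for a in soup.find_all("a", href=True):
        link = urljoin(PAGE_URL, a["href"])  # resolve relative hrefs
        if not link.startswith("http"):
            continue  # skip mailto:, tel:, javascript: links
        try:
            # HEAD is cheap; servers that reject it will also show up here.
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as exc:
            status = exc.__class__.__name__
        if status != 200:
            print(status, link)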