Robots.txt issue - site resubmission needed?
-
We recently had an issue when a load of new files were transferred from our dev server to the live site, which unfortunately included the dev site's robots.txt file containing a Disallow: / instruction. Bad!
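For anyone who hasn't been bitten by this before, the dev file was essentially a blanket block, along these lines (simplified):

    # Dev version (the one that went live by mistake): blocks all crawlers
    User-agent: *
    Disallow: /

    # Corrected live version: an empty Disallow allows everything
    User-agent: *
    Disallow: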
Luckily I spotted it quickly and the file has been replaced. The extent of the damage seems to be that some descriptions aren't displaying and we're getting a message about robots.txt in the SERPs for a few keywords. I've done a site: search and generally it seems to be OK for 99% of our pages.
Our positions don't seem to be affected right now, but obviously it's not great for CTR on the affected keywords.
My question is whether there is anything I can do to bring the updated robots.txt file to Google's attention? Or should we just wait and sit it out?
Thanks in advance for your answers!
-
Hi,
We had a vaguely similar thing happen where we took over the running of a site and the old dev had added a robots.txt disallowing the site from being indexed, which wasn't picked up for a while as dev work was ongoing. Basically, the site's rankings tanked.
We fixed the issue and resubmitted the site through Webmaster Tools. The site was reindexed within a week or so and rankings came back over the next 6 weeks.
-
I agree with Michael.
I have also seen a WordPress site that had blocked robots from the entire site for a week.
After allowing the robots back in, we saw the rankings improve within a few days.
Don't stress; just resubmit the sitemap, or create a new one with the affected URLs.
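If it helps, here's roughly how I'd knock together a quick sitemap for just those URLs and then ping Google with it. A rough Python sketch, not gospel; the example.com URLs are placeholders for your own:

    # Rough sketch: build a minimal sitemap for the affected URLs and ping Google.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    # Placeholder URLs - swap in the pages that were blocked
    affected = [
        "http://www.example.com/page-one",
        "http://www.example.com/page-two",
    ]

    # Build <urlset><url><loc>...</loc></url>...</urlset>
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in affected:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

    # After uploading sitemap.xml to your site root, ping Google's sitemap endpoint
    sitemap_url = "http://www.example.com/sitemap.xml"
    ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
    urllib.request.urlopen(ping)

You can also just submit the new sitemap directly in Webmaster Tools, which does the same job.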
Greg
-
Hi Greg - I've done this and resubmitted the sitemap, but I'm now getting severe health warnings saying that robots.txt is blocking important pages.
I've run our robots.txt file through the tester in Webmaster Tools and it says the pages are allowed; however, I'm obviously concerned about the warnings.
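For reference, I also sanity-checked it locally with Python's built-in robot parser, to see whether it agrees with the tester. Just a quick sketch, with example.com standing in for our actual domain:

    # Quick sketch: check locally whether the live robots.txt blocks a given page
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("http://www.example.com/robots.txt")
    rp.read()  # fetches and parses the live robots.txt

    # True means Googlebot is allowed to crawl the page
    print(rp.can_fetch("Googlebot", "http://www.example.com/important-page"))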
Have you experienced anything like this?
Thanks
Rory