Google is Still Blocking Pages Unblocked 1 Month Ago in Robots.txt
-
I manage a large site with over 200K indexed pages. We recently added a new vertical of about 20K pages. We initially blocked those pages with robots.txt while we were developing/testing, and we unblocked them 1 month ago. The pages are still not indexed at this point. Only 1 page shows up in the index, with an "omitted results" link; clicking that link reveals the remaining un-indexed pages. Looking for some suggestions. Thanks.
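One quick way to sanity-check the block/unblock is to run the old and new robots.txt rules through Python's stdlib `urllib.robotparser` and confirm Googlebot is now allowed. A minimal sketch, assuming a hypothetical `/new-vertical/` path and example rules (the actual file may differ):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: the old file blocked the new vertical, the new one does not.
old_rules = "User-agent: *\nDisallow: /new-vertical/"
new_rules = "User-agent: *\nDisallow:"

def allowed(robots_txt, url):
    """Return True if Googlebot may fetch `url` under the given robots.txt text."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", url)

print(allowed(old_rules, "https://example.com/new-vertical/page.html"))  # False
print(allowed(new_rules, "https://example.com/new-vertical/page.html"))  # True
```

Running this against the live file's contents verifies the rules themselves are correct, separate from the question of whether Google has re-fetched the file yet.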
-
Hi,
Fetch the main page(s) with "Fetch as Google" under the Crawl section in Webmaster Tools, then submit them to the index.
Are you sure there are no other elements blocking the indexing of the pages (like a robots meta tag or an X-Robots-Tag HTTP header)?
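Those two checks can be scripted: scan the page's HTML for a `<meta name="robots">` tag and the response headers for an `X-Robots-Tag` value containing `noindex`. A minimal sketch using only the stdlib (the sample HTML and header dict are illustrative, not from the site in question):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def is_noindexed(html, headers):
    """True if a meta robots tag or X-Robots-Tag header blocks indexing."""
    parser = RobotsMetaParser()
    parser.feed(html)
    header_value = headers.get("X-Robots-Tag", "").lower()
    return any("noindex" in d for d in parser.directives) or "noindex" in header_value

# Example: a leftover noindex tag from the development phase would look like this.
sample_html = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(sample_html, {}))                                # True
print(is_noindexed("<html></html>", {"X-Robots-Tag": "noindex"}))   # True
print(is_noindexed("<html></html>", {}))                            # False
```

Running this over a sample of the un-indexed URLs would rule out a stray directive left over from development.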
Also fetch the new robots.txt file - to be sure that Google notices that it has changed.
Did you add a sitemap for this new section - does it show any notifications/warnings in WMT?
rgds,
Dirk
-
Thanks. I fetched the main page, made a slight tweak to the robots.txt, and resubmitted last night; it looks like it is making a bit of progress. There is nothing else blocking the pages. We added a new sitemap when we first launched the pages, with no warnings. I also noticed yesterday in WMT that we had approx. 268 broken links pointing to 404 pages in this specific sub-folder. Google's discovery of those broken links and 404 pages seems to coincide with when it stopped crawling this section of the site. We took care of the broken links this morning. Thanks for the help!
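For tracking down issues like those 268 broken links, the crawl-errors report can be exported and filtered by sub-folder with a few lines of Python. A minimal sketch, assuming a hypothetical two-column CSV export (the real export's column names may differ):

```python
import csv
import io

# Hypothetical excerpt of a crawl-errors export; URLs and columns are examples.
report = """URL,Response Code
https://example.com/new-vertical/a.html,404
https://example.com/new-vertical/b.html,404
https://example.com/other/c.html,200
"""

def count_404s(csv_text, prefix):
    """Count rows reporting a 404 for URLs under the given path prefix."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return sum(1 for row in reader
               if row["Response Code"] == "404" and row["URL"].startswith(prefix))

print(count_404s(report, "https://example.com/new-vertical/"))  # 2
```

Re-running a filter like this after fixing the links makes it easy to confirm the 404 count in that section is actually dropping.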