Recovering from disaster
-
Short Question: What's the best way to get Google to re-index duplicate URLs?
Long Story:
We have a website established long ago (1997) with a proprietary CMS. We never paid much attention to SEO (other than creating a sitemap) until four months ago. After learning the basics, we started modifying the engine to present a better site to Google (proper HTTP status codes, consistent URLs to eliminate duplicates, of which we had something like 15,000, etc.).
Things went great for three and a half months and we reached the first page on Google for our main keyword (a very, very competitive one). Before the SEO work we were getting around 25,000 impressions and 3,000 clicks on Google. After our SEO efforts, we reached 70,000 daily impressions and more than 7,000 daily clicks.
On Aug 30th, 2014, one of our programmers committed a change to the live server by mistake. This small change altered every article's URL by appending either a trailing dash or the literal suffix '-test-keyword'.
Nobody noticed anything until two days later, as the site worked perfectly for human visitors. The result of this small code change is that within five days our site practically disappeared from Google's results pages, except when one searched for our site's name. Our rank dropped from positions 8 and 10 to 80 and 100 for our main keywords.
We reverted the change as soon as we noticed the problem, but during those two days Google's bots went on a binge, crawling five times the usual number of pages per day.
We've been trying to recover and nothing seems to be working so far. Google's bots aren't recrawling the repaired URLs to pick up the 301 redirects back to the original URLs, and we still have over 2,300 duplicates as reported by Webmaster Tools.
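For illustration, here is a minimal Python sketch of the normalization a 301 handler for these mangled URLs would perform (the paths are made up; our CMS isn't Python, this just shows the mapping the redirects implement):

```python
import re

def canonical_url(path: str) -> str:
    """Strip the accidental suffixes the bad commit appended to article
    URLs ('-test-keyword' or a lone trailing dash), so the server can
    301-redirect the mangled URL to the original one."""
    # Remove the literal '-test-keyword' suffix first, then a lone
    # trailing dash, so both mangled variants collapse to the original.
    path = re.sub(r"-test-keyword$", "", path)
    path = re.sub(r"-$", "", path)
    return path

print(canonical_url("/articles/my-story-test-keyword"))  # /articles/my-story
print(canonical_url("/articles/my-story-"))              # /articles/my-story
```

Untouched URLs pass through unchanged, so the same handler can run on every request.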
Our Google impressions and clicks dropped to way below what we had before we did any SEO, down to 5,000 impressions and 1,200 clicks (including searches for our domain name).
During the last 15 days (after we fixed the problem), our duplicate count went from a maximum of 3,200 down to 1,200, then back up to 2,300 without any changes on our end.
We've redone our sitemap and resubmitted it on day 3.
So, what do we do? Do we go through the URLs with the 'Fetch as Google' function (a bit tedious for 2,300 URLs), or do we wait for the bots to come around whenever they feel like it? If we use Fetch as Google, should we submit the bad URL, have Google fetch it, get the redirect, follow it, and then submit the followed URL to the index?
Or is there a better solution that I'm unaware of?
Second question: is this something to be expected when an incident like this happens, given that our inbound links rarely link to the actual articles?
-
Once you have all the 301 redirects set up, create a sitemap with all of the old URLs and submit that. Google will crawl them, see that they are now 301 redirects, and process the changes faster. Then delete the sitemap.
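For example, a throwaway script along these lines can generate that temporary sitemap from a list of the old (now-redirecting) URLs; the URLs below are placeholders:

```python
from xml.sax.saxutils import escape

def old_urls_sitemap(old_urls):
    """Build a one-off sitemap listing the OLD, now-redirecting URLs,
    to nudge Googlebot into recrawling them and seeing the 301s."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in old_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Placeholder URLs showing both mangled forms described above.
xml = old_urls_sitemap([
    "http://example.com/articles/my-story-",
    "http://example.com/articles/my-story-test-keyword",
])
print(xml)
```

In practice you'd feed it the full list of affected URLs (e.g. dumped from your CMS or server logs), write the result to a file, and submit that file in Webmaster Tools.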
You should also have a canonical tag on the article pages pointing at the new/current URL that should be indexed.
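For illustration, the tag the article template should emit in `<head>` looks like this (sketched here in Python with a made-up URL):

```python
from xml.sax.saxutils import quoteattr

def canonical_link_tag(url: str) -> str:
    """Render the <link rel="canonical"> tag for an article page,
    pointing at the one URL that should be indexed."""
    return f'<link rel="canonical" href={quoteattr(url)}>'

print(canonical_link_tag("http://example.com/articles/my-story"))
# <link rel="canonical" href="http://example.com/articles/my-story">
```

With this in place, even if a mangled variant gets crawled again, it declares the original URL as the one to index.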
"knowing that our inbound link rarely link to the actual articles" --> not sure I follow.
In general, your rankings should bounce back once Google picks up on all of the fixes.
-
["knowing that our inbound link rarely link to the actual articles" --> not sure I follow.]
I asked whether it's normal for all rankings to drop, even for unaffected pages, when pages with no inbound links have issues. For example, our top-ranked page for our main keyword didn't change in any way (not its URL, its description, nor its title), yet its rank tanked after this event.
I like the temporary sitemap idea. Thanks.
-
Well, after submitting multiple temporary sitemaps and having Google index them, our duplicate count dropped back to pre-event levels.
However, our rankings haven't improved at all. If anything, they dropped even further.
At this point it's really starting to look like this is a hit from Panda 4.1 and that our URL change was merely a coincidence. From the looks of it, Google is now treating our site as a low-quality site. Now that we know about such a thing, we definitely experienced a 'sinister surge' prior to disaster striking.
Since we've never engaged in any bad behavior on the site and we've always followed Google's best-practice advice, we're at a loss as to why we were hit that way. Our content is fresh and high quality (arguably the highest quality in our domain), and we have a very decent link profile according to MajesticSEO, so for now we really have no clue what's going on.
Attached is the site's impressions and clicks graph from Webmaster Tools.