Posts made by richardbaxter
-
RE: Reducing multi-page website to one page & SEO ramifications?
If it's temporary, you might want to consider a 302 redirect instead. For a time (at least), the redirected-to URL (the homepage) will appear for the original ranking queries. If you're not satisfying the query on the single page, you may see a big increase in bounce rate.
-
RE: Url blocked by robots.txt errors - Search Console
Sometimes you get that error if a URL in the sitemap redirects via a URL blocked in robots.txt - could that be it?
-
RE: SEO Advice for Angular JS
Hi Sara
Try to avoid hashbangs in the URL - it's much better to declare the escaped fragment in the meta header (`<meta name="fragment" content="!">`) and use the $location service in Angular to present the URLs in the browser as full, hashbang-less paths!
I've written about this topic extensively; the best suggestion I can make is to take a look at this blog post: http://builtvisible.com/javascript-framework-seo/
In particular, take note of the reasons not to use #! in the URL, and of a comment about your testing approach, too.
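To make that concrete, here's a minimal sketch (Angular 1.x; the module name is hypothetical) of enabling clean, hashbang-less URLs via $locationProvider, alongside the meta fragment directive that tells Google to request the HTML snapshot:

```javascript
// index.html would carry the snapshot trigger in its <head>:
//   <meta name="fragment" content="!">

// app.js - hypothetical Angular 1.x module. html5Mode(true) makes
// $location produce full, hashbang-less URLs; a <base href="/"> tag
// and server-side rewrites to index.html are also needed.
angular.module('myApp', [])
  .config(['$locationProvider', function ($locationProvider) {
    $locationProvider.html5Mode(true);
  }]);
```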
Hope this helps!
Richard
-
RE: How does Google index pagination variables in Ajax snapshots? We're seeing random huge variables.
100% of my experience in this situation comes from using Angular.js with PhantomJS rendering the snapshot, so I tend to use the meta fragment directive in the page header (because I don't use #!'s). With that said, I do think my debugging / testing experience might be useful, so I'll splurge it out here just in case.
For the record I don't think this is a simple case of Google fabricating URLs - I think it's worth making sure there's not something happening in-between. The real reason tends to come out in testing.
Have you looked in your log files for requests specifically containing your ?view_3_page= parameter? I'd get a sample of Googlebot requests and look for that parameter. Every time I've come across this problem so far, it's been about the framework not responding well to the parameter ordering in the URL when combined with the _escaped_fragment_= parameter.
Sometimes, when Google makes the request with _escaped_fragment_= in the request URI, you have to be certain you understand the behaviour that particular request URL will trigger.
So when the initial request yourdomain.com/#!home/?view_3_page=1 is made,
what does yourdomain.com/?_escaped_fragment_=home/?view_3_page=1 do?
Side note - it could be yourdomain.com/?_escaped_fragment_=home/&view_3_page=1, but as Carson said, without looking at how your site behaves in this situation it's difficult to know, so I'll put the different outcome options in here in case one of them is close.
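For what it's worth, the #! to _escaped_fragment_ mapping can be sketched as a small helper. This is a simplification of Google's AJAX crawling scheme, using encodeURIComponent as an approximation of the spec's escaping rules:

```javascript
// Sketch of the AJAX crawling scheme's URL mapping: a crawler that sees
// a #! URL re-requests it with the fragment moved into an
// _escaped_fragment_= query parameter (percent-encoded).
function toEscapedFragment(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang: nothing to map
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

// toEscapedFragment('http://yourdomain.com/#!home/?view_3_page=1')
// → 'http://yourdomain.com/?_escaped_fragment_=home%2F%3Fview_3_page%3D1'
```

Comparing the output of something like this against the raw request URIs in your logs makes it obvious whether the parameter ordering is what your framework expects.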
So, check your server logs and look at how the snapshot request URI is formed. Then check those pages out in a browser - making sure (obviously) you're responding with the right server header response and that the page code makes sense,
What tends to happen (if you've got this far) is that in unusual circumstances (e.g. a chain of parameters with the escaped fragment pre-fetch directive bolted on) you might be serving malformed versions of what you'd hoped would be your perfectly constructed HTML snapshot.
If that's the case, I would spend a lot of time evaluating what Google sees and, therefore, what it attempts to crawl. You might find that if you're serving something a bit strange, Google might be discovering URLs you didn't know you were capable of generating. That should give you enough scope to detect the problem and get a change request assigned to fix it.
If not, then I suppose Google really is making these URLs up - but honestly, I spend a lot of time trawling through log files and it's been a long time since I haven't been able to find an explanation from the actual code.
As a side note: I'd try to avoid hashbangs in the medium / long term. As soon as they're there, you're committed to a lifetime of supporting them. A much more elegant solution is to use pushState (or $location if you're using Angular) but (obviously) continue to serve the snapshot trigger via the meta fragment directive. I'm sure you're quite tired of being told to get rid of hashbangs, though.
Hope that helps?
Richard Baxter
SEOgadget.com
-
RE: Issue Using MozScape with SEOGadget's Link Excel Extension
Hi Zach
I believe you're all set - apologies that we were distributing an older version of the extension!
All fixed now!
Richard
-
RE: Is there a tool that can take all your backlinks and categorise them into categories?
Hi there,
If it's only a few thousand links, try this:
https://seogadget.co.uk/categorising-your-links/
You can create your own link categories too.
Richard
-
RE: Why isn't Ranking Updated?
Oops!
Hey gang - I think I'd misread the question too. Sorry about that, Ally.

It's a bit weird that the SEOmoz rank tracker hasn't updated, so do raise that ticket. Ally's comment on Advanced Web Ranking is spot on - we use it at the GadgetPlex a lot (especially for daily monitoring; we can use it to detect things like Panda updates). Love that tool!
All the best and I hope you get your tracking issue sorted out,
Richard Baxter
-
RE: Mobile Accessibility?
Just my personal opinion - I do think more can be done to serve a mobile-friendly stylesheet using mobile user agent detection on most websites. I really think that a "mobile site" on a separate domain is often unnecessary; with the right technical team, cross-platform compatibility can be achieved with a good front end / CSS developer.
Most of the time I wish this is what would happen - though in reality it's hard to find great examples in the wild. Do you guys agree?
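As a rough illustration of the user-agent approach (the helper names and stylesheet paths here are hypothetical, and UA sniffing is inherently approximate):

```javascript
// Crude server-side user-agent check, used to pick a stylesheet
// rather than maintaining a whole separate mobile domain.
function isMobileUA(ua) {
  return /Mobi|Android|iPhone|iPad|IEMobile|Opera Mini/i.test(ua || '');
}

// Hypothetical helper: choose which stylesheet to reference in the page.
function stylesheetFor(ua) {
  return isMobileUA(ua) ? '/css/mobile.css' : '/css/desktop.css';
}
```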
On a second point, I found today's strip on XKCD highly relevant to this discussion!
-
RE: Submitting Same Press Release to Multiple Sources
Hi There
Yeah, it's hardly the most valuable link building exercise, BUT Google's current algorithm kind of lets this technique pass. I don't know for how much longer, but right now a few distributed PR articles won't hurt, and with the right anchor text they can make a strong domain rank a little better for its target keywords.
Proceed with caution - this approach can be pretty weak, but it's worth a test.

This might come in handy: http://www.searchenginejournal.com/75-pr-article-submission-sites-to-generate-inbound-links/18052/
I wouldn't bother submitting to the same sites more than once or twice - go for inbound domain diversity with reasonable levels of anchor text variation. If in doubt, use "branded" anchors, such as your website URL / brand name.
All the best of luck,
Richard