Questions
-
How to handle potentially thousands (50k+) of 301 redirects following a major site replacement
Sorry to hear of your woes. Depending on the structure of the URLs, you may be able to cover most cases with simple pattern-matching rules in .htaccess; if so, a few dozen rules could handle many thousands of redirects. If there isn't an easily identifiable pattern to match, then a database-backed lookup will indeed be your best option. One of the web devs I used to work with (who was considerably smarter than me) faced a similar issue (with a 'mere' 10k+ redirects) and used some Ruby on Rails middleware as a redirector. This may have been the solution he used: https://github.com/vigetlabs/redirector I hope that helps, and that you're able to get this sorted without too much pain. Good luck!
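To illustrate the pattern-matching approach, here is a minimal .htaccess sketch. The paths and patterns are hypothetical, purely for illustration; you would adapt them to your own old and new URL structures:

```apache
# Enable Apache's rewrite engine (requires mod_rewrite)
RewriteEngine On

# Hypothetical example: old /products/123-widget-name pages
# moved to /shop/widget-name -- one rule covers every product
RewriteRule ^products/\d+-(.+)$ /shop/$1 [R=301,L]

# Hypothetical example: an entire retired section collapsed
# onto a single landing page
RedirectMatch 301 ^/news/archive/.* /news/
```

Each rule like these can stand in for thousands of individual one-to-one redirects, which is why it's worth hunting for patterns before falling back to a database lookup.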
Intermediate & Advanced SEO | | Hurf0 -
Use of 301 redirects
You can use POST instead of GET and rid yourself of the query string altogether if you want. If each URL produces unique content you want indexed, then you need to look at creating friendly URLs. If you are hosting on an IIS server, you can see how this is done here, halfway down the page: http://www.seomoz.org/ugc/microsoft-technologies-and-seo-web-development
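On IIS 7+ this kind of friendly URL is usually set up with the URL Rewrite module in web.config. A minimal sketch, with a made-up rule name and made-up paths for illustration only:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Hypothetical example: serve /widgets/blue-widget from
             product.aspx?name=blue-widget without exposing the query string -->
        <rule name="FriendlyProductUrls" stopProcessing="true">
          <match url="^widgets/([a-z0-9-]+)/?$" />
          <action type="Rewrite" url="product.aspx?name={R:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

The visitor and the search engine only ever see the clean URL; the query string exists purely server-side.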
White Hat / Black Hat SEO | | AlanMosley0 -
Meeting Google's needs 100% with dynamic pages
Google hates search results pages, and it is usually difficult to rank them. I've seen some bigger brands and sites get away with it, probably because of their history and authority. I don't suggest using them. Instead, I suggest creating permanent pages with clean URLs and using those. That means 301ing the ugly URL, with all of its search parameters, to a cleaner URL is an option. Or simply build the page statically at the clean URL in the first place, rather than relying on a 301 to reach it.
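If you go the 301 route, note that Apache's RewriteRule matches the path only, so matching on search parameters needs RewriteCond against the query string. A hedged sketch, with hypothetical parameter names and target page:

```apache
RewriteEngine On

# Hypothetical example: permanently redirect the parameterised search URL
# /search?cat=shoes&color=red to a clean, static landing page.
RewriteCond %{QUERY_STRING} (^|&)cat=shoes(&|$)
RewriteCond %{QUERY_STRING} (^|&)color=red(&|$)
RewriteRule ^search$ /shoes/red? [R=301,L]
```

The trailing `?` on the target strips the old query string, so the clean URL is what gets indexed.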
Intermediate & Advanced SEO | | MiguelSalcido0