Problems with 301 redirects for URLs containing strange, uncommon characters on IIS
-
This post is deleted! -
What version of IIS?
Did you install URL Rewrite 2? http://www.iis.net/download/URLRewrite
-
IIS 6.0.
The developer provided this as their solution; any ideas?
Here's a basic overview.

It uses the Ionic's ISAPI Rewrite Filter, available here: http://iirf.codeplex.com/

The code should already be installed on the server, but needs to be set up for each site that is using it.

In IIS, right-click the site, select Properties, and go to the ISAPI Filters tab. Click the Add… button, then Browse…, and select the .dll (the default location is C:\Program Files\Ionic Shade\IIRF 2.1\IIRF.dll). Enter a filter name ("ISAPI Filter" is fine) and click OK.

Add an IIRF.ini file to the root of the website. The text file will contain something similar to this:

```
# remove index pages from URLs
RedirectRule (.*)/default.htm$ $1/ [I,R=301]
RedirectRule (.*)/default.aspx$ $1/ [I,R=301]
RedirectRule (.*)/index.aspx$ $1/ [I,R=301]
RedirectRule (.*)/index.htm$ $1/ [I,R=301]
RedirectRule (.*)/index.html$ $1/ [I,R=301]
RedirectRule (.*)/[Old URL]$ $1/[New URL] [I,R=301]

# force proper www. prefix on all requests
RewriteCond %{HTTP_HOST} ^domain\.com$ [I]
RewriteRule /(.*) http://www.domain.com/$1 [R=301]

# RewriteLog c:\temp\iirf
```

The first block ensures that the default page will never have a page name at the end; the URL will just be the root or folder name ending with a slash. The same pattern can also be used to redirect an old URL to a new one. The second block forces www. to be added to the domain if it is missing from the URL. If you remove the "#" from the beginning of the last line, the filter will write debug information to the provided location.

And that, in essence, is how we set it up.
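As a quick sanity check of what the first block of rules actually matches, here is a rough Python equivalent (my own sketch, not part of the developer's setup; IIRF uses its native rule syntax, this just mimics the matching):

```python
import re

# Pattern mirroring the default/index RedirectRule lines:
# strip a default page name, keep everything before it plus a slash.
DEFAULT_PAGES = re.compile(
    r"(.*)/(?:default\.htm|default\.aspx|index\.aspx|index\.htm|index\.html)$",
    re.IGNORECASE,  # the [I] flag in IIRF rules means case-insensitive
)

def redirect_target(path):
    """Return the 301 target for a path, or None if no rule matches."""
    m = DEFAULT_PAGES.match(path)
    if m:
        return m.group(1) + "/"  # corresponds to $1/ in the rules
    return None

print(redirect_target("/products/Default.htm"))  # the [I] flag: case-insensitive match
print(redirect_target("/about.html"))            # no rule applies
```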
-
Sorry, most of my experience is on IIS 7 (for numerous reasons, including its many performance enhancements; as a sidebar, I'd recommend upgrading before developing and launching a new site on old technology) and with URL Rewrite, which has a robust regex engine. I did work with Helicon ISAPI Rewrite a long time ago, but my memory is vague.
Also, I'd need to know what a clean URL is supposed to look like (hopefully without spaces, as those cause all sorts of problems). The samples all fail, so I don't know how to fix them.
-
This post is deleted! -
You are right, these are &%*#(&d. I don't see any usable pattern, given that the new URLs must use IDs from the new system (first example). The second also needs new information (the category), which no regex will ever derive.
In that case, you have a few options:
- Create a search function on the 404 page that takes the relevant part of the URL and performs a search on the new catalogue. This won't be perfect, but at least visitors will get useful information to act on.
- If only a few hundred are important, write an individual rule for each page. This can be done in an XSL file to speed up the process, but it is a lot of work. After the top pages are finished, redirect the rest to the catalogue home.
- Seriously consider upgrading to URL Rewrite 2, which can perform redirects against a database of URLs. This is more tedious and expensive, but it allows you to add thousands of URL pairs, far more than any text file should ever hold for performance reasons. You can even write your own logic in the provider to do some of the search functions; then it behaves like option 1 but redirects immediately to the best match.
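If you go the URL Rewrite 2 route (which requires IIS 7+), its built-in static rewrite maps show the shape of URL-pair redirects; the database-backed version works the same way through a custom rewrite map provider. A web.config sketch (the map entries and URLs here are made up):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps>
        <rewriteMap name="OldUrls">
          <add key="/old/page1.asp" value="/catalogue/123/blue-widget" />
          <add key="/old/page2.asp" value="/catalogue/124/red-widget" />
        </rewriteMap>
      </rewriteMaps>
      <rules>
        <rule name="Redirect old URLs" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <!-- look the requested URI up in the map; matches only if found -->
            <add input="{OldUrls:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Redirect" url="{C:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

With a provider instead of the static map, the lookup logic is yours, which is where the "redirect immediately to the best match" behaviour comes from.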
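To make option 1 concrete, the core of a 404-page search is just "pull the last segment of the dead URL and fuzzy-match it against the new catalogue". A minimal sketch in Python (the catalogue entries and URL scheme here are entirely made up; the real lookup would hit the new system's database):

```python
import difflib
import re

# Hypothetical new-site catalogue: slug -> new URL. Invented for the example.
CATALOGUE = {
    "blue-widget": "/catalogue/123/blue-widget",
    "red-widget": "/catalogue/124/red-widget",
    "green-gadget": "/catalogue/200/green-gadget",
}

def suggest_from_404(old_path):
    """Extract the last path segment of a dead URL, strip its extension,
    and return the closest catalogue URLs for a 'did you mean?' page."""
    slug = old_path.rstrip("/").split("/")[-1].lower()
    slug = re.sub(r"\.[a-z]+$", "", slug)  # drop .asp/.aspx/.htm etc.
    matches = difflib.get_close_matches(slug, CATALOGUE, n=3, cutoff=0.5)
    return [CATALOGUE[m] for m in matches]

print(suggest_from_404("/oldshop/Blue-Widget.aspx"))
```

It won't be perfect, as said above, but even this naive matching turns a dead end into a shortlist the visitor can act on.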
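For option 2, the individual rules don't have to be typed by hand: any old-to-new list (exported from a spreadsheet as CSV, say) can be turned into IIRF RedirectRule lines with a short script. A sketch, assuming a two-column input of old path, new path (the paths shown are invented):

```python
import csv
import io
import re

def make_rules(csv_text):
    """Turn 'old_path,new_path' CSV rows into IIRF RedirectRule lines.
    Old paths are regex-escaped so characters like ? or + match literally."""
    rules = []
    for old, new in csv.reader(io.StringIO(csv_text)):
        rules.append(f"RedirectRule ^{re.escape(old.strip())}$ {new.strip()} [I,R=301]")
    return "\n".join(rules)

print(make_rules(
    "/old/page1.asp,/catalogue/123/blue-widget\n"
    "/old/page2.asp,/catalogue/124/red-widget"
))
```

The regex-escaping matters for exactly the strange-character URLs this thread is about; a literal `?` or `+` pasted into a rule unescaped would silently change what it matches.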
Good luck!