Best solution to get mass URLs out of the SEs' index
-
Hi,
I've got an issue where our web developers made a mistake on our website by messing up some URLs. Because our site works dynamically (i.e. the URLs generated on a page are relative to the current URL), the problem URLs linked out to more problem URLs, effectively replicating an entire website directory under the problem URLs. This has put tens of thousands of URLs into the SEs' indexes which shouldn't be there.
So, say for example, the problem URLs look like:
www.mysite.com/incorrect-directory/folder1/page1/
It seems I can correct this by doing the following:
1/. Use robots.txt to disallow access to /incorrect-directory/*
2/. 301 the URLs like this:
www.mysite.com/incorrect-directory/folder1/page1/
301 to:
www.mysite.com/correct-directory/folder1/page1/
3/. 301 all the URLs to the root of the correct directory, like this:
www.mysite.com/incorrect-directory/folder1/page1/
www.mysite.com/incorrect-directory/folder1/page2/
www.mysite.com/incorrect-directory/folder2/
301 to:
www.mysite.com/correct-directory/
Which method do you think is the best solution? I doubt there is any link juice benefit from 301'ing the URLs, as there shouldn't be any external links pointing to the wrong ones.
-
Option 2 is preferred.
You definitely do not want to use the robots.txt method. Blocking the directory in robots.txt stops search engines from crawling those URLs, so they would never see your redirects and the bad URLs could linger in the index. In general, avoid using robots.txt unless there are no other options.
Whenever your site's visitors have a link to an invalid URL, 301 them to the correct URL if you have the content they are seeking. It creates the best user experience and the best SEO results.
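If your server happens to be Apache, option 2 doesn't require listing every URL: a single pattern rule can carry the path across. A minimal sketch, assuming the directory names from your example and that you can edit .htaccess (adjust to your actual paths and server):

```apache
# Hypothetical .htaccess sketch (mod_alias): path-preserving 301
# from the bad directory to the correct one.
# The captured part of the path ($1) is appended to the target,
# so /incorrect-directory/folder1/page1/ redirects to
# /correct-directory/folder1/page1/ with one rule.
RedirectMatch 301 ^/incorrect-directory/(.*)$ /correct-directory/$1
```

Because the capture preserves the rest of the path, every replicated URL maps to its correct counterpart automatically, which is exactly what makes option 2 cleaner than redirecting everything to one root page.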
-
Cheers, Ryan.