Joomla to WordPress site migration - thousands of 404s
-
I recently migrated a site from Joomla to WordPress. Before the migration I exported the HTML pages from the Joomla site using Screaming Frog and set up 301 redirects for all of those pages.
However, Webmaster Tools is now telling me (a week after putting the redirects in place) that there are >7k 404s. Many of them aren't HTML pages, just index.php URLs, but I didn't think I would need to export those from my Screaming Frog crawl.
We have since done a blanket 301 redirect for anything with index.php in it, but Webmaster Tools is still reporting them as 404s.
So my question is: what should I have done with my Screaming Frog export to ensure I captured all the pages that needed redirecting, and what should I do now to fix the 404s that Webmaster Tools is picking up?
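For reference, a blanket rule like that (assuming an Apache server and a `.htaccess` file; the redirect target here is a placeholder) might look something like:

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Match any request whose path contains index.php and 301 it to the
  # homepage, with the trailing "?" dropping the old Joomla query string.
  # NOTE: redirecting everything to "/" is just a placeholder - mapping
  # each old URL to its real new equivalent is much better for SEO.
  RewriteRule index\.php /? [R=301,L]
</IfModule>
```

One caveat with a blanket rule: Google can treat large numbers of unrelated URLs all redirecting to the homepage as soft 404s, so it may keep reporting them.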
-
Hi,
Screaming Frog doesn't create redirects. You need to use mod_rewrite (or something similar) to do that.
Probably the best option for your problem is to build a mapping of old pages -> new pages, and redirect every request for an unknown page through that mapping.
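One way to sketch that mapping idea (a hypothetical Python helper, not anything Screaming Frog or WordPress provides) is to keep the old -> new pairs in CSV form and generate `Redirect 301` lines for the `.htaccess` file from it:

```python
import csv
import io

def redirect_lines(mapping_csv: str) -> list[str]:
    """Turn rows of 'old_path,new_url' into Apache Redirect 301 directives."""
    lines = []
    for old_path, new_url in csv.reader(io.StringIO(mapping_csv)):
        lines.append(f"Redirect 301 {old_path} {new_url}")
    return lines

# Example mapping with made-up URLs:
mapping = "/old-page.html,https://example.com/new-page/\n/about.html,https://example.com/about/"
for line in redirect_lines(mapping):
    print(line)
```

Generating the directives from one mapping file keeps the redirects easy to review and re-run as you discover more old URLs.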
-
I know it doesn't create redirects, but I wanted to use it to build the list of files/pages to create 301 redirects for, and then add those to the .htaccess file. However, was I wrong to export only the HTML files from Screaming Frog? There were only about 500 of those, but there are now 7,000 404s in Webmaster Tools, mostly PHP URLs.
-
Hi There
Generally those types of 404s won't be too harmful - they sound like they may have been somewhat artificial Joomla URLs rather than real pages.
What I would do now is get your URL list from Analytics or Webmaster Tools - that way you capture the URLs that actually got traffic or impressions in Google, and redirect those.
So run a landing pages report in Analytics and a top pages report in Webmaster Tools - maybe for the last 6 months. Create a text file of all the URLs and run them in list mode through Screaming Frog. Redirect any that return a 404.
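The merge step above can be sketched in a few lines of Python (a hypothetical helper; it assumes you've already exported the two reports as plain lists of URLs, and the file name is just an example):

```python
def merge_url_lists(*url_lists):
    """Merge several URL lists: strip whitespace, drop blanks and duplicates,
    and keep a stable order for the Screaming Frog list-mode file."""
    seen = set()
    merged = []
    for urls in url_lists:
        for url in urls:
            url = url.strip()
            if url and url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

# Example with made-up exports:
analytics = ["https://example.com/page-a", "https://example.com/page-b"]
wmt = ["https://example.com/page-b", "https://example.com/page-c"]
with open("urls-for-list-mode.txt", "w") as f:
    f.write("\n".join(merge_url_lists(analytics, wmt)))
```

The resulting text file can be loaded straight into Screaming Frog's list mode, and anything that comes back as a 404 goes on the redirect list.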
If you could go back in time, what I would have done with Screaming Frog is let it crawl everything: set it to "follow redirects", "ignore robots.txt", etc. I know Google is not supposed to crawl anything blocked in robots.txt, but basically you'd be letting Screaming Frog get to every URL, so you don't miss any.