Website Redesign & Ensuring Minimal Traffic/Rankings Loss
Having performed upwards of 80 migrations without any real traffic loss lasting more than a week, I can tell you it comes down to following the rules very thoroughly. When you get to the bottom of this answer, please use one of the crawlers mentioned, and run a complete search-and-replace across the entire site when necessary, just to make sure everything is in place. I don't know what type of website you're running, but if it is WordPress and you want some extra traffic, I would make sure the blog lives in a subfolder (example.com/blog). If it is WordPress, you can do this on a managed host platform like Pagely, Servebolt, or Kinsta for around $50 a month.

Redirect mapping process

If you are lucky enough to work on a migration that doesn't involve URL changes, you can skip this section. Otherwise, read on to find out why any legacy pages that won't be available on the same URL after the migration should be redirected.

The redirect mapping file is a spreadsheet that includes the following two columns:

Legacy site URL -> a page's URL on the old site.
New site URL -> a page's URL on the new site.

When mapping (redirecting) a page from the old site to the new one, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn't exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that pages redirected "en masse" to irrelevant pages will be treated as soft 404s and, because of this, won't pass any SEO value. If you can't find an equivalent page on the new site, try mapping the page to its parent category page.

Once the mapping is complete, the file needs to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part of the site migration cycle where things often go wrong.
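As a rough illustration of how the mapping spreadsheet can be built programmatically, here is a minimal Python sketch. It assumes crawl exports from the old and new sites as rows with hypothetical "url" and "h1" keys (the column names, function names, and the Apache "Redirect 301" output format are my own illustrative choices, not part of the guide): it pairs legacy and new URLs on a shared unique attribute such as the H1 heading, and leaves ambiguous pages for manual mapping.

```python
from collections import Counter

def build_redirect_map(legacy_rows, new_rows, key="h1"):
    """Match legacy URLs to new URLs on a shared attribute (e.g. H1 or SKU).

    Attribute values that appear on more than one page are skipped,
    because a non-unique key would produce incorrect mappings.
    """
    # Count how often each attribute value occurs on each site.
    legacy_counts = Counter(row[key] for row in legacy_rows)
    new_counts = Counter(row[key] for row in new_rows)
    new_by_key = {row[key]: row["url"] for row in new_rows}

    mapping, unmatched = [], []
    for row in legacy_rows:
        value = row[key]
        if legacy_counts[value] == 1 and new_counts.get(value, 0) == 1:
            mapping.append((row["url"], new_by_key[value]))
        else:
            # No unique match: map these by hand, ideally to the parent
            # category page, never blindly to the homepage.
            unmatched.append(row["url"])
    return mapping, unmatched

def to_redirect_rules(mapping):
    """Render the map as Apache 'Redirect 301' lines for the dev team."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in mapping]
```

For example, a legacy page whose H1 ("Running Shoes") appears exactly once on both sites gets mapped automatically, while a page with no unique counterpart ends up in the unmatched list for manual review. The same approach works with any unique identifier, such as product codes or SKUs.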
Increasing efficiencies during the redirect mapping process

Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. On small sites, the URL mapping could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible, and automation needs to be introduced. Relying on certain attributes shared between the legacy and new sites can be a massive time-saver. Such attributes may include page titles, H1 headings, or other unique page identifiers such as product codes, SKUs, etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.

Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping.

https://moz.com/blog/website-migration-guide

Appendix: Useful tools

Crawlers

Screaming Frog: The SEO Swiss Army knife, ideal for crawling small and medium-sized websites.
Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
DeepCrawl: Cloud-based crawler with the ability to crawl staging sites and make crawl comparisons. Allows for comparisons between different crawls and copes well with large websites.
Botify: Another powerful cloud-based crawler, supported by exceptional server log file analysis capabilities that can be very insightful for understanding how search engines crawl the site.
OnCrawl: Crawler and server log analyzer for enterprise SEO audits, with many handy features to identify crawl budget, content quality, and performance issues.

Handy Chrome add-ons

Web Developer: A collection of developer tools, including easy ways to enable/disable JavaScript, CSS, images, etc.
User-Agent Switcher: Switch between different user agents, including Googlebot, mobile, and other agents.
Ayima Redirect Path: A great header and redirect checker.
SEO Meta in 1 Click: An on-page meta attributes, headers, and links inspector.
Scraper: An easy way to scrape website data into a spreadsheet.

Site monitoring tools

Uptime Robot: Free website uptime monitoring.
Robotto: Free robots.txt monitoring tool.
Pingdom Tools: Monitors site uptime and page speed from real users (RUM service).
SEO Radar: Monitors all critical SEO elements and fires alerts when they change.
UltraDNS Tools: Checks for changes to DNS.

Site performance tools

New Relic: By far the most comprehensive site performance and measurement tool listed here. The price is very steep, however; it's my favorite tool, but that doesn't mean it's required.
PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks whether a page has applied common performance best practices and provides a score ranging from 0 to 100 points.
Lighthouse: Handy Chrome extension for performance, accessibility, and Progressive Web App audits. Can also be run from the command line or as a Node module.
WebPageTest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.
DareBoost: Very helpful and accurate as well for finding everything you need to know.

Structured data testing tools

Google's structured data testing tool
Google's structured data testing tool Chrome extension
Bing's markup validator
Yandex structured data testing tool
Google's rich results testing tool

Mobile testing tools

Google's mobile-friendly testing tool
Google's AMP testing tool
AMP validator tool

Backlink data sources

Ahrefs
Majestic SEO
Moz

I hope this helps,

Tom