Hi Martijn,
Thank you for responding. I think canonical tags are the best way forward; I'm looking forward to explaining to the web developer that we need several hundred tags implemented!
Many thanks
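For reference, a canonical tag is a single line in the `<head>` of each duplicate page pointing at the preferred URL, which is why several hundred may be needed (one per duplicate page). A minimal sketch, with hypothetical URLs:

```html
<!-- In the <head> of a duplicate job listing page -->
<!-- Both URLs below are hypothetical examples -->
<link rel="canonical" href="https://www.example.com/jobs/preferred-listing" />
```

Each duplicate page gets its own tag pointing at its own preferred version.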
Our client is a recruitment agency whose website used to contain a substantial amount of duplicate content, as many of the listed job descriptions were repeated and recycled. As a result, their rankings rarely progress beyond page 2 on Google. Although they have started using more unique content for each listing, old job listing pages still appear to be indexed, so our assumption is that Google is holding down the rankings because of the amount of duplicate content present (one tool reported a score of 43% duplicate content across the website).
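As a rough illustration of how a duplicate-content score like that 43% might be estimated (this is a sketch, not the method any particular tool actually uses), one common approach is Jaccard similarity over word "shingles" of two page texts:

```python
# Rough sketch: estimate how much two job listings overlap using
# Jaccard similarity over word 3-grams ("shingles").
# Illustrative only -- not the method any specific SEO tool uses.

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical listings that differ only in the location
listing_1 = "We are hiring a sales executive to join our growing team in Leeds"
listing_2 = "We are hiring a sales executive to join our growing team in York"
print(round(similarity(listing_1, listing_2), 2))  # → 0.83
```

Two listings that are recycled with only minor edits score very close to 1.0, which is exactly the pattern that duplicate-content checkers flag.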
Looking at other recruitment websites, it appears that they block the actual job listings via the robots.txt file.
Would blocking the job listing pages from being indexed, either via robots.txt or a noindex tag, reduce the negative impact of the duplicate content, and would it also remove any link juice flowing to those pages?
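For illustration, the two approaches look like this (the paths are hypothetical examples). Note that a robots.txt block stops crawling but does not by itself deindex already-indexed pages, and a noindex tag only works if the page remains crawlable, so the two shouldn't be combined on the same URLs:

```text
# robots.txt -- blocks crawling of the job listings section (hypothetical path)
User-agent: *
Disallow: /jobs/
```

```html
<!-- Alternatively, a noindex meta tag in the <head> of each listing page -->
<meta name="robots" content="noindex" />
```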
In addition, expired job listing URLs stay live which is likely to be increasing the overall duplicate content. Would it be worth removing these pages and setting up 404s, given that any links to these pages would be lost? If these pages are removed, is it possible to permanently deindex these URLs?
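If the site runs on Apache, expired listings can be made to return a status code directly from .htaccess; a 410 (Gone) is generally treated as a slightly stronger removal signal than a 404. A sketch, assuming Apache and a hypothetical URL:

```apache
# .htaccess -- return 410 Gone for an expired listing (hypothetical path)
Redirect gone /jobs/expired-listing-123
```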
Any help is greatly appreciated!
Hi Jared,
That's very helpful and your response is greatly appreciated! From your experience, what sort of time frame would you expect between implementing these signals, the pages being reindexed, and seeing an effect on the rankings?
Many thanks
Hi Paul,
DA 10 is reasonably low; however, if the links to the old site are from domains that are relevant to your new website then they can certainly help.
If the old website is no longer live then I'm not sure how the 301s will work, as the redirect would need to be served by that website. If the site isn't being hosted, and so isn't reachable on the internet, then I don't think the redirect will be accessible to the crawl bots that follow the links in the first place.
Linking domain > links to > old URL which serves the redirect > links to > new website.
If the old website isn't live then I think this middle step is missing.
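To make that middle step concrete: the redirect has to be served by whatever answers requests for the old domain. If the old domain is still pointed at an Apache host, the 301 could be configured along these lines (the domain is a hypothetical example):

```apache
# .htaccess on the old domain -- permanent redirect to the new site
Redirect 301 / https://www.new-site.example/
```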
The first things I would check are the domain authority of the old website (assuming that it is still live and there is content behind the domain?) and also the domain health, to make sure that it isn't blacklisted. Presumably the old website has a good DA and so you want to carry that link juice over? From our experience with 301s, this should work, on the condition that you continue to host and/or serve the old website.
A free tool for this is mxtoolbox.com which is pretty good but I am sure there are others available!
For one of my SEO campaigns, Google is using the website's home page as the landing page for the majority of search terms being tracked. The website splits its products by region, so we want specific region pages to rank for search terms related to each region, rather than the home page. We have optimised each regional page to a reasonably high standard and we have ensured that there is a good amount of internal linking and sign-posting to those region pages; however, Google is still using the home page. The only complication is that for the first few months these pages carried canonical tags pointing to the home page. These were removed around 3 months ago and we've checked that the region pages are indexed properly.
Is there anything we are missing?
Has anyone had any success in getting Google to change its landing pages?