Robots.txt in subfolders and hreflang issues
Hi there! It is difficult to know all the ins and outs without looking at the site, but the immediate issue is that your robots.txt setup is incorrect. There should be exactly one robots.txt file per subdomain, and it cannot live inside a sub-folder:

> A **robots.txt** file is a file at the root of your site that indicates those parts of your site you don't want accessed by search engine crawlers.

From Google's page here: https://support.google.com/webmasters/answer/6062608?hl=en

You shouldn't be blocking Google from either site, and attempting to do so may be why your hreflang directives are not being detected.

You should move to a single robots.txt file located at https://www.clientname.com/robots.txt, with a link to a single sitemap index file. That sitemap index file should then link to each of your two UK & US sitemap files. You should also ensure you have hreflang directives on every page.

Hopefully after these changes you will see things start to improve. Good luck!
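As a rough sketch of that setup (the sitemap filenames here are just placeholders, not your actual files), the root robots.txt would allow crawling and point to a single sitemap index:

```
# https://www.clientname.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://www.clientname.com/sitemap_index.xml
```

And the sitemap index file would then reference the two regional sitemaps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.clientname.com/sitemap-uk.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.clientname.com/sitemap-us.xml</loc>
  </sitemap>
</sitemapindex>
```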
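For the hreflang directives, each page should reference itself and its alternate-language/region versions reciprocally. A minimal example for the `<head>` of each page, assuming hypothetical /uk/ and /us/ URL paths:

```html
<link rel="alternate" hreflang="en-gb" href="https://www.clientname.com/uk/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.clientname.com/us/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.clientname.com/" />
```

Note that the annotations must be bidirectional: if the UK page points at the US page, the US page must point back, or Google will ignore the pair.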
Technical SEO Issues | Tom-Anthony