You could use the "Allow" directive in your robots.txt file for exactly this problem:
User-agent: *
Disallow: /news/events-calendar/usa
Allow: /news/events-calendar/usa/event-name
Googlebot applies the most specific (longest) matching rule, so the Allow wins for the event pages.
See the allow directive section of this page: https://en.wikipedia.org/wiki/Robots_exclusion_standard
The original poster's situation sounds like merging two sites rather than moving a site from one domain name to another. Matt Cutts specifically recommended against using the Change of Address tool for merging sites in this video: https://www.youtube.com/watch?v=s6pyAWJ5BRs
Having been in a similar situation before, I definitely vote for a 301 redirect ASAP, _provided that the old domain has a clean backlink profile._ The last thing you want to do is tank organic traffic by pointing the old site's spam links at the main site.
Make sure both sites are in Google Search Console and that neither has any manual action notices. Then go into "Links to your site" and download all links. Pull links from Moz's Open Site Explorer and Bing Webmaster Tools as well (and any other tools you have); you want as complete a picture as possible. Once you have as many links as you can find, start auditing them manually. After a while it becomes easier to spot spam links, as they tend to follow easily recognizable patterns.
If you find spam links, remove and disavow them before you redirect. Note that you'll have to disavow bad links to the old site **in the new site's disavow file** once you redirect, as that's where Google will look for it. If you don't find spam, redirect.
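For reference, a disavow file is just plain text with one `domain:` entry or URL per line and `#` comments; a minimal sketch (the domains below are illustrative, not from this thread):

```
# Spam links that pointed at the OLD domain,
# submitted on the NEW site's Search Console property after the redirect
domain:spammy-directory.example
https://link-farm.example/old-site-link.html
```

Domain-level entries catch every link from that site, so they're usually safer than listing individual URLs.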
I wouldn't let the site rot. That would be a potential security issue and a customer service nightmare.
That makes perfect sense! Thanks Gianluca (hope to see you at MozCon again this year, btw!).
Hi everybody,
So I'm setting up hreflang tags on an ecommerce site. The sites are in the USA and Canada. The Canadian site will have fewer products than the American site, meaning that there won't be as many pages in each category as there are on the American site. What is the correct way to handle hreflang tags on these extra category pages?
To put this another way, the American site may have a category with 3 pages of products, while the Canadian equivalent only has 2 pages of products. What happens to this extra American category page (example.com/widget-category/page-3) ?
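For context, on paginated pages that do have a counterpart, the annotations would look something like this (URLs follow the example pattern above and are illustrative):

```html
<!-- On example.com/widget-category/page-2, which has a Canadian counterpart -->
<link rel="alternate" hreflang="en-us" href="https://example.com/widget-category/page-2" />
<link rel="alternate" hreflang="en-ca" href="https://example.ca/widget-category/page-2" />

<!-- The question: example.com/widget-category/page-3 has no example.ca equivalent -->
```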
Does it get an hreflang tag linking to the first page of the equivalent Canadian category (example.ca/widget-category/)?
Does it not get any hreflang tags because it has no true Canadian counterpart?
Does it matter at all if it has a canonical tag pointing to the first page in the series anyway (example.com/widget-category/)?
Thanks,
Andrew B.
Hello,
You mentioned that none of the pages have noindex/nofollow tags, but have you also checked your robots.txt file to verify that they aren't blocked there?
-Andrew B.