Posts made by LauraSultan
-
RE: Duplicating content from manufacturer for client site and using canonical reference.
Adding a cross-domain canonical tag like this is fine, assuming you are doing it for customer service (and the manufacturer doesn't mind you copying content from their site). You won't see any SEO benefit from the content on those pages because they are unlikely to be indexed. On the other hand, it wouldn't hurt your site either.
-
RE: Robots.txt in page with 301 redirect
In that case, you'd need to add the robots meta tag at the page level, in the <head> of the page.
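For example, a page-level robots meta tag looks like this (a minimal sketch; which directives you use depends on what you want to block):

```html
<!-- placed in the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex">
<!-- or, to also ask crawlers not to follow the links on the page -->
<meta name="robots" content="noindex, nofollow">
```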
-
RE: Robots.txt in page with 301 redirect
So, the problem is that the robots.txt file can't be accessed because of the 301 redirect to the new domain?
Do you plan to keep the help files on the old domain, or will they be removed completely?
-
RE: Robots.txt in page with 301 redirect
Do you want to disallow the URLs that these pages are being redirected to? If not, there's no need to add anything to the robots.txt file.
If you do want to disallow the URLs that these pages are being redirected to, use relative URLs in the robots.txt file on the new domain (robots.txt rules only apply to the host they're served from). For example, let's say olddomain.com/old-help-page/ is being redirected to newdomain.com/new-help-page/. In that case, add the following to the new domain's robots.txt file.
Disallow: /new-help-page/
There's no need to disallow the specific URLs that are being redirected to something else. Are you trying to get them removed from Google's index or something? If so, Google will update their index eventually based on your 301 redirects.
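For the redirect side of this, a sketch of what the old domain's configuration might contain (assuming an Apache server; the paths here are hypothetical examples, not your actual URLs):

```apache
# hypothetical .htaccess on olddomain.com
Redirect 301 /old-help-page/ https://newdomain.com/new-help-page/
```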
-
RE: Best practice for Wordpress /page/2/
We usually noindex the archive subpages and sometimes the main blog page itself (if it isn't the home page). I'd rather see the individual blog post pages in search results than page 2 of the blog index. Depending on how your blog is set up, you run the risk of an internal duplicate content issue. The only way to guarantee that the search engines prioritize the single post pages is to noindex the blog index pages.
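As an illustration, an SEO plugin or theme tweak would output something like this on the paginated archive URLs (a sketch; "follow" lets crawlers still pass through to the individual posts even though the archive page stays out of the index):

```html
<!-- on paginated archive URLs such as /blog/page/2/ and deeper -->
<meta name="robots" content="noindex, follow">
```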
-
RE: Correct keywords Anchor text for links passing
A common post-Penguin misconception is that you should avoid using important keywords in anchor text at all. You should avoid using the same keyword-stuffed anchor text in a bunch of backlinks to your site, but it's still a best practice to include important keywords in anchor text. Just don't be spammy about it. Make it natural and vary the anchor text. It shouldn't be a problem on your own website as long as the anchor text is natural and helpful to the reader.
-
RE: Two Domains, Same Products/Content
Is there a particular reason that you don't want to 301-redirect the old domain to the new domain? If customers go to the old domain or even search the domain name, they'll be redirected to the new one automatically. Can you help us understand why this is not ideal?
-
RE: Home Page not Ranking on Local Community Sites
Any chance you can give us the URLs? Otherwise, it's difficult to troubleshoot. Does this only happen in Google, or do you see it in Bing as well?
Based on what you're saying, I would look first at a few possible culprits.
- Index Issues - Have you checked to confirm that the home page is indexed by the search engine? Search site:domain.com to check. If not, you may have a technical problem that prevents the page from being crawled or indexed.
- Internal Link Scheme - Is the other page getting more internal links than the home page, making it seem more important?
- Content - Does the other page have loads more relevant content than the home page?
-
RE: Nofollow "print" URLs?
Yes. I suppose if you have a massive site and you're seeing a problem with crawl budget, it wouldn't hurt to nofollow the print links. Otherwise, it isn't really necessary.
-
RE: Nofollow "print" URLs?
You are correct to noindex it, but I see no reason to nofollow.
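For reference, the two mechanisms look like this (hypothetical URLs; as noted above, the link-level nofollow is rarely worth bothering with):

```html
<!-- link-level nofollow on the print link, usually unnecessary -->
<a href="/article/print/" rel="nofollow">Print this page</a>

<!-- page-level noindex in the <head> of the print version, which is the part that matters -->
<meta name="robots" content="noindex">
```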
-
RE: Site traffic/sales have plummeted
This page is set to "noindex," so you've essentially told Google not to index it.
-
RE: Site traffic/sales have plummeted
Laurie, when you say that your most valuable pages are nowhere to be found, can you provide an example?
-
RE: Site traffic/sales have plummeted
It's really a shot in the dark without knowing the website URL, but there are probably multiple issues going on. If Google is only indexing half of your pages, it's probably a crawl issue or a meta robots issue. Check to make sure the pages aren't being blocked or noindexed. If that isn't it, you may have a crawl-efficiency problem.
-
RE: Site not getting indexed by googlebot.
Do you mean that it isn't being crawled by Googlebot? A Google search for site:footeschool.org reveals that it is, in fact, indexed.
-
RE: Interesting Cross Domain Canonical Quirk...
That's the way it should work. When you set up a cross domain canonical from a URL on domain 1 to a URL on domain 2, you are telling the search engine that you want the content on site 2 to be indexed rather than the same content on site 1. The page content on domain 1 is probably not in the index for search results anymore, but the canonical tag ties the content on the two domains together.
In your example, do the search results link to the content on domain 2? That's what I would expect.
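For reference, the cross-domain canonical on the domain 1 page looks like this (hypothetical URLs):

```html
<!-- in the <head> of the page on domain1.com -->
<link rel="canonical" href="https://domain2.com/the-same-content/">
```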
-
RE: How to approach SEO for a national umbrella site that has multiple chapters in different locations that are different URLS
It's really going to depend on the search query. Some search queries are seen as having local intent regardless of whether or not a geo-modifier (i.e. "Chicago") is included in the search. Since your client's site is for the national organization, or the main website for the brand, you'll likely outperform the localized websites for some queries based on brand strength, backlinks, domain authority, etc. On the other hand, if someone searches for "summer day camp," that searcher is likely looking for a local service provider.
Are you specifically searching for the query "dog safety" in Chicago for the example you provided? That seems like more of an informational query, so your site would need to be seen as a topical authority to perform well.
You say that your .net site uses a search function (zip code search?) to display the individual locations. Is the location info crawlable by the search engines, or is it hidden behind the search functionality? If you want the .net site to perform for local searches, you'll need to work on optimizing local landing pages that can be easily found and indexed by search engines.
-
RE: Any insight on optimizing a single URL for locations in different states?
Yes, I have seen that work as well. I'm not saying that you can't do it, but those are highly competitive keywords in large metropolitan areas. It will take longer to see results. Local landing pages will work to build authority for the entire domain in those locations. I have seen this happen many times with our clients. Both the optimized local page and the site's home page can end up ranking well for geo-targeted keywords.
-
RE: Any insight on optimizing a single URL for locations in different states?
You'll be fighting a steep uphill battle if you try to optimize one URL for all three. You should, of course, mention that you have offices in all three cities on your home page, but why not create local landing pages for each city?
I don't mean that you should create one page, copy it, and replace the city name. That would be bad.
Each city page should have unique content with a local focus. In addition to contact information and directions, there's probably plenty of ways to add unique content to each local page. Highlight key staff members for each location, add location photos (inside and out), add customer testimonials, etc.
More about location pages:
-
RE: Homepage not ranking for main keyword, all other pages ranking slightly for their own keyword phrases.
First of all, it's unrealistic to rank well for a highly competitive single keyword like "pashmina" within 3 weeks of launch with no backlinks - even within the first 100 results. Your other pages are doing better because "white pashminas" and "white cashmere pashminas" are not as competitive. Sites rarely jump to the top of the rankings right after launch. There may be a bump at first as Google indexes the site, but then it drops back down.
Secondly, you state that you haven't done any link building, and I assume you mean that you haven't done any spammy link building. However, you still need good backlinks to your site to be competitive. Below are a few resources on link building.
-
RE: What is considered duplicate content?
This type of duplicate content is common on ecommerce websites, and it isn't necessarily a big problem. However, given that there will be a higher percentage of duplicate content than unique content, you run the risk of some of your pages being omitted from search results for certain queries. If that is the case, searchers will see "In order to show you the most relevant results, we have omitted some entries very similar to the [number] already displayed. If you like, you can repeat the search with the omitted results included."
This isn't really a penalty. It's just Google being efficient with their algorithm. It shouldn't be a problem for highly targeted searches, but you may lose a little search visibility for more generic searches.
My advice is to get creative and find new ways to add more unique content to your product pages. Add testimonials, user-generated reviews, camper van adventure stories, etc.
You are right that canonical tags are wrong for this situation. Using an iframe doesn't make much sense either. Google has stated that they try to associate iframe content with the page it's embedded on anyway.
Further information: