Thanks Dmitrii,
So will that one line of code fix the issue for ALL products and ALL collections?
Yes, I've seen the same thing. I've got a client with a small site of only 6 pages. The sitemaps report shows all 6 have been indexed.
Then I look at the Index Status report and find 46 pages indexed. That's 46 pages indexed on a site with only 6 pages!
So this seems to confirm your comments, Logan.
The question I have is: how can I get a list of all the indexed pages?
Anyone familiar with Shopify will understand the problems of its directory structure. Every time you add a product to a 'collection' it essentially creates a duplicate URL. For example, a product that lives at /products/some-product may also appear at /collections/some-collection/products/some-product.
It's not uncommon to have up to six duplicates of each product.
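For what it's worth, my reading of Shopify's documentation (so treat this as an assumption on my part rather than gospel) is that these duplicates are meant to be consolidated by a single canonical tag in the head of theme.liquid, along the lines of:
{% comment %} Assumed example: canonical_url should point the collection-scoped duplicates back to the primary /products/ URL {% endcomment %}
<link rel="canonical" href="{{ canonical_url }}" />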
So my question is twofold:
Shopify instruction #1: Add the following to the theme.liquid file...
<title>
{{ page_title }}{% if current_tags %} – tagged "{{ current_tags | join: ', ' }}"{% endif %}{% if current_page != 1 %} – Page {{ current_page }}{% endif %}{% unless page_title contains shop.name %} – {{ shop.name }}{% endunless %}
</title>
{% if page_description %}
<meta name="description" content="{{ page_description | escape }}" />
{% endif %}
Shopify instruction #2: Add the following to each individual product page...
So, can anyone help clarify:
Regards,
Murray
If I'm about to submit a new sitemap for Google to crawl, is there any need to use the Fetch tool?
I've just taken over a new client who recently moved from A.N. Other platform to Shopify. I've found a reference to their old website's IP address and it appears not to be redirecting.
Can I simply use something like Traffic Control (Shopify app) to redirect to the new domain?
What do you say to a client who recently purchased an online business and says 'I don't really care if the phone number or address on a directory is old or incorrect'? I've tried to explain the value from an SEO point of view, but he's not really buying it. Anyone encountered this skepticism before and, if so, how did you handle it?
Hey Logan,
Thanks for the clarification on the hreflang tags - sounds good.
Re the .eu TLD, the client is keen to go this route just to enable Euro currency in the checkout. But I agree it is sub-optimal. If there is sufficient business case (ie. traffic), then I will suggest .fr, .de, etc. I believe the Langify app in Shopify works well. Otherwise, I might suggest .com in US$ for all countries other than Aus and UK.
Also wondering about your thoughts on domainuk.com versus domain.co.uk?
Furthermore, it appears that region targeting only works for specific countries ('eu' is not an option).
So it appears that the proposed solution isn't going to achieve the desired outcome after all:
Anyone out there got a solution?
Hi Logan,
OK, there's a possible gotcha here. All four sites (.com, .au, .uk and .eu) will be in English, so the content will be identical. The purpose of hreflang tags seems to be for multi-language versions of the same site, which is not the case here. The primary reason for the country-specific TLDs is just to allow customers to transact in their local currency, but also to be indexed in the local version of Google.
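That said, from a bit more reading it sounds like hreflang isn't only for different languages: it also accepts language-region codes (en-au, en-gb, etc.), so identical English content can still be annotated per country. A rough sketch of what I'm imagining in the head of each theme.liquid (the domains are just placeholders, and I'm assuming request.path returns the current page's path):
{% comment %} Illustrative only: same-language regional alternates plus a default {% endcomment %}
<link rel="alternate" hreflang="en-au" href="https://example.com.au{{ request.path }}" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk{{ request.path }}" />
<link rel="alternate" hreflang="en" href="https://example.com{{ request.path }}" />
<link rel="alternate" hreflang="x-default" href="https://example.com{{ request.path }}" />
If that's right, each domain would also keep a self-referencing canonical, so the identical content shouldn't be treated as duplication.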
Make sense?
That's very helpful - many thanks Logan
Here's the scenario:
Question: How do we avoid content duplication (ie. how will canonical tags work in this scenario)?
Thanks Sean,
It's a Shopify site, so I don't think the dev has access to server logs.
I think the alternative approach might work though. I'm not sure if they have a URL export, but if they do I'll suggest they use the list function.
Thx again.
Don't know. That's the problem. The dev doesn't actually know how many pages she might have missed. She's looking for a way to identify exactly how many there are and what they are.
I just had a developer friend call me in a panic, because they had gone live with a new site and found out (the hard way) that they had missed some pages on their 301 redirects. So the pages are appearing in Google but serving 404s. Ouch!
So their question was: other than running a report for 404 errors in something like Screaming Frog, is there a way to hunt down ONLY pages serving 404s, then export to CSV so they can be redirected?
Anyone got any tricks up their sleeve?
I've just started a site audit and I'm trying to determine the number of pages on a client site, and whether more pages are being indexed than actually exist. I've used four tools and got four very different answers...
Can anyone shed any light on why they differ so much? And where lies the truth?