Questions
Topical keywords for product pages and blogs
I wouldn't call these 3 pages 'duplicate content'. The way I see it, it's fine to have a topic cluster where several pages/blog posts cover the whole topic. The key is to help Google understand which is the most relevant page for the head term. To see which page Google currently considers most relevant for that term, do a {site:website.com "floor heating"} search; the first result is the page with the most relevance in Google's eyes. If it's not the product page, then I'd work on the following items:

- Make sure that all floor heating posts link back to the floor heating product page with 'floor heating' as the anchor text.
- Test different title tags on the product page, including a test where 'floor heating' is in the title tag twice. [*gasp! Did he just say that?] Yes, I did. And it doesn't hurt to test it. Just monitor the CTR as well in Google Search Console.
- Test adding a paragraph or two of content to the product page and, yes, do include the keyword 'floor heating'. It can be placed below everything else. Just monitor the conversion rate of the page, of course. If it doesn't hurt conversions but helps rankings, then it's a win. With all these tests, you can always revert to the pre-test state without hurting anything.
- Create more blog posts that answer more floor heating questions for this topic cluster, making sure to link back to the topic head product page accordingly.
- Get a few more backlinks to the product page with some keyword-rich anchor text. I'm not saying that all your links should be keyword stuffed, but I've found that since Penguin, SEOs have gone too far the other direction, where almost none of their links contain anchor text.

Good luck! Boyd
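If you want to check that first item at scale, you can audit the anchor text of internal links pointing at the product page with a small script. A minimal sketch using only Python's standard library; the URL and HTML below are made-up placeholders, not from the thread:

```python
from html.parser import HTMLParser

class AnchorAudit(HTMLParser):
    """Collect the anchor text of every link pointing at a target URL."""

    def __init__(self, target_href):
        super().__init__()
        self.target_href = target_href
        self.in_target_link = False
        self.anchor_texts = []

    def handle_starttag(self, tag, attrs):
        # Only track <a> tags whose href matches the product page
        if tag == "a" and dict(attrs).get("href") == self.target_href:
            self.in_target_link = True
            self.anchor_texts.append("")

    def handle_data(self, data):
        if self.in_target_link:
            self.anchor_texts[-1] += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_target_link = False

# A made-up blog post body with two links to the product page:
post_html = (
    '<p>Read about <a href="/floor-heating/">floor heating</a> '
    'or just <a href="/floor-heating/">click here</a>.</p>'
)
audit = AnchorAudit("/floor-heating/")
audit.feed(post_html)
print(audit.anchor_texts)  # -> ['floor heating', 'click here']
```

Run this over each post's HTML and you can quickly spot generic anchors like 'click here' that could be switched to the head term.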
Intermediate & Advanced SEO | Nozzle
How to handle images (lazy loading, compressing, caching...) to impact page load and thus SEO?
Image Format: I believe WebP offers the best performance, but I usually try to use PNG over JPG. Compressing Images: I think what might work best in your case is some sort of plugin like WP Smush. If you have a ton of images, I'd invest in a tool or plugin that dynamically compresses images as they are uploaded to your site. I like WP Smush because, along with compressing images, it also strips out the metadata associated with them. If you have a ton of images, it could be an attractive solution that you can scale. Outside of another plugin, you could try some sort of cloud-based solution to compress images before you upload them. I've tested an open-source image compression tool called Caesium in the past; it reduced some of my images by almost 40%. It performed better than the plugins I was using, but I'm not sure it would be a scalable solution for you. Out of curiosity, how bad are your load times? Are you currently running into site speed problems, or are you trying to make incremental improvements?
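On the lazy-loading part of the original question: modern browsers support native lazy loading via a single attribute, no plugin required. A minimal sketch (file names and dimensions are placeholders):

```html
<!-- Defer offscreen images until the user scrolls near them -->
<img src="/images/product-photo.webp" alt="Product photo" loading="lazy" width="800" height="600">

<!-- Serve WebP with a PNG fallback for older browsers -->
<picture>
  <source srcset="/images/hero.webp" type="image/webp">
  <img src="/images/hero.png" alt="Hero image" loading="lazy" width="1200" height="400">
</picture>
```

Setting explicit width/height also helps the browser reserve space and avoid layout shifts while the image loads.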
Intermediate & Advanced SEO | JordanLowry
Robots.txt blocked internal resources Wordpress
Thanks for the answer! Last question: is /wp-admin/admin-ajax.php an important part that has to be crawlable? I found this explanation: https://wordpress.stackexchange.com/questions/190993/why-use-admin-ajax-php-and-how-does-it-work/191073#191073 However, on this specific website there is no HTML at all when I check that file's source, only a single line with a 0 on it.
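For reference, a commonly used pattern (and what a default WordPress virtual robots.txt outputs) keeps /wp-admin/ blocked while leaving admin-ajax.php crawlable, since front-end features can depend on it:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

The bare "0" you see is normal: admin-ajax.php returns 0 when it's requested without a valid AJAX action, so an empty-looking response doesn't mean the endpoint is unused.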
Intermediate & Advanced SEO | Mat_C
Homepage with and without language subfolder
That is the "normal" way to do it. Oftentimes pages detect the region or browser language and redirect, but that's not the question here. Yes, you may need to take care of some links you have control over (Yelp, LinkedIn, socials, whatever it is). But I bet you link to all languages: your homepage, for example, links to the other language versions of the homepage. So the "juice" you thought was going to one language would be sent to the other languages in smaller portions anyway. That's how it should be, so no problemo..
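To make the relationship between those language versions explicit to Google, the usual tool is hreflang annotations in the head of each version. A sketch assuming German and French folders plus a default fallback (URLs are placeholders):

```html
<link rel="alternate" hreflang="de" href="https://www.example.com/de/">
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```

Each language version should carry the full set of annotations, including one pointing at itself, so the references are reciprocal.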
Intermediate & Advanced SEO | paints-n-design
Block session id URLs with robots.txt
Hi Martijn, Thanks for the answer. Regarding the forward slash at the beginning: is it necessary to use it? In the robots.txt from Zalando, for example, you can see that they don't use it for a lot of filters.
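For reference, the conventional form does start each Disallow rule with a forward slash, since the pattern is matched against the URL path from the root. A session-ID block might look like this (the parameter name is a placeholder):

```
User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

Note that the `*` wildcard is an extension supported by Google and Bing rather than part of the original robots.txt standard, so always verify the rules with Search Console's robots.txt tester.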
Intermediate & Advanced SEO | Mat_C
How do internal search results get indexed by Google?
Firstly (and I think you understand this, but for the benefit of others who find this page later): any user landing on the actual page will see its full content; robots.txt has no effect on their experience. What I think you're asking is: if Google has previously indexed a page properly, by crawling it and discovering its content, and you then block it in robots.txt, what will it look like in the SERPs? My expectation is that:

- It will appear in the SERPs as it used to, with meta information / title etc., at least until Google would have recrawled it anyway, and possibly a bit longer since Google can no longer recrawl it after the robots.txt is updated.
- Eventually, it will either drop out of the index, or it may remain but with the "no information" message that shows up when a page is blocked in robots.txt from the outset yet indexed anyway.
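Before deploying rules like this, you can sanity-check exactly which URLs they block with Python's standard library. A small sketch; the paths are invented for illustration:

```python
from urllib import robotparser

# Parse the rules you are about to deploy, as a list of lines:
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /internal-search/",
])

# A blocked internal-search URL vs. a normal product URL:
print(rp.can_fetch("*", "https://example.com/internal-search/?q=shoes"))  # -> False
print(rp.can_fetch("*", "https://example.com/products/shoes"))            # -> True
```

This only tells you what compliant crawlers may fetch; as noted above, it says nothing about whether an already-indexed URL stays in the index.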
Intermediate & Advanced SEO | willcritchlow
Same subcategory in different main categories
I would have no issue with using rel canonical links in this kind of situation where you cannot control the underlying CMS to the extent that you would need to entirely avoid the duplicate URLs. The only real risk in my opinion is in the canonicalisation not being respected, but if these are essentially exact duplicate pages, I think the risk of that is low (and even if that were the case, the impact would be relatively low too). Good luck!
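For completeness, the canonical annotation itself is a single link element in the head of each duplicate URL, pointing at the version you want indexed (URLs are placeholders):

```html
<!-- On /sale/red-shirts/ and any other duplicate paths for this subcategory -->
<link rel="canonical" href="https://www.example.com/shirts/red-shirts/">
```

Every duplicate should point at the same target, and the target page should canonical to itself, so the signal stays consistent.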
Intermediate & Advanced SEO | willcritchlow
Important category pages that can and should be found in SERP but can not be reached by navigating on the webshop itself
In terms of converting traffic into buyers, it is most likely not a problem that visitors can't click through to those 5 main category pages from the homepage. In terms of ranking those 5 pages, though, it would be better to link to them from the navigation so they receive a link from every page of your site instead of just from pages within the subcategories under each main category. These 5 pages are potentially missing out on thousands of internal links (depending on how big your site is) because of how it is set up now. Also, each of these pages is missing out on a link from the homepage, which is your page with the highest authority. To give these pages the best chance possible to rank for their targeted keywords, you should make this change asap.
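Concretely, a site-wide navigation block linking each main category is all that's needed; a sketch with placeholder names and paths:

```html
<nav>
  <ul>
    <li><a href="/category-one/">Category One</a></li>
    <li><a href="/category-two/">Category Two</a></li>
    <li><a href="/category-three/">Category Three</a></li>
    <li><a href="/category-four/">Category Four</a></li>
    <li><a href="/category-five/">Category Five</a></li>
  </ul>
</nav>
```

Because this markup appears on every page of the site, each category page picks up an internal link from every URL, including the homepage.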
Intermediate & Advanced SEO | Nozzle
Temporarily redirecting a small website to a specific url of another website
This all comes down to the fact that, technically, 302 has always meant 'Found', but there was no status code for a temporary redirect, so Google advised people to use 302 (as no one really ever used it for its intended purpose). Now you have 307. To this day, you can still use 302 or 307 (we're still in the transition period, where both function identically).

A 301 will gradually transfer SEO authority from one page to another over a few weeks / months, so that the old URL stops ranking and the new URL 'has a chance' of ranking in its place. If the new URL has highly dissimilar content (in machine terms), then the 301 fails to transfer a portion of the authority and some is 'deleted' (vented into cyberspace).

A 302 retains the ranking benefit on the old page and nothing is transferred to the new page (period). Over time (a month or six) the 302 will decay: slowly the authority (which has been kept on the old URL) will begin to 'die off', and you end up (in an extreme situation) with no authority left anywhere from that particular URL (it's just gone). 307s function the same way.

As such, using a 302 or 307 is the correct measure, but remember: Google will be watching to check that the redirect really is temporary. If your whole company forgets about restoring the content to the original URL for a significant period of time, then don't expect there to be anything left when you come back. In an ideal world, you'd turn it all around inside of one month if you wanted some good juice left when you lifted the 302 / 307.
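As a sketch of what the temporary redirect looks like in practice, here is an nginx example; the domains and paths are made up, and on Apache a `Redirect 302` directive in .htaccess achieves the same thing:

```nginx
# Temporary redirect: ranking authority stays with the old URL.
# Remove this block when the campaign ends to restore the original content.
location = /old-page/ {
    return 302 https://www.other-site.com/specific-landing-page/;
}
```

Keeping the redirect in a clearly commented, self-contained block like this makes it harder for the team to forget to lift it later.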
Intermediate & Advanced SEO | effectdigital
Keyword cannibalization
Hi, Thanks for the answer, and sorry for the late response. I understand what you mean, but I still have the following question: there is no possibility to add extra content to the blog category pages except through the source code. This means I cannot add extra text on these pages, so these blog category pages just list all the different blog articles, whose titles are H2s. Will these pages ever rank, given that they don't contain unique content about one subject? Thanks!
Intermediate & Advanced SEO | Mat_C
Multilingual webshop SEO
Huh. You have a really good head on your shoulders. They are both good options. Unless there is something that makes the .lu/fr imperative to have for the users, I think I prefer option 1. Your logic sounds good there and is customer first. Good luck!
International Issues | katemorris
Video titles and descriptions
I wanted to say that Vijay's answer was fantastic! To add to it: once you add the video to the page, you do want to keep a close eye on whether it adds to the time on site or not. Some videos may actually do more harm than good to time on site and conversion. Thanks!
Intermediate & Advanced SEO | JohnSammon
Webshop landing pages and product pages
Hi Roman, Thanks for the answer. I'm coming back to this question, since just adapting the taxonomy and possibly adding some tags doesn't resolve the problem. It improves the UX, and creating a new structure was part of the planning anyway. I just think we will be missing out on a lot of traffic, because there are a lot of high-volume keywords with low difficulty that are only applicable to the product itself (and not to a category, subcategory or tag). A copywriter will be assigned to write descriptions for every product anyway, exactly because these products are so specific and need more explanation. If we take the keywords for a specific product and integrate them into the product description, I think we can surely rank with these product pages. Or am I seeing this wrong? Thanks!
Intermediate & Advanced SEO | Mat_C
Subdomain cannibalization
Hello Mat, I don't think I'm seeing the same SERPs as you. Is there any way you could give me an example of one of these subdomains? And yes, you're absolutely right that the same problem of keyword cannibalization would apply to subdirectories as well. If it's the woltersk....lu domain I am getting non-secure warnings from Firefox when I try to access it. How many different subdomains are there / will there be? Is it just shop.domain.lu and www.domain.lu or are there others? I didn't see any for "courses." or "software." in the SERP example you provided with the link. If it's just one, I think that's manageable. For example, maybe www. could focus on informational queries (e.g. JavaScript course) and shop. could focus on transactional ones (e.g. Buy Acme JavaScript course). Maybe one could focus on reviews and comparisons, or long-tail queries while the other focuses on short-tail queries. Without knowing more about the domains and your business, it is difficult for me to say. If you have three or four subdomains all going after the same keywords, that's definitely a problem and I don't think you can avoid cannibalization. At that point, it would be best to choose the strongest domain/subdomain and focus your efforts on ranking one of them instead of watering down your efforts over several.
Intermediate & Advanced SEO | Everett
Links and images in custom css are not recognised by Moz Toolbar or Yoast plugin
Hi there! Thank you so much for the great question! I just want to make sure I get you the best answer I can- are you not seeing pages get crawled in your Campaign? Or are we not detecting internal or external links when using MozBar on your site? Our tools are designed to search for html links coded as href links in your source code so that may be the problem here. That being said, I'd love to do some more digging on this for you to see what's going on! Can you send the website address you're working with on over to help@moz.com? Looking forward to hearing from you!
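To illustrate the href point: crawlers and link tools generally recognize the first pattern below but not the second, because only the first is a real HTML link (class names are invented for illustration):

```html
<!-- Recognized: a real href link in the HTML source -->
<a href="/pricing/">Pricing</a>

<!-- Often missed: a "link" built from JavaScript with no href attribute -->
<span class="nav-item" onclick="window.location='/pricing/'">Pricing</span>
```

If your theme or CSS framework generates navigation in the second style, that would explain why the links and images aren't being detected.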
Other Research Tools | meghanpahinui
Text that appears when hovering over navigation tabs
Hi Kane, I tried this method, and I can indeed change the titles, but I cannot choose to not show them at all. I tried adding a space, and in Chrome no title text shows then, but in Edge and Firefox a blank tooltip is shown when hovering over the navigation tabs. I suppose the only way to show no title text at all is an override in the code; unfortunately, I am not a coding expert. I chose to fill in some appropriate title texts, which also helps (a little bit) with my SEO.
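For anyone comfortable dropping a small script into their theme, the code override mentioned above can be as simple as stripping the title attributes once the page has loaded. A sketch assuming the menu lives inside a nav element; adjust the selector to match your theme's markup:

```html
<script>
  // Remove title attributes from navigation links so browsers show no tooltip at all
  document.addEventListener('DOMContentLoaded', function () {
    document.querySelectorAll('nav a[title]').forEach(function (link) {
      link.removeAttribute('title');
    });
  });
</script>
```

Because this only removes the tooltip in the browser, the title text is still present in the HTML source that search engines crawl.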
On-Page / Site Optimization | Mat_C
Crawling issue
Hey, Thanks for reaching out to us! You can create a Campaign solely for a subdomain or subfolder by selecting the +Advanced setting in the Campaign set-up; just click the check box there and it will limit our Campaign audit to the pages on that specific subdomain or subfolder. From there, you can see in your Campaign Setting if you've set it up for just that chunk of your site, or for the entire root domain. I've got a guide to this process that I think may help. With regard to your second question, feel free to reach out to help@moz.com so that we can take a closer look Looking forward to hearing from you, Eli
Getting Started | eli.myers