Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Express.js and SEO?
Hi Allie McFadyen, I am not familiar with using Express.js and Angular, but since they are JavaScript based, it's worth noting that Google has historically had issues with JS. Google now says it can crawl JS, but issues can still arise that hinder your site's SEO if it isn't implemented properly. It seems the test you have done shows the content is not crawlable and viewable, but have you tried creating your own test page and running it through the "Fetch as Google" tool within Google Search Console? Even if the test comes back crawlable, I would always be cautious about relying on a technology with questionable SEO results. To this day I am still cautious about using JavaScript in certain sections of a website, even though Google says it can crawl it. Hope this helps. Let me know if you have any questions. Regards, Kevin
| SurgeStream2 -
Tough SEO problem, Google not caching page correctly
It seems you have a system which redirects users to the default page. When I try the site http://www.mercimamanboutique.com/ it has a canonical of http://www.mercimamanboutique.com/fr-fr/; when I switch to German and go back to the same URL, it has a canonical of http://www.mercimamanboutique.com/de-de/. The site also seems to exist in both https and http; it may be better to redirect everything to https (although this is probably not related to the issue you encounter). Dirk Update: I also noticed that the main rel="alternate" doesn't exist: http://www.mercimamanboutique.com/ is redirected to http://www.mercimamanboutique.com. I guess it's better not to use URLs that are redirected, but to use the final destination URL.
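As a sketch of Dirk's last point (the locale list and helper are illustrative assumptions, not the site's actual code): hreflang alternates should be generated from the final destination URLs, never from URLs that themselves redirect.

```javascript
// Hedged sketch: emit rel="alternate" hreflang tags pointing at the
// final https locale URLs (no http or redirected variants).
const BASE = 'https://www.mercimamanboutique.com';

function hreflangTags(path, locales) {
  return locales.map(function (loc) {
    return '<link rel="alternate" hreflang="' + loc + '" href="' +
      BASE + '/' + loc + '/' + path + '">';
  });
}

// hreflangTags('', ['fr-fr', 'de-de']) yields one tag per locale,
// each pointing at the URL that actually serves a 200.
```

Each locale page would carry the full set of alternates plus its own canonical, so crawlers never have to follow a redirect to discover a language version.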
| DirkC0 -
Can Google bots read my internal post links if they are all listed in a JavaScript accordion where I list my sources?
According to a test reported by Search Engine Land, Googlebot can crawl and index links rendered in JavaScript. So you should be fine.
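A belt-and-braces alternative, if you want to avoid depending on JS rendering at all (link URLs below are hypothetical): put the source links in the initial HTML and use the native `<details>` element for the accordion, so the links exist without any script running.

```html
<details>
  <summary>Sources</summary>
  <ul>
    <li><a href="/posts/first-source">First source</a></li>
    <li><a href="/posts/second-source">Second source</a></li>
  </ul>
</details>
```

The collapse/expand behavior comes from the browser, and the links are plain anchors in the served HTML.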
| JordanLowry0 -
Content incorrectly being duplicated on microsite
I would like to understand whether this is a Google indexing issue or a back-end dev issue.
| Discovery_SA0 -
Is it worth re-structuring URLs if breadcrumbs are enabled?
Hey Leigh! Can you clarify whether the search engines are crawling the /category/article-title/ URLs, or whether your breadcrumbs just hint that that is the structure? Feel free to PM me the site if you want. Whether it's worth restructuring to add /category/ to the URLs depends on a lot of things. It's definitely helped on ecomm sites, and I've recently done this with a marketplace I've worked with, which has really helped with rankings and long term will really help with traffic. But you need to weigh it and the different considerations (all the redirects that need to take place, updating sitemaps, internal links, etc.) against other things you could do to improve your rankings, like building links and developing new content, as well as generally speeding up your site if needed. URL changes are never something to take lightly!
| dohertyjf0 -
Is there a percentage of duplicate content required before you should use a canonical tag?
Hi Dan, Thanks for coming back. Great, so canonicals will be taken into consideration regardless. Do you have any suggestions on how to mitigate the negative effects of duplicate content if it is unavoidable? Hypothetically, excluding canonical tags? Thanks again.
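For anyone following this thread, the mechanism being discussed is small: every near-duplicate variant of a page points the same rel="canonical" at the one version you want indexed, regardless of what percentage of the content overlaps. A minimal sketch (the URLs and helper name are hypothetical):

```javascript
// Hedged sketch: emit the canonical link element for a page variant.
function canonicalTag(url) {
  return '<link rel="canonical" href="' + url + '">';
}

// Print-friendly, filtered, and tracking-parameter variants of a page
// would all emit the identical tag:
// canonicalTag('https://www.example.com/widgets/blue-widget')
```

The tag is a hint rather than a directive, which is why the duplicate-percentage question has no hard threshold: it influences which duplicate gets indexed, not whether duplication exists.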
| punchseo0 -
Impact of Medium blog hosted on my subdomain
Thanks John, The right decision is clear to me now. -Dave
| davidevans_seo0 -
Old Content Pages
Hey Cole, Good question. There are a couple of different routes you could take with the old or expired content: leave it as is, redirect it to more relevant current content, or remove it entirely from your site. My recommendation would be to redirect these old pages to the most relevant updated content unless they are receiving a significant amount of traffic. If the traffic is at 0 and the page is indexed, you are probably better off increasing the page authority of a more recent article or page by using a 301 redirect. You'll also want to ask what the best option for user experience is; it may be beneficial to have these posts in a searchable archive for users. There is a great article describing the options in a post on the Moz blog titled "How Should You Handle Expired Content." Hope my recommendation and the article help you make a decision. Best of luck with organizing your site.
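The 301 option above can be sketched as a simple lookup table (the old/new URL pairs are hypothetical examples, not Cole's pages): retired paths map to their closest current equivalents, and a hit sends a permanent redirect.

```javascript
// Hedged sketch: map retired pages to their closest current equivalents.
const redirects = {
  '/2014/old-buying-guide': '/guides/buying-guide',
  '/summer-sale': '/offers',
};

// Return the redirect target for a path, or null to serve the page normally.
function redirectTarget(path) {
  return redirects[path] || null;
}

// In a Node request handler, a match sends a 301 (which passes link equity):
function handle(req, res) {
  const target = redirectTarget(req.url);
  if (target) {
    res.writeHead(301, { Location: target });
    res.end();
    return true;
  }
  return false; // fall through to normal page rendering
}
```

The same table translates directly into server rewrite rules (.htaccess, nginx) if the site isn't Node-based; the key point is 301 (permanent), not 302.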
| Chris_Hickman0 -
Is this duplicate content that I should be worried about?
From a UX point of view, it's a little deceiving that the link says 'Full Description', but then it's not any longer than the snippet up by the button. This could have a minor impact on SEO. I'd either shorten the snippet, or expand the description below in the tab. Overall though, from an SEO perspective, this isn't much different than you'd see on Amazon where they have the bulleted list up top, then a written out description below.
| LoganRay0 -
Transferring website onto a new domain and it has disappeared in the rankings
Thank you everyone. I've followed all the good practice suggestions etc. It's still WAY back in the rankings but is making progress!
| jamiericey0 -
Schema Markup for property listings (estate agent)
Maslavista, That response is a keeper! I wish there was some way within Moz to create a file of favorites.
| julie-getonthemap0 -
Alt Tags - how important for SEO?
Thanks everyone, I'll prioritise important pages and see how we get on with rankings
| BeckyKey0 -
2 sitemaps on my robots.txt?
We recently changed our protocol to https, and we have our new https sitemap link in our robots.txt. Our agency is recommending we add another sitemap to our robots.txt file pointing to our insecure sitemap while Google is reindexing our secure protocol. They recommend this as a way for all SEs to pick up on the 301 redirects and swap out unsecured results in the index more efficiently. Do you agree with this? I am in the camp that we should have only our https sitemap and Google will figure it out; having two sitemaps in our robots.txt, one for our old http and one for our new https, is redundant and may be viewed as duplicate content rather than helping SEs see the 301s and reindex the secure links. What's your thought? Let me know if I need to explain more.
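For reference, the sitemaps protocol does permit multiple `Sitemap:` directives in a single robots.txt, so what the agency proposes is syntactically valid and would look something like this (URLs are illustrative):

```
# robots.txt (illustrative URLs)
User-agent: *
Allow: /

# Multiple Sitemap directives are permitted by the sitemaps protocol:
Sitemap: https://www.example.com/sitemap.xml
Sitemap: http://www.example.com/sitemap.xml
```

Whether the second line is useful is the open question here; a sitemap listed in robots.txt is a crawl hint rather than an indexable page, so listing both should not itself create a duplicate-content problem.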
| MonkStein0 -
Assistance with High Priority Duplicate Page Content Errors
Jill, you have a problem with what is showing as a page and what is not. If you will PM me, I will endeavor to assist you. You have /wp-content/.../photo URLs showing as pages, and there are at least six pages using the title tag "Rejuvalon Skin and Spa |". All of the SEO could use a bit of help. I am happy to assist you with it but will need access to the dashboard. (No charge; you can help a homeless person for Christmas if you feel the need to repay it.) Best
| RobertFisher0 -
Organic Sitelinks
Hi there, The first search is for your exact brand name, meaning that there's a huge likelihood of that particular search being for your brand and nothing else. Because of this, Google can figure out that the searcher's intention is to find you, and will show more of your website in first position. The second search, for 'cars ireland', is a lot more generic and might be made by users who aren't familiar with your brand. For this reason, Google wouldn't show your organic sitelinks, because you're unlikely to be the best result for the majority of these searches. I hope that helps, Sean
| seanginnaw0