Unlikely; as long as they're returning 404 errors, you should be OK. Maybe update your disavow file and you should be good to go!
Posts made by GFD_Chris
-
RE: Huge number of crawl anomalies and 404s - non-existent urls
It's tough to say without seeing the site. Overall it's unlikely if you don't use that string anywhere. We usually see it more for broken relative URLs. Maybe a third party site is using that string.
-
RE: Huge number of crawl anomalies and 404s - non-existent urls
From what I can tell, this probably isn't the reason for the drops. I'd go back and ensure that any URLs that changed are 301 redirecting to the correct destination URLs. I'd also verify that no pages associated with high-volume keywords have been removed.
For your issue, Google is likely finding some broken URLs, possibly from your internal linking structure. Perform a crawl of the site and see if you can find "Inlinks" to those broken pages. If so, you can work with dev to eliminate the issue.
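As a rough sketch of that inlink check, here's how you might group broken URLs by the pages that link to them, assuming you've exported your crawl as (source page, target URL, status code) rows (the field layout and sample URLs here are illustrative, not tied to any specific crawler):

```python
# Sketch: group broken URLs from a crawl export by the pages linking to them,
# so dev can fix the internal links at the source.
from collections import defaultdict

def broken_inlinks(edges):
    """Return {broken_target: [source pages linking to it]} for 4xx targets."""
    inlinks = defaultdict(list)
    for source, target, status in edges:
        if 400 <= status < 500:
            inlinks[target].append(source)
    return dict(inlinks)

# Illustrative crawl rows: (source_page, target_url, status_code)
crawl = [
    ("/blog/post-1", "/old-page", 404),
    ("/about", "/old-page", 404),
    ("/blog/post-2", "/contact", 200),
]
print(broken_inlinks(crawl))  # {'/old-page': ['/blog/post-1', '/about']}
```

Anything that shows up with inlinks is fixable in your templates; 404s with no internal inlinks are usually external or historical and can be left alone.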
-
RE: Duplicate content? other issues? using vendor info when selling their products?
So it's generally not a best practice, but it's definitely par for the course in the eComm world. It's not uncommon to see sites that strictly use descriptions scraped from manufacturer feeds.
Ideally, your pages will contain 100% unique content. If this isn't possible, I generally advise clients to do the following:
- Find dynamic ways of adding unique content (similar products, categories, etc.)
- Add review functionality: This creates unique UGC content
- Have a better product page UX than your competitors. Emphasize key information in the design and ensure that all information required to decide on a purchase is on the page.
-
RE: Sitemap use for very large forum-based community site
Agreed, you'll likely want to go with option #2. Dynamic sitemaps are a must when you're dealing with large sites like this; we recommend them for all of our clients with larger sites. If your forum content is important for search, it's definitely worth including, as that content likely changes often and may sit naturally deeper in the architecture.
In general, I'd think of sitemaps from a discoverability perspective instead of a ranking one. The primary goal is to give Googlebot an avenue to crawl your site's content regardless of internal linking structure.
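To make the "dynamic" part concrete, here's a minimal sketch of generating sitemap XML from a list of URLs you'd pull from your forum's database (the URLs are placeholders; note that real sitemaps are capped at 50,000 URLs each, so a large forum would split these and tie them together with a sitemap index file):

```python
# Sketch: build a simple XML sitemap string from a dynamic list of URLs.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# In practice, urls would come from a database query of live threads.
sitemap = build_sitemap([
    "https://example.com/forum/thread-1",
    "https://example.com/forum/thread-2",
])
print(sitemap)
```

Regenerating this on a schedule (or on publish) is what keeps new and updated threads discoverable without relying on internal links.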
-
RE: Rel=Canonical Vs. 301 for blog articles
If the current plan is to create new product sites, then 301 redirects are probably the way to go. You're right that canonical tags can technically be ignored, and 301 redirects send stronger consolidation signals. The biggest con is that the content can't exist in both places. So if the parent sites would benefit from having that content as well, then canonical tags are worth looking into.
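For illustration, the two options look something like this (hypothetical URLs; the redirect line assumes an Apache .htaccess setup):

```apache
# Option 1: permanent 301 redirect (content lives only on the product site)
Redirect 301 /blog/old-article https://newproduct.example.com/blog/old-article

# Option 2: keep the article on the parent site too, with a canonical tag in
# its <head> pointing at the product site's copy (a hint, not a directive)
# <link rel="canonical" href="https://newproduct.example.com/blog/old-article" />
```
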
-
RE: Re-direct Irrelevant (high ranking) blog articles?
I agree with you that I would probably leave them up. Redirecting those posts would likely sacrifice your ranking positions as you mentioned.
Your best bet might be to create a new Google Analytics segment that excludes the entire blog, or at least those two posts. For your core reporting, you could just use that segment. That should allow you to keep the traffic but report your core KPIs on more relevant pages.
-
RE: Using one domain for email and another domain for your website, but redirects...
Nope! Your email domain shouldn't have any impact on your site's SEO.
-
RE: Set Up htaccess File
Hey Bob!
Would be happy to take a look at the project for you. You can email me at chris.long@gofishdigital.com
-
RE: Help Center/Knowledgebase effects on SEO: Is it worth my time fixing technical issues on no-indexed subdomain pages?
It's tough to tell without seeing the help & marketing pages. How similar are they? Generally these are different, as marketing pages talk more about user benefits, whereas help pages are more tutorial-based. As long as they aren't 1:1 matches (or very close), they can likely both exist.
In the rare event that the help center pages are an exact duplicate of existing marketing site pages, then in theory you should be able to 404/redirect those pages and not worry about them. If there's already another version of the page on the site, then there's no need to manage it in two places.
Feel free to reach out if you have any questions!
-
RE: Root domain change - should we update existing backlinks which include the redirect from the old root domain to the new one?
Hey Luke!
You'll probably get varying answers here. I would say that if you can do it in a scalable, time-efficient way, I'd reach out and ask for the domain to be updated. Technically it's better for your backlinks to point directly to your domain; however, 301 redirects should still pass authority.
One thing I'd take a look at is that your old domain appears to use a redirect chain and 302 redirects:
301 (permanent)  http://showmyhomework.co.uk/   ->  https://showmyhomework.co.uk/
302 (temporary)  https://showmyhomework.co.uk/  ->  https://www.satchelone.com/
302 (temporary)  https://www.satchelone.com/    ->  https://www.teamsatchel.com/
200 (final)      https://www.teamsatchel.com/
I'd try improving this so there is no redirect chain and the 302 redirects are removed. The chain could potentially be diluting the equity distribution.
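If the old domain's server runs Apache, a single-hop rule along these lines would replace the chain (a sketch, assuming mod_rewrite is enabled and the hostnames shown above):

```apache
# Sketch: send every legacy variant straight to the final destination
# in one permanent hop, instead of chaining through intermediate domains.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?showmyhomework\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.teamsatchel.com/$1 [R=301,L]
```

A similar 301 rule on satchelone.com pointing directly at teamsatchel.com would clean up the remaining hop.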
-
RE: Does data-bind hurt SEO?
No problem! A good golden rule of JavaScript SEO is to always SSR where possible. Let me know if you have any other questions!
-
RE: Are Multiple Page Titles hurting my rankings
Overall, it does look like Google is choosing a logical title for your pages: https://www.google.com/search?q=site%3Abackyardadventures.com&oq=site%3Abackyardadventures.com&aqs=chrome..69i57j69i58.3543j0j4&sourceid=chrome&ie=UTF-8&num=100
As for your other titles, Google probably won't use them as the actual title tag. Overall, you're probably OK, but you might want to have a developer change it if possible.
-
RE: Homepage Ranking Issue
Hey Jamie!
If you're a new site, this might take time. Essentially, Google needs to recognize your site as an entity, which it may not have done yet. Users may need to send Google signals by querying "home on the swan", adjusting their query, and navigating to your result. With our clients, we've seen that this can take a while, but Google will generally start to rank new sites over time as it collects more behavioral data.
In the meantime, you could try other things to improve your optimization:
- Add to the "Organization" schema you have on the home page and ensure the name property uses "Home On The Swan"
- Internally link to the home page using the anchor text "Home On The Swan"
- Generate backlinks using the text "Home On The Swan"
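For the first point, the relevant part of the Organization markup might look like this (the URL is a placeholder; you'd merge the name property into the schema already on your home page rather than adding a second block):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Home On The Swan",
  "url": "https://www.example.com/"
}
```
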
P.S. I noticed that your site uses Shopify. This guide might also be helpful for you: https://moz.com/blog/shopify-seo
-
RE: Which links to map across in site redesign
Hey Sarah,
You would just need to create redirects for the HTML pages. I wouldn't worry about the JS/images.
-
RE: Help Center/Knowledgebase effects on SEO: Is it worth my time fixing technical issues on no-indexed subdomain pages?
Hey!
Is there a particular reason that you want the help articles out of the index? That content may be useful to current users who are querying Google for how to use your solution. It also could be useful to potential users who are looking for specific functionalities. We generally recommend that our SaaS clients index this content.
In terms of a time investment, it's probably still important, especially if your existing users are interacting with the documentation. Personally, to start I'd prioritize any items that can be scaled. I might start with:
- Removing the global "noindex" tag
- Fixing mixed content signals
- Removing global 4xx/3xx
- Fixing individual 4xx on your highest traffic pages
Try implementing anything globally first and then working to page-level fixes.
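For the first two items, here's a rough sketch of the kind of page-level check a crawler could run: it flags a robots noindex meta tag and insecure http:// asset references (the sample HTML is illustrative, and a regex check like this is only a first pass, not a full parser):

```python
# Sketch: flag pages whose HTML contains a noindex robots meta tag or
# insecure http:// asset references (a rough mixed-content check).
import re

def audit_page(html):
    issues = []
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I):
        issues.append("noindex")
    if re.search(r'(?:src|href)=["\']http://', html, re.I):
        issues.append("mixed-content")
    return issues

print(audit_page('<meta name="robots" content="noindex,follow">'))  # ['noindex']
print(audit_page('<img src="http://cdn.example.com/a.png">'))       # ['mixed-content']
```

Running a check like this across a full crawl is what lets you confirm the global fixes actually took effect everywhere.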
-
RE: Does data-bind hurt SEO?
Basically, those tools aren't reading the DOM, but Google can, which is why it can see your site's title tags, H1s, etc. Your site uses client-side rendering, which Google is able to crawl. Notice how, if you go to a given page and click "View Source", none of the page's content appears.
While it appears Google is reading the content on the pages I looked at, I would definitely look into this more to confirm Google is able to crawl/index the content on all of your site's pages. Client-side rendering is less reliable than SSR, so there might be instances where Google isn't reading sections of your content.
-
RE: Phasing in new website on www2 domain - 301 plan
Your proposed redirect strategy looks good. If possible, I would keep the same URL path on the www subdomain. That way, when you're finished, you could simply remove the 302 redirects.
1. I would keep the redirects in place until the new content on the www subdomain is live.
2. Personally, I would avoid using the canonical tag in this situation. Google treats it as a hint, not a directive, so if your content is too different, Google might just ignore the canonical tag and index both versions. Also, if you point the canonical tag from the www2 subdomain to the www subdomain, Google will only evaluate the www subdomain's content quality. If your content/UX is better on the www2 subdomain, you won't receive any of that SEO benefit during that time.
-
RE: Phasing in new website on www2 domain - 301 plan
Got it!
While I've been a pretty heavy advocate against them, this might be a situation where 302 (temporary) redirects are the best option. The current plan would tell Google:
- The site is permanently moving the content to the www2 subdomain
- The site is then permanently moving the content back to the www subdomain.
Instead, by implementing 302 redirects gradually as the content goes live, you'd send a stronger signal that this is only a temporary move.
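As a sketch, a gradual 302 for one section might look like this in an Apache .htaccess (hypothetical paths and hostnames; you'd flip R=302 to R=301 only if a move ever becomes permanent):

```apache
# Sketch: temporarily send one section to the www2 version while it's in flight
RewriteEngine On
RewriteRule ^products/(.*)$ https://www2.example.com/products/$1 [R=302,L]
```
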
Let me know if you have any questions on this, would be happy to chat more: chris.long@gofishdigital.com
-
RE: My last site crawl shows over 700 404 errors all with void(0 added to the ends of my posts/pages.
In the HTML of your pages, there's a link with href="javascript:void(0)". It appears that Google is crawling those URLs. If possible, remove that link, or take the JavaScript out of the <a> element. Otherwise, you should be OK; those pages should 404.
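If you want to find every instance of the pattern yourself, a small standard-library scan along these lines would do it (the sample markup is illustrative; in practice you'd feed it each page's HTML from a crawl):

```python
# Sketch: find <a> elements whose href is a javascript: pseudo-URL, the
# pattern behind the void(0) 404s.
from html.parser import HTMLParser

class VoidLinkFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.lower().startswith("javascript:"):
                self.hits.append(href)

finder = VoidLinkFinder()
finder.feed('<a href="javascript:void(0)">menu</a> <a href="/shop">shop</a>')
print(finder.hits)  # ['javascript:void(0)']
```

Any hits are candidates to convert to a <button> (or another non-anchor element) so crawlers stop treating them as URLs.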