Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Great information, it's working for me. Thank you.

    | Trymybest
    1

  • Thank you so much for replying to me. Sorry, I've just realised I've made a mistake in my first comment. We are using .com for our main site and we plan to add subfolders for individual countries in the future. Currently, we only have /row for all the countries outside of the UK that we deliver to. Thanks again for getting back!

    | kirstybethkb
    0

  • Hi Daniel, Yes, I have read the entire article. Instead of continuing to debate how much link equity in particular would pass through a homepage 301, or how reliable Google's "official stance" is, my goal is rather to evaluate how the homepage holistically relates to the overall international SEO and corporate branding strategy, in tandem with a complex redesign that is already going to require us to switch up a ton of pages. Thanks!

    | mirabile
    1

  • Hi Dan, I was wondering how this all turned out for you. I'm in a similar boat right now. Any advice you can share? Thanks, Tim

    | tldev
    0

  • Let's talk! You can reach me here: https://moz.com/community/users/383525 (on the right side under "Additional Contact Info").

    | BlueprintMarketing
    1

  • Hello Martijn, Yes, I did put the pages through the Structured Data Testing Tool and no issues were highlighted. So I dug around, and all the articles pointed to WordPress plugins. Since then, the pages are no longer flagged as having the same issues via GSC, so I'll leave things alone for another month or so; it could be an error with Google's new report, not the website. (A quick way to spot-check the structured data locally is sketched below this post.)

    | jasongmcmahon
    0
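
    A minimal Python sketch of a local spot-check along the lines described above (an illustration only, not the poster's own process; the URL is a hypothetical placeholder). It pulls a page's JSON-LD blocks and confirms they at least parse, which catches the malformed-markup class of errors before you reach for Google's tools:

      # Extract and parse a page's JSON-LD blocks as a quick sanity check.
      # The URL below is a hypothetical placeholder.
      import json
      import requests
      from bs4 import BeautifulSoup

      url = "https://example.com/some-article"  # placeholder page
      soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

      for script in soup.find_all("script", type="application/ld+json"):
          try:
              data = json.loads(script.string or "")
          except json.JSONDecodeError as e:
              print("Malformed JSON-LD block:", e)
              continue
          if isinstance(data, dict):
              print("Parsed JSON-LD, @type:", data.get("@type", "unknown"))
          else:
              print("Parsed JSON-LD list of", len(data), "items")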

  • First of all, those more detailed URLs should have been handled via canonical tags, not via robots.txt. You are probably safe to allow the detailed URLs to rank; try allowing a sample of them to rank whilst keeping others disallowed. First, fix the architecture: stop using robots.txt and, on the detailed URLs, make them canonical to their parents. Once that is done, select a volume of the detailed URLs as a test. Remove the canonical tags from those URLs, allowing them to index. Do they start ranking and performing? Do you get duplicate content warnings? Depending on the outcome, you may want to lift the canonical tags from all detailed URLs, or even reverse the canonicals so that the detailed pages have ranking preference. (A sketch of how to audit the canonical setup follows below this post.)

    | effectdigital
    0
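
    A minimal Python sketch of auditing the canonical step described above (an illustration only, not the poster's tooling; the example.com URLs are hypothetical placeholders). It fetches each detail URL and confirms it declares a rel=canonical pointing at its parent:

      # Check that each detail URL declares a canonical pointing at its parent.
      # All URLs below are hypothetical placeholders.
      import requests
      from bs4 import BeautifulSoup

      def get_canonical(url):
          """Fetch a page and return the href of its rel=canonical tag, if any."""
          soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
          tag = soup.find("link", rel="canonical")
          return tag["href"] if tag and tag.has_attr("href") else None

      # detail URL -> the parent it should be canonicalised to
      checks = {
          "https://example.com/widgets/blue-widget-large":
              "https://example.com/widgets/blue-widget",
      }

      for detail_url, expected_parent in checks.items():
          canonical = get_canonical(detail_url)
          status = "OK" if canonical == expected_parent else "MISSING/WRONG"
          print(status, detail_url, "-> canonical:", canonical)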

  • Hello, I'm getting back to you about a point in your answer: "So you have to make clear that it is not 90% branded traffic, it should be more unbranded keywords. In this case - before I would do anything, I would try to figure out why subfolder 2 is ranking for these terms and why subfolder 1 is not." I understood, and one of the main keywords is "muebles" ("furniture" in English), but the pages are really similar. Except for the basics (title, meta, H1, etc.), are there other characteristics or features I should look into? Thank you, Maxime

    | Sodimaccl
    0

  • It's really unfortunate this has happened to you! One thing you might want to check for is compromised or hacked content as well, since here in the Moz community (and in fact the wider SEO community) we are seeing increased instances of negative SEO combined with hack attacks.

    Another thing: you have to be really careful when updating your disavow file. If you don't download the existing disavow file first and add your entries onto it, then all of your prior disavow work will be undone. When you upload the disavow .txt file, that is your COMPLETE disavow, not simply some additions to it. So if you handled that wrong, you may have disavowed the new stuff and un-disavowed loads of legacy work (which could land you right back at square one). A sketch of a safe merge workflow follows below this post.

    In general, disavow work does not result in increased SEO performance. If a load of sites were giving you boosted SEO metrics, and then Google decides they are bad and nullifies them, doing a disavow doesn't magically make Google give back SEO authority that you should never have had in the first place. Those pipelines are severed.

    Have you considered that in its early stages, a negative SEO attack often actually boosts rankings? Then, when the network is uncovered, that juice is all cut off. Maybe the past positive performance wasn't all down to you. The negative SEO attack (before it reached critical velocity) may have been boosting your rankings; now that it's flipped negative, most of that SEO authority will be cancelled out. That could be a bitter pill to swallow, but I have seen it a lot of times. Maybe this drop in performance is actually where the site should be ranking under your own efforts. If the negative SEO attack initially went undetected and boosted certain rankings, then was discovered and spun around, then actually you are where you should be. Could be complicated to explain to an enraged client, though.

    The reason you process a disavow file is to list all the spammy links and disavow them, so that Google will not (in the future) give you a manual penalty which nullifies all (or most of) your rankings. Since you don't know which links Google currently does or does not consider to be spam, this inevitably results in a few non-spammy links (which were contributing SEO authority) getting nullified, which makes results go down, NOT up. What a disavow does is trade a very small amount of your current performance away, in return for mid-term insulation against manual actions (which, believe me, are truly horrible).

    It never ceases to amaze me how people don't understand what disavows are truly for, and how many say "hey, I did a disavow and I saw slight drops and no gains". It's only under very exceptional circumstances (like manual actions) that disavows can potentially result in raised rankings, and then only if they are accompanied by linked Google Docs explaining to Google in detail what has happened, submitted through a reconsideration request. 99% of the time they just insulate you from things getting worse and give you a very slight results dip.

    One thing you may want to look at is whether your disavow was too aggressive. Spam can mean different things to different people. Did you download all links from all tools (Majestic, Ahrefs, Moz, Google Search Console) and re-crawl them in Screaming Frog to see if they are all still live? Did you then fetch metrics (Ahrefs URL Rating, number of linking domains to each URL, PA and DA from Moz, CF and TF from Majestic, sessions from the domain in GA) for all URLs using service APIs and URL Profiler / Netpeak Checker? Did you put all the metrics against each URL in a massive spreadsheet? Did you normalise and boil the metrics down to a final score to see which links have good SEO authority and which ones don't? If you just looked at them with your human eyes, or only looked at links from one backlink source without doing a master de-dupe across all backlink data suppliers, then you probably didn't do anything near extensive enough to clean up properly. Cleaning up a negative SEO attack properly, with minimal risk, is a huge undertaking that could take an expert around five days to compile and update the disavow file.

    Remember, you're dealing with an algorithmic devaluation. Algorithms see in numbers, not with your actual eyes. If you didn't have a data-led approach to your work, you are extremely likely to continue suffering in the mid to long term. I would go over the work again much more forensically.

    Finally: remember that Google and Moz aren't connected data sources. DA (Domain Authority) is NOT used in Google's ranking algorithm (at all). It's an estimate only (built to replace Toolbar PageRank, which Google took away from SEOs). Even TBPR wasn't great, but it was at least aware of disavows, Google penalties and algorithmic devaluations. Moz's DA is 100% unaware of these things and doesn't factor them in, so if you are looking to DA to save you, or for things to 'just get better' over time, think again!

    | effectdigital
    0
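
    A minimal Python sketch of the merge step described above (an illustration only; the filenames are hypothetical placeholders). It combines the previously uploaded disavow file with the new entries, de-duplicates them, and writes out the complete file you would re-upload, so the new upload never silently drops prior work:

      # Merge the existing disavow file with new entries so the re-upload
      # remains the COMPLETE disavow, not just the additions.
      # Filenames are hypothetical placeholders.

      def read_entries(path):
          """Read disavow lines, skipping comments (#) and blank lines."""
          try:
              with open(path, encoding="utf-8") as f:
                  return [ln.strip() for ln in f
                          if ln.strip() and not ln.lstrip().startswith("#")]
          except FileNotFoundError:
              return []

      existing = read_entries("disavow_previously_uploaded.txt")
      new = read_entries("disavow_new_entries.txt")

      # De-dupe while preserving order (dict keys keep insertion order).
      merged = list(dict.fromkeys(existing + new))

      with open("disavow_complete.txt", "w", encoding="utf-8") as f:
          f.write("# Complete disavow: prior entries plus new additions\n")
          f.write("\n".join(merged) + "\n")

      print(len(existing), "prior +", len(new), "new ->", len(merged), "total")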

  • @Andreas this is the standard WordPress Twenty Nineteen template, with an SEO plugin enabled - there's a menu at the top, but I don't know if there's something weird appearing for you. I've done little optimisation for the name, as it's basically the 'brand name' of the site. Wouldn't every website typically mention that on every page? Thanks for your help.

    | james.crowley
    0

  • If the errors are detected by Moz's crawler and Google Search Console (both at the same time) then I'd be much more concerned. It also depends on the volume of them: if there are, say, three, then it's probably not worth your time to sort it out. If there are hundreds or thousands, you might want to think about that.

    If you have hidden links in the code which Moz is picking up on (that's how Moz's crawler works, by following links) then you can't really say "We've checked each page and know that we are not linking to them anywhere on our site". The fact that the crawler found the links means they exist and are there (even if you can't see them or find them). That is, of course, unless your site is on one of the unusual architectures that Rogerbot (Moz's crawler) has difficulties with. That shouldn't be your first assumption, though; he usually knows where he's going. (A sketch of how you might hunt down such links follows below this post.)

    Where you say "since we migrated our blog to Hubspot so we think it has something to do with the test pages their developers had set up" - pull them up on it! If their developers coded a load of errors into your site, that's their fault, not yours, and it should be their expense (not yours) to fix it.

    This is the page regarding their CMS: https://www.hubspot.com/products/marketing/content-management-system It does say "A Content Management System Built for Professional Marketers", so migrating to it shouldn't cause loads of SEO problems, as SEO is still the largest chunk of most sites' online marketing and traffic. That should be nailed down: no problems, fewer problems than your prior system.

    In fact, HubSpot know that SEO is important for a CMS: https://www.hubspot.com/cms-and-seo - "Every marketer has been told that they need to consider SEO when creating content. But what makes SEO a unique marketing strategy that marketers should prioritize? And why should your CMS have tools that help you execute your SEO strategy?" I would argue that a load of 404 errors could not be considered "tools that help you execute your SEO strategy".

    Whether their developers messed up or their CMS is at fault is not really relevant. The main point is, the responsibility to sort it out should be on their side (not yours, IMO).

    | effectdigital
    0
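
    A minimal Python sketch of hunting down those links (an illustration only; the example.com URLs are hypothetical placeholders, and in practice you would feed in your sitemap URLs and Moz's reported 404 targets). It scans a list of your pages and flags any anchor that resolves to one of the broken URLs:

      # Scan your own pages for anchors pointing at reported 404 URLs.
      # All URLs below are hypothetical placeholders.
      import requests
      from bs4 import BeautifulSoup
      from urllib.parse import urljoin

      broken_targets = {
          "https://example.com/old-test-page",   # placeholder 404 target
      }
      pages_to_check = [
          "https://example.com/",                # placeholder source pages
          "https://example.com/blog/",
      ]

      for page in pages_to_check:
          soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
          for a in soup.find_all("a", href=True):
              target = urljoin(page, a["href"])  # resolve relative links
              if target in broken_targets:
                  print(page, "links to broken URL", target,
                        "with anchor text:", a.get_text(strip=True))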

  • Hi Mazen, thanks for your reply.

    | PaddyM556
    0

  • Hi Pat: Thanks for your answer/response! Our URL is www.metro-manhattan.com To answer your questions:

    1. When we moved the site to a new theme/database, the URL structure did not change. However, about a year before that we migrated to a new domain and went HTTP to HTTPS (secure), so there are redirects; 301 redirects were used.
    2. All the on-page optimization moved when we changed themes.
    3. Yes, there are some unnecessary URLs, but they are managed by robots.txt.
    4. Gzip compression, caching and a CDN have been implemented to speed up the site (a quick way to verify the response headers is sketched below this post). My SEO provider is constantly adjusting, but the site is not particularly fast.

    If we were to custom develop the same functionality without purchasing a theme, would the performance (Google PageSpeed, GTmetrix) be much better? If we are satisfied with the structure, design and functionality of our site, is custom coding a costly venture? Thanks, Alan

    | Kingalan1
    0
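
    A minimal Python sketch for verifying point 4 above (an illustration only; which headers appear will vary by host and CDN). It requests the homepage with compression enabled and prints the response headers that reveal whether compression and caching are actually in play:

      # Check the compression and caching headers a URL is served with.
      import requests

      url = "https://www.metro-manhattan.com/"  # the site from the post above
      resp = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"},
                          timeout=10)
      print("Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
      print("Cache-Control:   ", resp.headers.get("Cache-Control", "none"))
      # Many CDNs identify themselves via Server / X-Cache style headers.
      print("Server:          ", resp.headers.get("Server", "unknown"))
      print("X-Cache:         ", resp.headers.get("X-Cache", "not present"))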

  • Yep, this is widely known. In fact, you may be surprised to learn that even those who both manufacture AND distribute their products are also under the cosh here. I work with a number of fashion brands who produce their clothing (which in many cases is really stylish, really high-quality stuff) only to see their direct sales (from their own site) hindered by larger 'pure distribution' sites (sites which just distribute loads of items but don't manufacture any).

    Here's the truth of the matter. It's not that distributors get some kind of unfair special treatment. Rather, it's that distributors aren't bound by the same limitations as manufacturers, or manufacturers with online distribution.

    I love all the brands that I work with; I think their goods are grade A (otherwise I wouldn't work with them). That being said, they can only list their own products on-site. Users commonly search for slightly broader terms which imply that they want to see a range of products to choose from (e.g. "men's shirts", "supportive bras" or "tailormade suits"). The truth is that, for a user, the distribution sites ARE better. They can see a huge range of products on these sites and compare them all to find the best deal. This is part of the 'rise of the aggregators' phenomenon. Nowadays, unless you're incredibly special, if you don't aggregate, you lose out (end of).

    There's a reason that giant product aggregation sites (ASOS, eBay, Amazon, Bra-Stop) are killing it. They give people a one-stop shop to solve their issue, then continue their lives happy that they have got a good deal. Even the services industry isn't immune to this; just look at sites like Money Supermarket and how dominant they are in Google's SERPs.

    Now you might say: OK, that makes sense for category-level terms, but often these guys are beating my client even on product-level terms. Why is that? Well, by being such useful sites that make a big difference to users' lives, these sites organically spawn vast amounts of digital PR and UGC link coverage. They're frequently mentioned by the BBC, Forbes... you name it, they're in there (all the time and repeatedly). This builds up a colossal amount of SEO authority, which you basically can't compete with.

    Finally: often these distribution-only sites, because they are so dominant, are a client's main revenue stream (more than their own sites). That's because clients get paranoid that they will offend their distributors, who will then refuse to stock their products. That could be a disaster! But due to this paranoia, they often give distributors better prices for their products than even their own native sites get (if they are manufacturing and distributing directly as well; some do, some don't). Due to this, they lose EVEN MORE product-level listings, as Google knows their listings are the worst deal for users. These patterns of thinking keep the dominant distributors on top and leave crumbs for the rest of us.

    In the end, their success on Google is really a business decision and NOT a pure-SEO decision. Do they want to bite the bullet and cut their distributors off from their products, keeping all the Google SERPs for themselves? Certainly that would cut those guys out completely, but any distributor-originating cross-traffic would be lost. What if a user visited the distributor site for another search query (not related to your client's product), then saw your client's product later in their journey and purchased it? All of THAT traffic (which is substantial) is going to be immediately lost. So although cutting out the middle-men might seem smart, as a knee-jerk reaction it's incredibly unwise.

    Still, you can never grow above the distributors until you make this choice. It's a choice that WILL hurt you, but in the long run will allow you to reap revenue figures which you never previously dreamed of. So who makes the decision of when the time comes to go through this necessary pain? NOT you. Not an SEO person. That responsibility rests with the business owner alone, and if I were you, I wouldn't seek to influence it (risky).

    | effectdigital
    0

  • Just so you know, EMA (exact-match anchor text), which is also referred to as 'over' link optimisation, is more a concern for your off-site links. In terms of your internal site structure, that's much more lenient. Obviously, if it impacted UX (e.g. site nav buttons with ridiculous amounts of text that become over-chunky, annoying users) then that's bad. If you can satisfy UX and also do some light keyword optimisation of your internal site links, I honestly don't see that as a massive problem. If anything, it just gives Google more context and direction.

    I don't think internal link over-optimisation is a myth, because there's always someone stupid enough to pick up a spoon and run with it (taking it to ridiculous extremes that would also impact UX and the readability of the site). But as long as you don't go completely mental and the links make sense for users (they end up where they would expect to end up, with concise link / button text that doesn't bloat the UI) then you're fine. Don't worry about this overly much, but don't take it to an unreasonable extreme.

    | effectdigital
    1

  • If you're trying to rank the page for the term "PCB Certification", it certainly makes sense to have that as the page name, and it also makes sense for it to be in a folder called PCB with the other PCB-related pages. You certainly wouldn't be penalised for that approach. Google will separate out the various parts of the path, so you don't need to worry about it treating the whole thing as a single keyword. (A small illustration of that path segmentation follows below this post.)

    | Xiano
    1
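
    A small Python illustration of the path segmentation mentioned above (the URL is a hypothetical placeholder; this shows the segments a crawler can trivially split out, not Google's internal processing):

      # Split a URL path into its separate segments.
      from urllib.parse import urlparse

      url = "https://example.com/pcb/pcb-certification"  # placeholder URL
      segments = [seg for seg in urlparse(url).path.split("/") if seg]
      print(segments)  # ['pcb', 'pcb-certification']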

  • Thank you for your reply. I will read what you wrote in more detail.

    | seoanalytics
    1

  • Thank you. Your response is really helpful.

    | katseo1
    0