Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Jay! Did you ever get your data? Or did these fine folks help?

    | MattRoney
    0

  • Thanks for your responses, Abhi and Tammy. Does anyone have any more feedback or suggestions before I give this a full overhaul?

    | IsaCleanse
    0

  • Hi guys, Thank you so much for your responses. That makes perfect sense. I've forwarded your answers on to our developer. Thanks again, Danny

    | DannyNutcase
    0

  • No responses so I'm guessing no-one really knows what the answer is? FYI my rankings have started to recover since early December, with about half back to where they were and traffic at about 40% of pre-drop figures. Still no real understanding of why it happened or what I did (if anything) to start the recovery process.

    | Gavin.Atkinson
    0

  • You could just redirect all HTTP URLs to HTTPS using an .htaccess file: https://www.captiga.com/linux/redirect-http-to-https/
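
    For example, something like this in your .htaccess usually does the trick (an untested sketch, assuming Apache with mod_rewrite enabled - check with your host before deploying):

        # Send every non-HTTPS request to the same URL over HTTPS, permanently (301)
        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]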

    | IsaCleanse
    0

  • Hi Scott,

    1. That looks good to me! An additional factor you'd want to consider is how you're treating visits from the subdomains to the main site, and vice versa, in analytics - do you want those treated as referrals, or as part of the same session? - and configure accordingly.

    2. If you've marked the subdomains "noindex, follow", Google will likely pass some link juice, but as usual in Moz Q&A the answer is "it depends." In this case, it depends on whether or not Google crawls the pages on the subdomain in the first place, and on how closely related Google perceives the subdomains to be to the main domain. So the answer is "some, probably, but probably not as much as links from unrelated sites that aren't noindexed."

    3. From your question, it sounds like you're pretty familiar with the subdomains-vs-subfolders conversation in SEO, so I won't go into it here. Again, you're going to want to be really intentional about tracking on these sites to make sure you're properly capturing traffic between them.

    This sounds like it could make a really interesting blog post once you've got it all set up!
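
    To make points 1 and 2 concrete, here's a rough sketch (assuming Universal Analytics; the property ID is a placeholder):

        <!-- Point 2: keep subdomain pages out of the index while still letting link equity flow -->
        <meta name="robots" content="noindex, follow">

        <!-- Point 1: with analytics.js, 'auto' sets the _ga cookie on the top-level
             domain, so visits across your subdomains share one session. To stop the
             subdomains showing up as referrals, also add them to the Referral
             Exclusion List under Admin > Property > Tracking Info. -->
        <script>
          ga('create', 'UA-XXXXXXX-1', 'auto');
          ga('send', 'pageview');
        </script>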

    | RuthBurrReedy
    0

  • There is a good example of "table of contents" links in this Moz Blog article by John-Henry Scherck: https://moz.com/blog/link-prospectors-into-lead-generators He used bullets, and that format is good because his subheadings are long. Mine are usually short, correspond to keywords, and can be separated by pipes:

    Contents: Topic 1 | Topic 2 | Topic 3 | Topic 4
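
    In HTML, that pipe-separated style is just jump links to anchored subheadings, e.g. (a quick sketch with made-up topic names):

        <p>Contents: <a href="#topic-1">Topic 1</a> | <a href="#topic-2">Topic 2</a> | <a href="#topic-3">Topic 3</a></p>

        <h2 id="topic-1">Topic 1</h2>
        <!-- ...section content, then the next anchored subheading... -->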

    | EGOL
    1

  • Hi Dan, It depends on the model, but I would recommend all the obvious stuff, like:

    - Branch information, such as opening times and addresses for individual agents
    - Individual property information, such as area, price, etc.

    Some of the more interesting/new things are actions. You could potentially use:

    - Ask question action: https://schema.org/AskAction
    - Rent action: https://schema.org/RentAction

    On the sitemap question, you want to make sure your sitemaps are "clean", meaning they only contain pages that return a 200 response. There shouldn't be any redirects or 404s. Only put pages you want to show up in the index in the sitemap. Hope that helps.
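
    For example, a bare-bones RentAction in JSON-LD might look like this (illustrative only - the names and figures are made up, so validate against schema.org before using):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "RentAction",
          "object": {
            "@type": "Apartment",
            "name": "Two-bed apartment, Example Street",
            "numberOfRooms": 2
          },
          "price": 1200,
          "priceCurrency": "GBP",
          "landlord": {
            "@type": "RealEstateAgent",
            "name": "Example Agents",
            "openingHours": "Mo-Fr 09:00-17:30"
          }
        }
        </script>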

    | CraigBradford
    0

  • Hi Dmitrii, Thank you! I totally agree. After writing this question out, I knew it would be just as confusing for bots. The request for change has been submitted to my team.

    | lunavista-comm
    0

  • I'll paste a link to the same answer here: https://moz.com/community/q/text-hidden-by-java So - this text is "less valued" since it's covered with "display: none". It's much better if the text is expanded by default, even if the page becomes a little longer.
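
    To illustrate what's being devalued (a simplified sketch - the class names are made up):

        <style>
          .full-text { display: none; }        /* hidden until a click - engines may discount this copy */
          .full-text.open { display: block; }  /* toggled by the "read more" control */
        </style>

        <div class="full-text">
          This text is invisible on page load, so search engines treat it as less important.
        </div>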

    | Mobilio
    0

  • Hi, Yeah, I agree with Chris - the first option is the best way to go, and the way we went after loads of research. Thanks, Andy

    | Andy-Halliday
    1

  • To be honest, this type of thing is definitely a weak point in my knowledge, but if it were my site, I wouldn't be heading in this direction. What you're essentially doing is obscuring duplicate content from search engines while presenting it to users, which we know is a no-no. It may well be that search engines can't "see" that duplicate content just yet, but that doesn't mean they won't in the next update. More importantly, users aren't particularly engaged by seeing the same block of content over and over, so it's kind of a waste of valuable screen real estate.

    One other question to consider with this scenario: do users actually want to know about this manufacturing process? This isn't a leading question. What I'm getting at is that content should always cover what the user wants to know, not what the business wants them to read about. If this process is really just a side note for most users, risking content duplication to push it directly in front of them is a large and unnecessary risk. Of course, if the process is a unique selling point that may actually persuade sales and/or build rapport, disregard this point.

    | ChrisAshton
    2

  • Hi Steven, It's very hard to find definitive info about this update since, to my understanding, Google is yet to even confirm that it happened, but from what I can gather, it was another revision of onsite quality. Search Engine Land has about the most helpful article I've managed to find on it yet. To save you some legwork, one of Search Engine Land's previous posts on the topic (linked to in the one above) also provides links to two Google resources. Note that they're older links more related to Panda, but they do talk about what Google views as "quality", which is what this Phantom update is focused on:

    - More guidance on building high-quality sites
    - Search Console Help: Create Valuable Content

    There's a bit of reading there, but hopefully it's helpful.

    | ChrisAshton
    0

  • Keyword research will be key, and that's really where you want to focus. Find related terms, long-tail keywords, and opportunities you may not have considered in the past. Don't get too hung up on the head terms to start - optimize for long tails (with head terms in them), and once those get traction you'll see progress on the head terms too. You really need a process/strategy around keyword research. It's not as simple as going to Keyword Planner and picking a few; competitive analysis and proper research are key to the whole process.

    When I talked about page depth, I meant that there are only so many clicks a user will take on a site before they move on. Basically, organize the architecture of the site (navigation) so that the user needs as few clicks as possible to reach the products. Fewer clicks (hops) for the user also means fewer hops for Googlebot. Fewer hops between products means more quality pages indexed, more quality pages indexed means a wider spread of keywords to be found on, and more keyword rankings = more traffic.

    Optimize for the customer first, since that's how you make money. Make the site easy to navigate, and you'll see a lot of benefit from that.

    | Eric_Rohrback
    1

  • Cool, thanks for the quick reply!  I see a local KP in both instances (with location set outside of Houston), but they look great either way.

    | brianspatterson
    0

  • You could launch it on the .com.au extension - but I fear that the geotargeting resulting from the ccTLD is a much stronger signal for Google than the hreflang tag. Not sure if this link is still valid, but it states:

    Q: Does "rel alternate hreflang" replace geotargeting?
    A: No. This link element provides a connection between individual URLs, and only allows Google to "swap out" the URLs from your site currently shown in the search results with ones that are more relevant to the user. It does not affect ranking, as geotargeting would.

    Check this video from Matt Cutts (2013) on how Google deals with ccTLDs: https://www.youtube.com/watch?v=yJqZIH_0Ars

    If your .com domain is not available, see whether one of these generic TLDs is available for your domain - and redirect to the .com as soon as you can. It's not optimal, but I honestly think that the .com.au/us/ solution is not going to work. Dirk
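
    For reference, the hreflang annotations go in the <head> of each regional version, e.g. (a sketch with placeholder URLs):

        <link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
        <link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
        <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />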

    | DirkC
    0

  • Hi there. No, there is no other way than backlinks. Basically, the way search engines work is that they know what your website (domain) is about and then try to find the best page to match the query. One of the strongest signals engines look at is backlinks. So, if your home page has a good backlink profile, the secondary page doesn't have any, and Google understands that your website is about, let's say, apples, your index page will keep ranking no matter how much you optimize the content on the secondary page. You can try doing solid internal linking with exact-matching anchor text, but I doubt it will help much. Just try to get a couple of good backlinks to the secondary page and you'll be fine.
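
    An internal link with exact-matching anchor text is just, e.g. (placeholder URL and keyword):

        <!-- on the home page, pointing equity at the secondary page -->
        <a href="https://www.example.com/green-apples/">green apples</a>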

    | DmitriiK
    1

  • "Or should I fix the issue first via htaccess rule before attempting the migration?"

    I quite honestly think that the problem is WITH htaccess, not that you have to fix something else with htaccess. And as an answer to your question: you can always migrate with issues and hope that nothing breaks during the process, or try to patch it up so it seems to be working fine and, again, hope that it doesn't break on you, OR you can fix it at the root of the problem and not worry about it in the future.

    | DmitriiK
    0

  • Hi. Bad idea. It won't help with rankings - it will only confuse users and, since it's kinda spammy and manipulative, draw the wrath of engine crawlers. Why do you want to redirect specifically? Why not just create a page at domain.com/keyword, put related content on it, make it awesome, and build backlinks to that page, if possible with exact or partial anchor text? If you check the search results, it's far from always the index page that ranks for every keyword. Hope this helps.

    | DmitriiK
    1