Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hi Brett, Yes that is my question, thank you! First, I appreciate your response! Second, I had not found anything related to SlideShare slides being crawlable by engines. Do you happen to have a source/link that they are? I understand the ability to no-index the page an embedded SlideShare is on, but if the client wants only a single SlideShare embedded on a page, my guess/assumption is that the page could be viewed as having thin content. My suggestion to them was to have supporting copy/content related to the SlideShare added to the page so it was better optimized. Thoughts?

    | CR-SEO
    0

  • No worries. Likewise, I mostly work with WordPress, so Blogger is definitely not my forte. It hasn't been updated for some time, but something like this plugin may help: https://wordpress.org/plugins/blogger-301-redirect/ Otherwise, some further digging may be needed. Sorry I can't be of more help. Good luck, Ed!

    | Casey_Bryan
    0

  • In that case, you will need to check your configuration, because, as I mentioned, the behavior of one should not affect the other. As I remember, Moz uses Google Analytics to set up your Moz Pro account, so you should check both configurations (Moz and GA) to find the error. Good luck!

    | Roman-Delcarmen
    0

  • Nikki, if you feel comfortable sharing one of the sites, that would be helpful for further investigation! It seems like there's a lot you're blocking in your robots.txt that you might need (though that's hard to suss out without knowing your site):
    Disallow: /blogs/+, Disallow: /blogs/%2B, Disallow: /blogs/%2b (assuming /blogs is the primary page path for any other blog content, e.g. /blogs/blog-title-goes-here)
    Disallow: /design_theme_id (assuming this is one of your stylesheets, you should probably remove this)
    Disallow: /gift_cards/* (no idea what comes after /gift_cards/ here, but this may be unnecessary)
    There may be other reasons Google doesn't reliably see your site as mobile friendly. If you can provide an example site, then we can dig deeper.

    | zeehj
    0

  • Tons of great advice here. My responsive design drops the sidebar to below 5% of the average scroll depth, so I binned it and nothing bad happened to pages per session, time on site, bounce rate, or any of the important user signals. I do have a site-wide 'BOOK AN EMERGENCY APPOINTMENT' button, but people use it and we're killing it for emergencies. It's one of our best categories and is hotly competitive. It's in red and replaced a Trustpilot widget that was taking people off site. It was one of the best decisions I ever made, and I was tearing my hair out about site-wide links messing with SEO, but it didn't happen for us. All the Google results said 'no, it's SPAMMY', but they are just a load of content creators jumping on a bandwagon; here in the Moz community, as you can see, things are more nuanced. So if it's relevant and helpful, keep it.
    Do you have Hotjar? It allows you to see what people are clicking on. Site-wide and footer links are one of those things where spammy sites use them, so people say "ooh, don't ever use them", but you mustn't be that binary. If they help the users, then keep them. In my case hardly anyone used them, so I dropped them in favour of one important button.
    But an additional bump in pageviews and time on site because people are navigating around your site is absolute gold, so you should encourage that. Just remember Cialdini's jam experiment: more than three choices will induce decision paralysis, especially when the user has only a second and is making almost unconscious decisions while navigating. It's like driving: you're not thinking about what you're doing, it's automatic. So make it fluid and easy, and watch the user feedback signals. How about "Learn more", "Buy something" or "Back to navigation"?

    | Smileworks_Liverpool
    0

  • Hi James, Thanks for the input.  Yes, the increased traffic we are getting is quality search traffic from Google on good search terms.  However, it appears they have de-indexed many of our pages that do not seem to be of low quality.  We took great measures to avoid duplicate and low-quality content. Ryan

    | sa_78704
    0

  • Placing a robots.txt in the root of the subdomain will do it:
    User-agent: *
    Disallow: /
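    Worth noting: a blanket Disallow stops crawling of everything on that subdomain, but URLs that are already indexed or that pick up links elsewhere can still appear in results. If the goal is to keep the subdomain out of the index entirely, the usual complement is a noindex robots meta tag on its pages (Google has to be able to crawl a page to see that tag, so it only helps on pages that aren't blocked). A minimal sketch, not from the original answer:

    <!-- Illustrative only: placed in the <head> of each page that should stay out of the index -->
    <meta name="robots" content="noindex">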

    | PaulM01
    0

  • Hi Kevin, Yes, there's no problem. Check these HTML references: H1-H6 tags: https://www.w3schools.com/tags/tag_hn.asp and the span tag: https://www.w3schools.com/tags/tag_span.asp
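    For illustration, a span inside a heading is valid HTML and is a common way to style part of the heading text. A minimal sketch (the class name is made up for the example):

    <!-- The span only marks part of the heading for styling; the class name is illustrative -->
    <h1>Technical SEO <span class="accent">checklist</span></h1>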

    | martinxm
    0

  • I hear ya! You've always got to be a bit suss, unfortunately!

    | Casey_Bryan
    0

  • Hi Andrew, Thanks for jumping in here! Did you see Tomvl's response to your question (below)? Just wanted to make sure. Thanks for all your help in the forum! Christy

    | Christy-Correll
    0

  • This attribute is automatically added to all links, both internal and external, that are set to open in a new tab. Although it seems intrusive, it is actually a security measure to prevent malicious links from taking control of the open tab. It's not advisable to delete the rel="noopener" from your posts. It doesn't affect SEO, it has no impact on the different analysis tools, and affiliate links aren't affected either. The only thing it does is protect your users from possible malicious pages that could hijack your tabs.
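    For illustration, this is what such a link looks like in the markup (the URL is a placeholder):

    <!-- rel="noopener" stops the page opened in the new tab from accessing window.opener on the original page -->
    <a href="https://example.com/resource" target="_blank" rel="noopener">Read the guide</a>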

    | martinxm
    0

  • You may need to enable canonical headers on your CDN. This post has more information about how to do that. It's geared toward MaxCDN and WPEngine but should give you the general idea.
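    For context, on an ordinary HTML page the canonical is declared in the head, as in the sketch below (placeholder URL); assets served from a CDN hostname can't carry that tag, which is why the linked post covers sending the equivalent signal as an HTTP header instead.

    <!-- Points duplicate or CDN-mirrored copies of a page back to the preferred URL (illustrative URL) -->
    <link rel="canonical" href="https://www.example.com/original-page/">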

    | DonnaDuncan
    0

  • Hi Tom, I'm not too sure what the developers have done, but the issue is that the UK website doesn't appear to be getting indexed in Google.co.uk while their US website is. When you visit the site you are prompted to select which country site you would like to visit, and I'm not sure if this is causing the issue. I've also used the Fetch as Google tool in Search Console and it's returning a redirect message from https://www.pacapod.com/ to https://us.pacapod.com/. Thanks, Eddie

    | mypetgiftbox
    0

  • Paul, Thank you! Yes, I realized I could use the MozBar to highlight the links that were followed and nofollowed, and came to the same conclusion. Thanks for your insight! Dana

    | dklarse
    0

  • Hi Shahjahaaan, I am pretty sure I just answered your question on Facebook! If not, I apologise, but it looks the same. Targeting 5 countries is very ambitious without a good SEO holding your hand, but my advice is:
    1. Use a single brand-name .com for all sites, for example website.com
    2. Give each country a country-specific folder: website.com/en, website.com/us, website.com/ca
    3. Use an hreflang tag on every page pointing to the page itself and the other country versions of the page (a sketch follows below): https://support.google.com/webmasters/answer/189077?hl=en Google will then not see the sites as duplicates even if you post the same content.
    4. Folders can be used to separate categories: website.com/en/accounting, website.com/en/london, website.com/en/facebook. If a job exists in two categories it isn't a problem. If, of course, the whole category is duplicated and all the jobs are the same, then you either should not have the separation or you could canonicalise if you really wanted to keep both: https://support.google.com/webmasters/answer/139066?hl=en
    The fact that the jobs change every day is unimportant as long as the categories are well defined. I hope this helps. Regards, Nigel
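    To illustrate point 3, here is a minimal hreflang sketch for one page, using the website.com folders from the example above; the country/language codes are assumptions and would need to match the real targeting:

    <!-- Each country version of the page lists itself plus every alternate; the same block is repeated on every version -->
    <link rel="alternate" hreflang="en-gb" href="https://website.com/en/page/" />
    <link rel="alternate" hreflang="en-us" href="https://website.com/us/page/" />
    <link rel="alternate" hreflang="en-ca" href="https://website.com/ca/page/" />
    <link rel="alternate" hreflang="x-default" href="https://website.com/page/" />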

    | Nigel_Carr
    0

  • Has the robots.txt stayed the same? Any js blocked there?

    | KeriMorgret
    0

  • It sounds like you're on the right track. If users and bots start off with the same content, that's a good start. From there, the question is "how much content is being customized, and how frequently?" For example, if you're swapping out 5 different headlines for 40% of users, and 60% of users see the original, that's not a big deal, particularly if the rest of the page is the same. But if you're swapping out 80% of the page copy (e.g. removing a bunch of excess copy that is shown for SEO purposes), and 60-90% of users are seeing that "light" version of the page, you run two risks. First, the chance that it wouldn't pass a manual review if one were performed. Second, the chance that Google may render a copy of the page as a user (not announcing itself as a crawler), see a different version of the page multiple times, and then effectively devalue the missing content, or worse, flag the page in their system as cloaked content.
    We could get lost in the details of whether or how they're doing this, but from a technology standpoint it's pretty simple for them to render content from non-official IPs and user agents and do an 'honesty check' for situations where content shows up in multiple ways. This is already how they compare the page on desktop vs. mobile to see which sections of the page render and which are changed. I think you are also right to rely on site interaction before personalizing, but since there are multiple ways to do that, you should know that it's possible for Google to simulate some of those interactions. So there's a chance that at some point they will render your content in a personalized manner, particularly if personalization is the result of visiting a URL or clicking a simple toggle switch or button.

    | KaneJamison
    0