Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • I'm not sure of their reasoning to be honest, Andy. As I understand it, there are two different menus in the code - one for desktop visitors and one for mobile. Obviously only one shows up, depending on what the visitor is using. And rather than present Google with 500+ menu links twice, they've put nofollow on all of the mobile version so only one set is presented. Does that make any sense?

    | abisti2
    1
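  • As a rough sketch of the pattern being described (the class names and URLs here are made up, not the actual site's markup), the two menus might look like this, with only the mobile set nofollowed:

    ```html
    <!-- Desktop menu: links are followed normally -->
    <nav class="menu-desktop">
      <a href="/category-widgets/">Widgets</a>
      <a href="/category-gadgets/">Gadgets</a>
      <!-- ...500+ more links... -->
    </nav>

    <!-- Mobile menu: same links, shown only to mobile visitors, all nofollowed -->
    <nav class="menu-mobile">
      <a href="/category-widgets/" rel="nofollow">Widgets</a>
      <a href="/category-gadgets/" rel="nofollow">Gadgets</a>
    </nav>
    ```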

  • What software system is your website built with, Archie? If it's a Content Management System like WordPress, Drupal or many others, it's entirely likely that there is an SEO plugin that will make implementing canonical tags for all your URLs very easy.

    | ThompsonPaul
    0

  • Hi there, No, haven't tried it yet, but we'll give it a shot. Thanks!

    | rodelmo4
    0

  • Jake, This helps a lot. Thanks for your insights! I think your last piece of advice, rolling it out progressively, is the best takeaway. I will let the community know how this works out in the end. Kind regards, Nizar

    | Nizar.
    1

  • Clever PhD hit the nail on the head - his answer is excellent.

    | BlueprintMarketing
    0

  • Hi: Thanks for your response. I am not sure that has anything to do with it, because both people are searching for the company name. One person, sitting in the office where the headquarters is located, sees the knowledge panel, while the other person, sitting in another city, does not. This happens all the time - one person sees it and the other does not - for the same phrase.

    | RosemaryB
    0

  • Now I only do related forums, blogs, and coupon code websites. I also build internal links. Some website owners will copy my content with the links and the product photos without watermarks; some will quote it directly in their blog posts or threads. But I don't think my link types are good enough - does anyone have a better idea?

    | smokstore
    0

  • Thanks for all the responses! At the moment I am serving the 410s using the .htaccess file, as I removed the actual pages a while ago. The pages don't show in most searches; however, two of them do show up in some instances under the sitelinks, which is the main pain. I manually asked for them to be removed using 'remove urls', but that only lasted a couple of months and they are now back. So I guess the best way is to recreate the pages and insert a noindex? Thanks again for everyone's time, it's much appreciated.

    | Jettynz
    0
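  • For reference, the kind of .htaccess rules being described (the paths here are hypothetical) might look like this, using Apache's mod_alias:

    ```apache
    # Serve "410 Gone" for pages that have been permanently removed
    Redirect gone /old-page-1/
    Redirect gone /old-page-2/
    ```

    And if the pages are recreated just to carry a noindex, each would include in its head:

    ```html
    <meta name="robots" content="noindex">
    ```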

  • There shouldn't be any issue with using multiple methods as long as the data is consistent between them. Cheers, Jake Bohall

    | HiveDigitalInc
    0

  • Hi, Moving sites with more than 10,000 pages can sometimes take months. Usually the biggest chunk of pages gets redirected and removed from the index easily, but small pages that get less attention in your information architecture are harder for Google to find, so it will take longer for them to be removed. Martijn.

    | Martijn_Scheijbeler
    0

  • I'm always dubious of duplicate site content, and automatically generated content is a bad idea - Google will pick up on that like a shot. I would guess it's a combination of both duplicate content and automatically generated content. I have yet to see an actual penalty for duplicated on-site content, although I guess it could happen. But what I have seen is that Google will be less favourable to your site if there is a substantial amount of duplicate content.

    | seoman10
    0

  • Oh no!! So sorry to hear that, Phil, & sorry for the "loan" mix up. There are some great disavow tools available. Here's a great resource to start: https://moz.com/blog/guide-to-googles-disavow-tool Cleaning up those spammy backlinks will help a lot. Best of luck!

    | BMullz
    0

  • Hi all, I truly believe that Google has started looking at the traffic on the pages of your site: it determines what the overall traffic is and then divides that number by the number of pages to work out relevance. So 10 pages getting 50-100 views each (totalling 800 views / 10 pages = 80) would be better than 100 pages getting 20-50 each (totalling 2,000 views / 100 pages = 20). Therefore, if you remove your uninteresting (least visited) pages, I believe ranking will increase. What do you think of this theory? (Be kind.) Regards

    | danwebman
    0

  • Usually this can be caused by case sensitivity or HTTPS/HTTP mix-ups. Ensure that all URLs, both naturally on the site and in the feeds, are identical; try to ensure that each page has a unique URL on the site, and declare this in a canonical link at the top of each page. You can also use Screaming Frog to spider the site and look for duplicates. Regards

    | danwebman
    0
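  • A canonical tag of the sort described above, placed in the head of each page (the URL here is hypothetical), looks like:

    ```html
    <link rel="canonical" href="https://www.example.com/products/widget/">
    ```

    This tells Google which single URL to treat as the authoritative version when the same page is reachable under HTTP/HTTPS or mixed-case variants.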

  • Thank you very much for that, my guys are having a look into both Wistia and also if/how we can defer videos using either Vidyard or YouTube. Thanks again, Matt

    | MattWatts
    1

  • Many thanks Britney

    | abisti2
    0

  • Thanks! Someone in another thread suggested CrawlMonster - so far "meh" - I prefer Moz and Screaming Frog. Can anyone suggest any other tools for managing this process?

    | seo_plus
    0

  • Hi there, Remember that ranking movements sometimes don't have much to do with algorithm updates. There are thousands of reasons and variables to analyze. For updates, you can find up-to-date info here: mozcast.com Algoroo.com Best of luck!

    | GastonRiera
    0

  • Thanks Matt for the detailed answer. I appreciate it.

    | PeterDavies
    0