Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thanks for your thoughtful reply. Great - that makes sense. I will add en-gb hreflang markup too (a quick sketch of the markup follows below). You raise some really good points about organisations having to think clearly about whether they need to undertake multi-regional / multi-lingual SEO and about the potential implications of doing so. In our situation we've come to the conclusion that there is a business case to undertake this venture. When I joined there was already a US office, and a few pages written for the US had already been published on our website in a different design language. Fortunately these pages were recently created and set not to allow crawling. If they were to be indexed, at best they may not rank and at worst they may actually interfere with our other pages' rankings - as well as causing confusion for users (duplicate product / contact / client pages, different navigation structures, designs, etc.). In the end we decided the best approach would be to internationalise our website and target these pages to the region / language they were designed for. But yes, it has definitely been a challenge!

    | SEOCT
    0
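
    A minimal sketch of the kind of hreflang markup being discussed, assuming hypothetical example.com URLs with the US pages under a /us/ folder (the URLs and folder structure are illustrative, not taken from the thread). The full block sits in the <head> of every page in the set, and the annotations must be reciprocal:

        <link rel="alternate" hreflang="en-gb" href="https://www.example.com/products/" />
        <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/products/" />
        <link rel="alternate" hreflang="x-default" href="https://www.example.com/products/" />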

  • Hi MrWhippy, Thanks for your reply. I have checked and there is no meta noindex on my site. Search Console doesn't show any errors. However, as per your suggestion, I will look into the .htaccess file. I will get back to you if I find anything there. Regards

    | ResultfirstGA
    0
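
    For anyone checking their own .htaccess for crawl or index blockers, these are the sorts of directives worth looking for. The rules below are purely illustrative (they are not from the site in question) and would only apply where mod_headers / mod_rewrite are enabled:

        # Sends a noindex header with every response (mod_headers)
        Header set X-Robots-Tag "noindex, nofollow"

        # Blocks Googlebot outright by user agent (mod_rewrite)
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
        RewriteRule .* - [F]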

  • It won't affect your SEO; you just don't need the non-HTTPS version.

    | jasongmcmahon
    0
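
    If the aim is simply to retire the non-HTTPS version, the usual approach is a site-wide 301 redirect to HTTPS. A minimal Apache .htaccess sketch, assuming mod_rewrite is available (an assumption, since the post doesn't say how the site is hosted):

        RewriteEngine On
        # Send any request arriving over HTTP to the same path on HTTPS
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]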

  • Thank you very much for the answer, Martijn.

    | bbop33
    0

  • Sure do: How to Write Meta Descriptions in a Constantly Changing World

    | Dr-Pete
    2

  • Thank you very much for all of your feedback, folks. I really appreciate it. Cloudflare has acknowledged that they are aware of the issue and are working to correct it. It also affected Bingbot. I ended up whitelisting the IP ranges, and our Google crawl rate has increased as a result.

    | akin67
    1
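
    For anyone deciding which crawler IPs to allow through a firewall, the way Google documents for confirming that an address really belongs to Googlebot is a reverse DNS lookup followed by a forward lookup. A rough Python sketch using only the standard library (the sample IP is purely illustrative):

        import socket

        def is_googlebot(ip):
            """Reverse-resolve the IP, check the hostname, then forward-confirm it."""
            try:
                host = socket.gethostbyaddr(ip)[0]   # e.g. crawl-66-249-66-1.googlebot.com
            except socket.herror:
                return False
            if not host.endswith((".googlebot.com", ".google.com")):
                return False
            try:
                # The forward lookup must point back at the original IP
                return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
            except socket.gaierror:
                return False

        print(is_googlebot("66.249.66.1"))  # illustrative Googlebot address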

  • Hi there, It's not the nicest task in the world, but unfortunately, it's best to manually review websites before you add them to a disavow file. You can use metrics such as Moz spam score to filter them and start with the worst score which may speed up the process a bit. You could also try and prioritise multiple URLs that are from the same domain. For example, if you have 50 URLs linking to you from one domain which looks suspect, you probably only need to review a few of those URLs in order to understand the quality/type of link they are and take action. I'd avoid putting a hard rule in place such as disavowing anything below a certain metric score. You may accidentally disavow links which are perfectly fine. There is also some debate as to the impact that the disavow tool has unless you have a manual or algorithmic penalty in place. If you don't see any evidence of a penalty, my advice would be to only disavow links which are clearly low quality/spammy. Hope that helps! Paddy

    | Paddy_Moogan
    0
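
    A rough Python sketch of the "group by domain first" idea described above, so only a handful of URLs per suspect domain need manual review. It assumes a plain-text export of backlink URLs, one per line (the file name is hypothetical):

        from collections import defaultdict
        from urllib.parse import urlparse

        # Group backlink URLs by their host so each domain only needs reviewing once
        by_domain = defaultdict(list)
        with open("backlinks.txt") as f:   # hypothetical export, one URL per line
            for line in f:
                url = line.strip()
                if url:
                    by_domain[urlparse(url).netloc].append(url)

        # Review the biggest domains first; a few sample URLs per domain is usually enough
        for domain, urls in sorted(by_domain.items(), key=lambda kv: len(kv[1]), reverse=True):
            print(f"{domain}: {len(urls)} links, e.g. {urls[:3]}")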

  • Hi Richard, Definitely - you shouldn't worry about spammy links. As you keep spotting them, throw them into the disavow file. Marie Haynes, an authoritative voice in the community, has written about this: Disavowing in 2019 and beyond – the latest info on link auditing. Hope it helps. Best of luck. Gaston

    | GastonRiera
    0
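
    For reference, Google's disavow file is just a UTF-8 text file with one entry per line, mixing full URLs, domain: lines, and # comments. A made-up illustration (the domains are placeholders, not real sites to copy):

        # Spammy directory links reviewed May 2019
        domain:spammy-directory.example
        domain:link-farm.example
        # One-off page that could not be removed
        http://old-blog.example/comment-spam-page.html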

  • Your question led me to discover the complete answer for myself, as I'm also always on the path of discovery. I found this Whiteboard Friday, which in turn led to this article about using Screaming Frog, which I found extremely informative. Hope that helps -Paul

    | Psnowden
    1

  • No problem Matthew, we all have lives away from the internet. I am glad your issue is now resolved. Steve

    | MrWhippy
    0

  • No problem Tom. Thanks for the additional info — that is helpful to know.

    | Nomader
    1

  • Thank you! I think the problem is what you mentioned in the previous post. So most of them are indexed, but I guess it is really a ranking problem. I have been banging my head against a wall and I cannot figure out why this site isn't ranking - it's driving me nuts!

    | HashtagHustler
    0

  • Hello there, The display you are talking about is referred to as "sitelinks" and you can read more about them here: https://support.google.com/webmasters/answer/47334?hl=en https://moz.com/learn/seo/serp-features Hope this helps!

    | MiriamEllis
    1

  • Hi Dana, Expires headers and other caching headers can help improve site performance (as you said), and that will be a good thing for SEO. There is no reason to be concerned - they are common headers and there isn't much they could do to have any negative impact on SEO. Good luck! Tom

    | Tom-Anthony
    0
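
    A small Apache .htaccess sketch of the kind of caching headers being discussed, using mod_expires; the lifetimes are illustrative rather than a recommendation from the thread:

        <IfModule mod_expires.c>
          ExpiresActive On
          # Cache static assets for a month and HTML for an hour
          ExpiresByType image/png "access plus 1 month"
          ExpiresByType text/css "access plus 1 month"
          ExpiresByType application/javascript "access plus 1 month"
          ExpiresByType text/html "access plus 1 hour"
        </IfModule>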

  • Hi, Firstly, yes, that robots.txt is valid and would work for your purpose. There's a great tool (https://technicalseo.com/tools/robots-txt/) that lets you paste in your proposed robots.txt contents, the URL you want to test and even the robot you want to test against, and it tells you the result.

    | Xiano
    0
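
    Since the robots.txt in question isn't quoted in the thread, here is a generic illustration of the kind of file you would paste into a tester like that - the paths and sitemap URL are hypothetical:

        User-agent: *
        # Keep staging and internal search result pages out of the crawl
        Disallow: /staging/
        Disallow: /search

        Sitemap: https://www.example.com/sitemap.xml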

  • Yeah, you can do language-only hreflangs. But it's pure nonsense to point Google at the very same URL and state that it is the URL for all of those different languages. At the end of the day, Google will crawl from one data centre at a time, which may be in any of several countries. It will see one version of the page and assume that 'this is what the page is'. If the site structure is that you have one URL only and the contents are modified based on the user's origin, then the structure is wrong, as Google will have a very hard time ranking one URL as many different URLs. People who have such a structure always end up here, always argue why it's OK, and then end up 'doing it properly' later on because it just doesn't work. Also note that if you have one version of a page served to people in different regions (e.g. an EN page which the hreflangs state is for both Canadians and Americans), Google may see that as a 'minimum effort' deployment with no value proposition. Different audiences need content tailored to them, so a rewrite of some of the content is still expected if you want to see an increased international footprint (and you're not a giant like Santander or Coca-Cola). The number of times I see people clone their EN site into a US folder and just 'expect it to rank' with no extra effort, just with hreflangs, is staggering. Google expects to see a value proposition when you build out your site - value-prop ('value add') is the #1 yet never-talked-about ranking factor. I don't think your current implementation will work very well, if at all. You may have lots of human-brain reasons why it should - but crawlers are robots.

    | effectdigital
    0
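
    A short illustration of the structural point above, using made-up example.com URLs: hreflang annotations are expected to point at distinct, tailored URLs per audience rather than declaring one URL as every version at once.

        <!-- Problematic: one URL claimed as the version for every audience -->
        <link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
        <link rel="alternate" hreflang="en-ca" href="https://www.example.com/page/" />

        <!-- Expected: a distinct, tailored URL for each audience -->
        <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/page/" />
        <link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/page/" />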

  • This is a really good answer. You can also look for messages from Google saying that "mobile-first indexing" was enabled for your site / GSC property.

    | effectdigital
    1