Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Cheers guys, problem sorted.

    | StevePHolmes
    0

  • Set up canonical tags and use Google's UTM parameters to track your external campaign links (your internal links will be tracked just fine by Google Analytics without any need for redirects). https://support.google.com/analytics/answer/1033867?hl=en https://moz.com/learn/seo/canonicalization
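
    For illustration, a minimal sketch of both (example.com and the campaign values are hypothetical placeholders):

      <!-- Canonical tag in the page <head>, pointing at the preferred URL -->
      <link rel="canonical" href="https://www.example.com/page/" />

    And an inbound campaign link tagged with UTM parameters so Google Analytics attributes the traffic:

      https://www.example.com/page/?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale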

    | Tenlo
    0

  • Hi Wozniak65, It's a very strange and outdated approach. If they really want to go down the route of promoting each store individually, then it's far better to build one central website with a page for each store location (see the sketch below). They could then build local citations for each of the locations. The sites being on the same server isn't so bad for SEO; the real problem is that most of the content across the 70 websites will be duplicate, with only the location changed. That alone would effectively kill each of the sites from an SEO point of view. Considerations:

    1. Acres of duplicate content.
    2. The immense amount of work involved in building 70 websites.
    3. The immense amount of work required to keep all of those sites up to date.
    4. The central company information will presumably be the same for all 70 stores.
    5. The IP address of the server will be the same for all 70 stores.

    It's like having 70 micro-sites for the same company. For me it's a huge no-no. Google Panda, released in 2011, directly targeted thin sites with low-value content in favour of well-written, well-constructed, content-rich and engaging sites. I honestly think you should press them to go back to the drawing board in favour of a one-site/multi-location solution. Regards, Nigel - Carousel Projects
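
    For illustration, a one-site/multi-location structure might look like this (the domain and city names are hypothetical placeholders):

      example.com/                        - central company site
      example.com/locations/              - store locator hub page
      example.com/locations/manchester/   - one unique page per store
      example.com/locations/leeds/        - ...and so on for all 70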

    | Nigel_Carr
    0

  • Just focus on creating shareable content and getting links from authority sites in your niche.

    | Jamesmaina250
    0

  • As you said, as long as you're adding value, there's no harm in seeking links. As always, just make sure they're relevant to your space. Links are still a ranking factor, so if you're getting a ranking increase on top of clicks and awareness, why not actively seek links?

    | mostcg
    0

  • Hi there, Google tries to select the search results that are most likely to satisfy the search query - i.e. results that provide answers, resolve issues, etc. - rather than results that are merely diverse but likely to produce high bounce rates. Whether multiple results come from the same domain is therefore largely irrelevant. A quick search for almost any brand will confirm this. I hope it helps. Katarina

    | Katarina-Borovska
    0

  • Hi, I think it depends on the site, but my initial thoughts are that there probably isn't a lot of value in having those pages at all. Imagine the site grew to 50x its current size: would it still make sense to have those pages? By that point, the site would have more thin/noindexed pages than useful new ones. I think it's important to think from a user's point of view.

    Just because you add noindex doesn't mean Google will take the page out of the index in a hurry. That means there is a lot of potential for people to still find you in Google, arrive on an expired ad, have a bad experience and leave.

    What about doing something a bit more useful, like redirecting them to a closely matched product category above it (see the sketch below)? Even that might be confusing to users unless you have an overlay explaining that the product is no longer available, which is why you are redirecting them to a different page. Craig
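
    For illustration, a single expired URL can be 301-redirected with one line in an Apache .htaccess file (both paths here are hypothetical):

      # Send an expired ad page to the closest matching category
      Redirect 301 /ads/vintage-bike-123 /categories/bikes/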

    | CraigBradford
    0

  • Hi, I don't see a redirect either. I tried fetching from the UK and the USA and neither redirected me. Are you still having this problem? Craig

    | CraigBradford
    0

  • We add appropriate internal links in all of our articles. Although we don't syndicate our articles or grant permission for them to be republished, the many webmasters who copy/paste quotes into their own content have produced thousands of backlinks.

    | EGOL
    0

  • Hi, SEMrush will analyse the page you have optimised for that keyword and scan the Google results that compete for it. It will suggest how to optimise your content, list the words that are desirable in the copy, and suggest where to get backlinks from. If you want to generate semantically related words manually, there are a number of tools, for example semantic-link.com - it will be a manual selection process for you, as many words may be generated and only a few will suit you. Hope this helps a bit. Katarina

    | Katarina-Borovska
    0

  • Hi Jeff, If the content on websites 1 and 2 is the same and you do a page-by-page mapping and canonicalization, then you'll be fine (see the sketch below). Otherwise, I wouldn't do it. Here are Google's guidelines for using canonicals, FYI.
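
    For illustration, each page on the second site would point at its equivalent on the first (the domains and path are hypothetical placeholders):

      <!-- In the <head> of https://www.website2.com/blue-widgets/ -->
      <link rel="canonical" href="https://www.website1.com/blue-widgets/" />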

    | DonnaDuncan
    0

  • Hi. If you are getting duplicate content warnings between the Mandarin version of your website and a version in a different language, the first thing I'd suggest is checking whether your code correctly uses the alternate/hreflang attributes to specify the different language versions of each page. This markup is intended for sites with content translated from one language to another, and it gives you a way to tell Google that the Mandarin version of the page is at this URL, the Spanish version is at this URL, the English version is at this other URL, and so on (see the sketch below).

    In terms of titles, descriptions, and all other content, you do want to make those specific to the language of the page. So your Mandarin website would have titles, descriptions, etc. written in Mandarin, your English website would have them in English, and so on. After all, your Mandarin website is intended for people speaking (and searching in) Mandarin, so using that language throughout will increase your chances of ranking for that audience.
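
    For illustration, a minimal hreflang block for each page's <head> (the URLs and language codes are hypothetical placeholders; every listed version should carry the same full set of tags):

      <link rel="alternate" hreflang="zh" href="https://example.com/zh/page/" />
      <link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
      <link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
      <link rel="alternate" hreflang="x-default" href="https://example.com/en/page/" />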

    | Matthew_Edgar
    0

  • Hi, Katarina! Thanks for this very thorough response - I'm beginning to see a light at the end of the tunnel. When you say stress the address via directories, you are referring to making sure my external listings and directories are current, consistent and correct, yes? Just confirming you are not recommending something internal to the site. We are writing out driving directions where possible, and using the Google Maps API to display the location. Also, we won't have unique images for the products - I might be able to edit them so they differ, but they are the same thing. Will naming them uniquely matter? For the rest, we are writing, writing, writing! The client had no idea their former developer (yup, they paid someone to do this to them) had done a bad thing, and when I first read their GA and Moz data (before we really dove into the content on each page and realized it had literally been pasted from one site to the other), I thought the data had to be wrong, ha! We're pursuing the suggestion about unique content, and think we have a way to create enough of it to matter. Thanks for taking the time to answer. I will try to post some before and after scores when we are done.

    | kc_sunshines
    0

  • Hi Andreas, Your advice is absolutely golden. Exactly what we thought! We're going to make sure to address this asap. Thanks!

    | andy.bigbangthemes
    0

  • No problem. And there are ways to present the same content differently. It's hard to be specific without knowing what the product is, but it's something I've had to solve for various clients, whether for software or physical products.

    | badgergravling
    0

  • As Matt said, you need to redirect all your HTTP URLs to HTTPS. If you have WordPress or a similar CMS on an Apache server, you can usually do this in one go via the .htaccess file: add a rule that redirects any HTTP URL to its HTTPS equivalent (see the sketch below) and you'll be done.
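
    For illustration, the standard Apache rewrite rule for this (a sketch; test it on a staging site first, and adjust if your host terminates SSL at a load balancer or proxy):

      RewriteEngine On
      # Redirect any request arriving over plain HTTP to the HTTPS equivalent
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]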

    | badgergravling
    0

  • Hi, So the main issue with dynamic search results is that they won't have unique content and will quite often duplicate other pages, as you've discovered. E.g. 'Products under £10', 'Products under £20' and 'Products under £30' overlap: the under-£20 page repeats everything on the under-£10 page, and the under-£30 page repeats both.

    The usual answer is to keep all of them out of the index, particularly if individual product pages are your focus and are ranking well. Unless you specifically want to rank for 'x products under £10', there's no issue with removing them from search results. You have a couple of options for doing this: either block the dynamic search URLs in robots.txt, which stops them being crawled, or add a noindex,follow meta robots tag, which still allows the links on the page to be crawled but keeps the page itself out of the results. In general, I'd say that blocking the dynamic search URLs with a robots.txt wildcard is easiest and most effective (see the sketch below).

    If you did have a filtered result you wanted to rank, then it'd be a case of setting up a specific landing page for it with some unique content describing that category, etc. Dan
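
    For illustration (the URL patterns are hypothetical; match them to your own query parameters):

      # robots.txt - stop crawling of dynamic search/filter URLs
      User-agent: *
      Disallow: /search
      Disallow: /*?price=

    Or, in the <head> of each dynamic page, a meta robots tag that keeps the page out of the index while still letting its links be followed:

      <meta name="robots" content="noindex,follow">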

    | badgergravling
    0

  • Hey there, Since you are basically disabling the subdomains, there won't be any future data in GSC for them. I'd recommend setting up separate URL-prefix properties in GSC for the folders, the same way you had them for the subdomains. Then you will be able to track each folder separately, like before. Cheers, Martin

    | benesmartin
    0