Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hi Katherine, I think you can simplify the Schema: you're making items for every URL but, in the end, they're not necessarily separate objects. I'm not sure if that's actually what's causing the issues, but I do know that you for sure can't list the same URL in there multiple times. So try a simplified, deduplicated format like the sketch below. Martijn
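
    (Not the original format Martijn posted - that snippet didn't survive the export - just a minimal, hedged sketch, assuming the markup in question is a schema.org ItemList. It's built in Python here so the dedupe step is explicit; all URLs are placeholders.)

    ```python
    import json

    # Hypothetical page URLs; a duplicate is included on purpose
    # to show the dedupe step.
    urls = [
        "https://example.com/page-a",
        "https://example.com/page-b",
        "https://example.com/page-a",  # duplicate that would invalidate the markup
    ]

    # Keep the first occurrence of each URL, preserving order (Python 3.7+).
    unique_urls = list(dict.fromkeys(urls))

    item_list = {
        "@context": "https://schema.org",
        "@type": "ItemList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "url": url}
            for i, url in enumerate(unique_urls, start=1)
        ],
    }

    # Paste the output into a <script type="application/ld+json"> tag.
    print(json.dumps(item_list, indent=2))
    ```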

    | Martijn_Scheijbeler
    0

  • If the GA tracking code is on the website, then it will capture the traffic regardless of whether GA is set up with the http or https version of the site. You might also find this Moz question and its answers useful: https://moz.com/community/q/will-changing-the-property-from-http-to-https-in-google-analytics-affect-main-unfiltered-view

    | WebQuest
    0

  • I agree with what effectdigital said. It looks like everything is in place: your non-www and http versions of the website are redirecting to the https-www version of the site.
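
    (If you want to double-check those redirects yourself, here's a rough sketch using the Python requests library; example.com stands in for the real domain.)

    ```python
    import requests

    # Hypothetical domain; swap in the real one.
    variants = [
        "http://example.com/",
        "http://www.example.com/",
        "https://example.com/",
    ]
    target = "https://www.example.com/"

    for url in variants:
        # allow_redirects=True follows the whole 301/302 chain.
        resp = requests.get(url, allow_redirects=True, timeout=10)
        status = "OK" if resp.url == target else "CHECK"
        print(f"{status}: {url} -> {resp.url} ({resp.status_code})")
    ```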

    | WebQuest
    0

  • Just to follow this up: we're now seeing the mobile usability errors gradually being removed from pages at approximately 100 pages/day. It just seemed to me that the whole validation request process didn't actually appear to do anything and we just had to wait for the site to be recrawled?!

    | DougRoberts
    1

  • And if people don't have Search Console access, they can always try queries like: https://www.google.com/search?q=site%3Abbc.co.uk ... however, due to the difficulty of exporting all the listed URLs, and due to some anti-scraping measures by Google, this data tends to be inferior to that supplied by Search Console. Always use Google's official tools first.

    | effectdigital
    0

  • So in July (2019) John Mueller from Google stated that URLs are generally OK up to 1,000 characters: https://www.searchenginejournal.com/googles-john-mueller-recommends-keeping-urls-under-1000-characters/318739/ Google 'can' crawl and index URLs over 1,000 characters long (up to about 2k characters), but best practice seems to be staying under 1,000 characters. Because of this, I personally don't agree with Moz's evaluation of when a URL is getting too long. Your example URL is nowhere close to 1,000 characters. Where Moz and Google disagree, I tend to side with Google's info. (A quick way to audit your URLs against that threshold is sketched below.)

    That being said, your URL has redundant layers. Why even have "/category/" in the URL? Just go like this: mysite.com/downloads/premium-downloads/sub-category/ People aren't so stupid that they need a fake URL layer called "/category/" to know that the following URL layer 'is' a category. IMO that's redundant architecture which is getting in your way for no reason.

    If you don't perform your redirects properly and you change the architecture of the site, you absolutely could see a negative impact on your rankings. Unless you're confident in crawling your whole site, performing very granular redirects with high accuracy and missing nothing - I'd just leave it as it is.
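
    (As flagged above, a hedged sketch for auditing your own URLs against that 1,000-character guidance; the sitemap location is a placeholder.)

    ```python
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP = "https://example.com/sitemap.xml"  # placeholder
    LIMIT = 1000  # the rough ceiling John Mueller cited

    with urllib.request.urlopen(SITEMAP) as resp:
        tree = ET.parse(resp)

    # Standard sitemap namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    for loc in tree.findall(".//sm:loc", ns):
        url = loc.text.strip()
        if len(url) > LIMIT:
            print(f"{len(url)} chars: {url}")
    ```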

    | effectdigital
    1

  • If you add the rel=canonical tag pointing to the master page, then no. But if you want them to rank for color and size and have a unique section of content, let them stand alone. If you are not seeing major declines in traffic, and you say they are indexed, I would keep it as it is. You don't get hit for duplicate products, but you will if the pages are gateway keyword pages.
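
    (To make the two options concrete, a tiny sketch - made-up URLs, not the poster's setup - of the canonical tag each choice implies:)

    ```python
    # Hypothetical product URLs for illustration.
    master = "https://example.com/widget/"
    variant = "https://example.com/widget-blue-large/"

    # Option 1: consolidate - the variant canonicalises to the master page,
    # so the variant won't rank separately.
    print(f'<link rel="canonical" href="{master}">')

    # Option 2: let the variant stand alone with unique content -
    # a self-referencing canonical, so it can rank for colour/size queries.
    print(f'<link rel="canonical" href="{variant}">')
    ```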

    | AaronRainsSEO
    0

  • It could happen, but it's unlikely. If Google hasn't discovered that URL at all and you then make it easier for Google to discover, they may apply internal authority to the page through your internal link structure. That could cause the page to rank higher. But the SEO authority wouldn't come 'from' the XML sitemap; only the 'discovery' of the URL would be impacted by your XML feed. If Google has already known about the page for a while (as is probably the case), then XML tweaks likely won't make any difference.
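
    (For context, a sitemap entry carries nothing that could pass authority - just discovery hints. A minimal sketch with a placeholder URL:)

    ```python
    import xml.etree.ElementTree as ET

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    url = ET.SubElement(urlset, "url")
    # <loc> plus optional <lastmod> is essentially all a sitemap communicates:
    # which URLs exist and when they changed. There's no field for importance
    # beyond the advisory <priority> hint.
    ET.SubElement(url, "loc").text = "https://example.com/hard-to-find-page/"
    ET.SubElement(url, "lastmod").text = "2019-07-01"

    ET.dump(urlset)  # prints the XML to stdout
    ```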

    | effectdigital
    0

  • When you blend different architectures you WILL get contextual tag collisions (e.g. canonical tags don't work the same across both architectures, or meta robots tags, or hreflangs). What you are suggesting is probably the best thing you can do; I stand behind WordPress quite firmly, even when it's used to augment another CMS (instead of replacing it).

    That being said, if you believe you are going to be able to predict all the issues which arise in advance - think again! Do your best on that front (obviously), but you need to manage your client's expectations. There is going to be some fallout. There is going to be analysis needed. There is going to be dev work needed to create a seamless integration.

    It's good that you're being diligent and trying to predict everything, but with the complexities of the modern web (the same thing can be coded in thousands of different ways) you really need to accept there will be some problems that will need fixing. You'll need to hold some budget back for that stuff, or you'll be up a gum tree.
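
    (One hedged way to catch those collisions early: crawl sample URLs from both architectures and diff the head tags. A rough sketch - placeholder URLs, and the regexes are only good enough for a spot check:)

    ```python
    import re
    import requests

    # Sample URLs from each architecture (placeholders).
    urls = [
        "https://example.com/legacy-cms-page/",
        "https://example.com/blog/wordpress-page/",
    ]

    for url in urls:
        html = requests.get(url, timeout=10).text
        # Naive regex extraction - fine for a spot check; use a real
        # parser (e.g. BeautifulSoup) for a full audit.
        canonical = re.findall(r'<link[^>]+rel="canonical"[^>]+href="([^"]+)"', html)
        robots = re.findall(r'<meta[^>]+name="robots"[^>]+content="([^"]+)"', html)
        hreflang = re.findall(r'<link[^>]+hreflang="([^"]+)"', html)
        print(url)
        print("  canonical:", canonical or "missing")
        print("  robots:   ", robots or "missing")
        print("  hreflang: ", hreflang or "none")
    ```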

    | effectdigital
    0

  • You are correct that you have to delete the duplicate website and set up the redirects in the htaccess file. Once you have done that, I would fetch the home page and other top-level pages to force Google to crawl them. This will help with removing the duplicate pages from Google's index. Also, make sure they don't have a Search Console property set up for the duplicate website with a submitted sitemap.
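
    (Once the htaccess rules are live, a rough sketch for verifying that the duplicate domain 301s path-for-path to the primary - both domains here are placeholders:)

    ```python
    import requests

    DUPLICATE = "https://duplicate-example.com"
    PRIMARY = "https://www.example.com"

    # A handful of representative paths to spot-check.
    paths = ["/", "/about/", "/products/widget/"]

    for path in paths:
        # allow_redirects=False so we see the first hop, not the final page.
        resp = requests.get(DUPLICATE + path, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        ok = resp.status_code == 301 and location == PRIMARY + path
        print(f"{'OK' if ok else 'CHECK'}: {DUPLICATE + path} "
              f"-> {resp.status_code} {location}")
    ```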

    | AaronRainsSEO
    0

  • If you're searching for external duplicate content, like if someone has stolen your content: https://www.copyscape.com/

    If you're looking for internally duplicated content:

    You can use SEMrush (https://www.semrush.com/) - their on-site (technical) auditing tool also commonly flags internal content duplication 'as' a technical SEO issue. You have to create a project for your / your client's site and then run an on-site crawl (20,000 URL limit recommended).

    http://www.siteliner.com/ - built specifically for checking internally duplicated content on your site.

    ... or do it yourself in Excel. This commonly involves creating a grid of your site's / your client's site's content, then running Boolean string similarity with conditional formatting to generate a heat-map of content duplication. The last option is a little more complex, but it can come up with some super cool visuals. Right now it relies upon the pwrSIMILARITY formula, which is not native to Excel (it needs this plugin: https://officepowerups.com/downloads/excel-powerups-premium-suite/). If you don't want to pay for that, I'm pretty sure someone will have made a VBA script that does something similar.
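
    (If you'd rather avoid the paid Excel plugin, the same heat-map idea can be sketched with Python's built-in difflib; the page texts below are placeholders for your extracted body content.)

    ```python
    from difflib import SequenceMatcher

    # Placeholder content: in practice, extract the main body text of each page.
    pages = {
        "/page-a/": "Buy our blue widgets. Blue widgets ship free worldwide.",
        "/page-b/": "Buy our red widgets. Red widgets ship free worldwide.",
        "/page-c/": "Our company history, founded in 1998 in Manchester.",
    }

    urls = list(pages)
    print("similarity matrix (0-1):")
    for a in urls:
        row = []
        for b in urls:
            # ratio() returns 0..1; high off-diagonal values flag duplication.
            row.append(f"{SequenceMatcher(None, pages[a], pages[b]).ratio():.2f}")
        print(a, " ".join(row))
    ```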

    | effectdigital
    0

  • For a proper hreflang implementation, the canonical of each page has to point to itself, while the hreflang annotations reference the other pages that have the same content in a different language. Otherwise, the implementation would be wrong.
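
    (As a concrete sketch - made-up URLs and locales - the head of the English page would then look like this, generated in Python here just to show the pattern:)

    ```python
    # Hypothetical language versions of the same page.
    alternates = {
        "en": "https://example.com/en/page/",
        "de": "https://example.com/de/seite/",
        "fr": "https://example.com/fr/page/",
    }
    current = "en"  # the page we're rendering the <head> for

    # Self-referencing canonical, as described above.
    print(f'<link rel="canonical" href="{alternates[current]}">')

    # hreflang annotations reference every version, including the page itself.
    for lang, url in alternates.items():
        print(f'<link rel="alternate" hreflang="{lang}" href="{url}">')
    ```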

    | WebQuest
    0

  • Hi William, I think it mainly depends on whether there is enough search demand to create specialized "living room furniture" pages. If there is, then go ahead and create those pages and aim for them to rank on their own. If not, pointing the canonical to the main page is enough; I wouldn't place a "noindex" there. There's a great article about Shopify SEO that was posted on the Moz Blog last week. It could be worth checking out if you haven't read it yet.

    | WebQuest
    0

  • I try to structure the meta description to describe what is on the landing page in a way that encourages click-throughs, while staying under the length limit. If you are shorter, no biggie! Good luck
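
    (For reference, Google typically truncates descriptions somewhere around 155-160 characters - strictly speaking it measures pixels, not characters. A trivial sketch for checking yours; the description text is a placeholder:)

    ```python
    MAX_CHARS = 155  # rough truncation point; Google actually measures pixels

    description = (
        "Shop handmade oak furniture with free UK delivery. "
        "Browse tables, chairs and bookcases crafted to order."
    )

    over = len(description) - MAX_CHARS
    print(f"{len(description)} chars:", "OK" if over <= 0 else f"{over} over the limit")
    ```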

    | KevinBudzynski
    1

  • Generally speaking, unless your site has a very high Domain Authority, it will be very difficult to rank on the first page without building some external backlinks. Internal links alone won't be enough to take your site's pages to the first page of the Google results for the targeted keyword.

    Also, in regards to internal links, don't forget to cross-link within the article: if one of the pages is referenced on another page, you can put a link to it. Take the very good example of Wikipedia's internal linking to master it. However, don't overdo the internal links or you might raise some flags.

    After building more quality external backlinks for your pages, you should see your pages slowly grow in rank, which will also help increase your Domain Authority.

    Daniel Rika - Dalerio Consulting https://dalerioconsulting.com/ info@dalerioconsulting.com

    | Dalerio-Consulting
    3

  • Hey there, Yeah, I'd go ahead and implement the canonical tags pointing to the higher-performing website on category and product-level pages. However, if you're feeling a little unsure, I'd recommend testing it with a sample set of pages, monitoring the impact on your client's site, and adjusting accordingly. That's a bummer they won't let you consolidate the sites though. The other thing I'd probably consider is investing a bit in local SEO, since your client sells home goods products. Hope that helps.

    | JordanLowry
    0

  • Checking your domain and keyword on Moz Rank Checker, I see that it does rank for the keyword, but it is low in the rankings (position 51+). The keyword is a bit difficult to rank for (difficulty 22+), so you should work more on building quality links, and it should start ranking. Note that the attached metrics are for US searches, so different stats will apply for other geo-locations.

    [images: Moz Rank Checker metrics screenshots did not survive the export]

    Daniel Rika - Dalerio Consulting https://dalerioconsulting.com/ info@dalerioconsulting.com

    | Dalerio-Consulting
    1

  • Hello Darin. Maybe I did not express myself correctly. I'm talking about the dot (".") in WordPress permalinks (slugs). When I enter "Domain.de Erfahrungen" as the post title, WordPress rewrites the permalink as "myproject.de/artikel/domain-de-erfahrungen" - WP replaces the dot with a dash. I already found a solution to change that, so the permalink now looks like "myproject.de/artikel/domain.de-erfahrungen". So, I want to know what's the best strategy for Google. WordPress, by design, produces the following permalink: /domain-de-erfahrungen/ The Invasion forum just deletes the dot: /domainde-erfahrungen/ Trustpilot has the following permalinks: /review/www.domain.de/ As my keyword is "domain.de Erfahrungen", I want to know which permalink I should use to get the best results in Google.

    | cwltd
    0