Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey! Did you check whether these files might be blocked by your robots.txt, or whether they sit on a CDN that Googlebot can't reach through the Fetch and Render feature? Martijn.

    | Martijn_Scheijbeler
    0

  • Yeah - there is various speculation about how signals or authority traverse folder structures (see, for example, this Whiteboard Friday), but I haven't seen anything suggesting it's permanent. All of this may be an argument for adding /famous-dogs/ at some point, but I personally wouldn't stress about it not being there at launch.

    | willcritchlow
    0

  • Thanks James, I really appreciate your thoughts. Would love to get confirmation from anyone else willing to chime in.

    | Ganacontrol1233
    0

  • This is absolutely perfect. Thank you, James. I think I'll plan out this strategy better before I go ahead. I won't show the sign-up after 1 min 30 secs, but will let the user see the full text of a page from the SERP as the article. I think I will continue to restrict map marker clicks - after 3 clicks the markers will become unclickable and transparent. This is one of the most valuable parts of my website, and I'm not sure how important it is for Googlebot and search rankings. I'll also ask for sign-up on only certain pages. Perfect. I really wanted a way for people to see the content and helpfulness of the website, so this is perfect.

    | thinkLukeSEO
    0

  • That's what I thought. We haven't seen any negative impact on search volume or rankings from this. However, our client was concerned.

    | BigChad2
    0

  • Well, I do not think there should be much difference, as you are talking about directories on the same website, as compared to subdomains like these: cars.website.com, flowers.website.com, store.website.com. When things are in different directories but technically on the same domain, they still share in the site's Domain Authority. The different subdomain sites could all have different Domain Authority scores in how Google looks at them, as far as I know. Best Regards

    | Dalessi
    0

  • Disavow backlinks

    First, you'll need to download a list of links to your site. Next, you'll create a file containing only the links you want to disavow, and upload this to Google.

    Download links to your site:
    1. Choose the site you want on the Search Console home page.
    2. On the Dashboard, click Search Traffic, and then click Links to Your Site.
    3. Under "Who links the most," click More.
    4. Click Download more sample links. (If you click Download latest links, you'll see dates as well.)

    Note: When looking at the links to your site in Search Console, you may want to verify both the www and the non-www version of your domain in your Search Console account. To Google, these are entirely different sites.

    Upload a list of links to disavow:
    1. Go to the disavow links tool page.
    2. Select your website.
    3. Click Disavow links.
    4. Click Choose file.

    Source: Search Console Help
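
    The disavow file itself is just a plain-text list, one entry per line. A minimal sketch (the domains below are hypothetical placeholders):

        # Lines starting with "#" are comments and are ignored by Google.
        # Disavow a single URL:
        http://spam.example.com/bad-page.html
        # Disavow everything from one domain:
        domain:shadyseo.example.com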

    | Roman-Delcarmen
    0

  • Thanks for such a speedy reply! It's such a daunting task, as there are literally thousands and thousands of pages, so we want to be sure we're doing the right thing. I appreciate your help. Now I'll investigate blocking within the robots.txt and using Google Search Console to remove the URLs.
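
    If the pages share a common path, the robots.txt side can be as simple as this sketch (/old-catalog/ is a hypothetical directory - adjust to wherever the pages actually live):

        User-agent: *
        Disallow: /old-catalog/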

    | Jon.Kennett
    0

  • It's been a 301 the whole time (at least a few years); we have a rewrite that fires if the request doesn't match the domain pattern. That subdomain only shows up in the DNS records.
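
    For context, the kind of rewrite I mean looks roughly like this Apache sketch (www.example.com stands in for the real canonical host):

        RewriteEngine On
        # 301 any host that doesn't match the canonical domain
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]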

    | rkdc
    0

  • I have been using Backlinko (Brian Dean) as a reference, and as I see it now, that was a mistake. I didn't know Fili Wiese, but from what I can see he knows what he is talking about... thanks for the reply.

    | Roman-Delcarmen
    0

  • Yes, we have contacted GoDaddy several times. GoDaddy has insisted it is not their problem and that they do not have any advice to resolve this issue. GoDaddy support said there can be strange behavior when forwarding and masking. We tested removing the masking, but it did not make a difference. Nor does 301 vs. 302 redirecting - I understand the latter should not be used as a workaround, as these responses have different meanings, but we did test it (which also made no difference).

    Check this link for more details: https://www.godaddy.com/community/Managing-Domains/My-domain-name-not-resolving-correctly-6-random-characters-are/m-p/64440#M16148 - others are experiencing the same issue, and somewhere in the thread it was stated that GoDaddy recently rolled out a new system which likely created this issue. We can trace the issue beginning in late August 2017 via Google Analytics, Search Console 404s, and testing via Chrome Dev Tools (Network pane with Preserve log checked).

    We would also like to understand why this happens, in order to address the root cause instead of using a workaround. This is a significant issue. Unfortunately, GoDaddy is not handling it professionally, and that will impact our future business decisions involving GoDaddy.
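
    For anyone trying to reproduce this outside the browser, a quick way to see the whole redirect chain and status codes is a one-liner like this (example.com is a placeholder):

        curl -sIL http://example.com/
        # -I: request headers only; -L: follow redirects;
        # each hop prints its status line (301/302) and Location header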

    | SS.Digital
    0

  • Hi Seoanalytics, Is there a correlation or causation between the search popularity of a keyword and the difficulty of ranking for it? I would think the primary reason for difficulty in ranking is a high commercial value of the keyword, which could be because a lot of people search for it or because it has high values + margins. So not just the popularity of the search term is key, and a keyword could be highly popular but have no commercial value. E.g., for "tying shoes" (I know, not a very exciting example) I find 6,600 monthly searches in the US and a Moz ranking difficulty of 50, and for "ty slippers" 390 monthly searches and a Moz ranking difficulty of 65, whereas you would expect it the other way around if there was causation. So I would say there is correlation but no causality. Just my 50 cents

    | Moreleads
    1

  • Hi Ruchy, I've answered your questions below:

    Risks involved: The most obvious is that you're duplicating your pages, and therefore you'll need to make sure you correctly indicate that these are AMP versions of your regular content (see the sketch below). There is also a risk that, because the page loads in the search result, you'll see less traffic directly to your site - but usually (at least for publishers) this is worth it nonetheless, because the impressions are higher (due to being featured in the news carousel at the top of the SERP).

    The best ways to implement it: Depends on your existing setup. If you have a custom CMS, get your dev team to build it into the next iteration of the CMS. If you use WordPress, you can use a WP plugin - Yoast has a good post about this: https://yoast.com/wordpress-amp-part-ii/

    Why it is worth it: It may not be. It's definitely worthwhile for publishers or sites which publish content for Google News. For other types of site, I would recommend checking whether AMP is a common feature for your primary keywords, and deciding accordingly. It's also worth it if you're seeing a loss of general traffic and suspect it is due to AMP being more present in search. As James notes, it can help with page speed, but that's not enough of a reason to do it.

    Are there any specific questions I should ask a potential developer, or information he should be aware of? Has he/she done this before? Can they provide working examples? Do they understand how to make sure it's showing up as AMP rather than purely duplicate content? Are they familiar with which elements are supported and unsupported? Are they able to also implement relevant structured data markup?

    Is there a way for it to be done for pages that have more than just text, like quote forms, sliding headers, etc.? Should we only do it for the blog section of our site? You can add things like video, forms, etc., but bear in mind that the goal is to be as minimalist as possible, so any sort of fancy design element will likely not be supported. I would focus on implementing it on news or other types of content which currently display AMP results in the SERP.
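
    On correctly indicating the AMP versions: the standard pattern is a pair of link tags, roughly like this sketch (the example.com URLs are placeholders):

        <!-- On the regular page: -->
        <link rel="amphtml" href="https://example.com/article/amp/">

        <!-- On the AMP page, pointing back: -->
        <link rel="canonical" href="https://example.com/article/">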

    | bridget.randolph
    0

  • In theory, there should be no difference - the canonical header should mean that Google treats the inclusion of /images/123456 exactly the same as the inclusion of /images/golden-retriever. It is slightly messier, so if it were easy, I'd go down the route of only ever using the /golden-retriever version - but if that's difficult, this is theoretically the same and so should be fine.
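
    For clarity, the canonical header in question is an HTTP Link header on the /images/123456 response, along these lines (example.com is a placeholder):

        Link: <https://example.com/images/golden-retriever>; rel="canonical"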

    | willcritchlow
    1

  • Yeah, I'm with Alan - what works in theory and what works in practice can really vary with the site - if you do both at once and something goes wrong, you won't know which tactic was the problem. Even if it goes right, you won't know which one worked. Keep in mind, too, that 301s and canonicals have different purposes. They may act similarly for search, in some cases, but they're very different for site visitors. A 301-redirect will actually take the visitor to a new page. A canonical will let the duplicate URL be visited but remove it from search results. In most cases, one is more appropriate to your situation than the other.
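
    A quick sketch of the difference (example.com and the paths are placeholders):

        # 301 (e.g. in Apache config): the visitor is physically sent to the new URL.
        Redirect 301 /old-page https://example.com/new-page

        <!-- Canonical (in the duplicate page's <head>): the page stays
             visitable, but search engines consolidate to the target. -->
        <link rel="canonical" href="https://example.com/new-page">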

    | Dr-Pete
    0

  • Here's a good guide from Moz on this topic: https://moz.com/learn/seo/search-operators
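
    For example, operators can be combined, along these lines (example.com is a placeholder):

        site:example.com intitle:"golden retriever" -inurl:blog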

    | Packaging-Group
    1

  • Hi Camilo, thanks for the clarification. As this content will not be available anymore on the "old" pages and will only exist on the newly created blog pages, there will be no duplicate content issues. These newly created blog pages, with the content removed from the "old" pages, will start from scratch for ranking, and the "old" pages could lose some ranking because you reduced their on-page content. This is not per se bad, as it is part of your strategy to improve conversions (which should be the most important KPI anyway). You could help these new blog pages a bit by linking to them from the "old" pages.

    | Moreleads
    0

  • I would also look at how you are currently managing the sitemaps between the two platforms to make sure that the new pages have not been left out of the sitemap generation.

    | Packaging-Group
    0

  • Thanks, Stephan. It took nearly a month for Search Console to display the majority of our pages in the sitemap as indexed, even though the pages showed up much earlier in the SERPs. We had it split into 30 different sitemaps. Later we also published a sitemap index, and saw a nice increase in indexed pages a few days later, which may have been related. Google is now indexing 88% of our sitemap. Do you think that, in general, 88% is a somewhat normal percentage for a site of this size, or would you normally expect a higher percentage of indexed sitemap pages and investigate deeper for potential pages that Google may consider thin content? I can rule out navigation as a reason.
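
    For reference, a sitemap index like the one we published is just a small XML file pointing at the individual sitemaps, roughly like this sketch (example.com and the file names are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>https://example.com/sitemap-1.xml</loc>
          </sitemap>
          <sitemap>
            <loc>https://example.com/sitemap-2.xml</loc>
          </sitemap>
          <!-- ... one <sitemap> entry per file, up to sitemap-30.xml -->
        </sitemapindex>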

    | lcourse
    0