Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi there, This is a tremendously complicated question to answer, I'm afraid. The best thing I can advise is to use Google Search Console and do some fetch and renders as Google. Here are my thoughts: **Because SEO is critical to us, should we try to force SSR after a click?** Correct me if I'm wrong, but you basically mean creating additional pages? If that's the case, then creating "static" pages would make a lot of sense if you want to rank for these keywords. **How does the robot open links? Is it "open new tab" behaviour? Because this would trigger SSR on our side.** I'd suspect your best bet would be a combination of fetch and render from Search Console and checking your server logs; I don't personally think the "open new tab" behaviour is relevant here, as a robot crawls the site after all. Hopefully this starts the discussion so someone else can give you a better answer. Nick

    | NickSamuel
    0
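The server-log check suggested above can be sketched roughly like this. This is a minimal, hypothetical example that assumes a combined-format access log; note that verifying real Googlebot traffic should also include a reverse-DNS lookup, since anyone can fake the user agent.

```python
import re

# Hypothetical sketch: filter a combined-format access log for requests whose
# user agent mentions Googlebot, so you can see which URLs the crawler fetched.
LOG_PATTERN = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$'
)

def googlebot_hits(log_lines):
    """Return (path, status) pairs for requests claiming to be Googlebot."""
    hits = []
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits.append((m.group("path"), m.group("status")))
    return hits

# Two sample log lines: one Googlebot request, one ordinary visitor.
sample = [
    '66.249.66.1 - - [10/Mar/2019:10:00:00 +0000] "GET /products/widget HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Mar/2019:10:00:05 +0000] "GET /about HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample))  # only the Googlebot request should appear
```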

  • Yep, when/if Google shuts down search operators, a lot of processes will be FUBAR! 100% agree a full ecommerce-related technical audit is a must as a starting point. We recently saw a spider trap issue that was preventing a high percentage of a client's product pages from being crawled and indexed. Resolving it was a major 'quick win'. Thanks for weighing in, Everett. Appreciate it!

    | QubaSEO
    1

  • Hi there, I can't answer all of your questions, but Google literally announced we can delete old sitemaps in the new Search Console now: https://www.searchenginejournal.com/google-updates-the-sitemaps-report-in-search-console-adds-ability-to-delete-sitemaps/299495/ With this feature available, there are definitely more opportunities to test a few more sitemap submissions and to verify that all URLs have been crawled. If you could cross-reference this with server logs you would definitely be on to a winner; although to be fair, Googlebot crawling a URL doesn't automatically mean indexation! Good luck, Nick

    | NickSamuel
    0
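The sitemap-to-server-log cross-reference mentioned above could be sketched like this. This is a hypothetical example: the sitemap XML and crawled-URL set are made up, and in practice you would fetch the sitemap and extract the crawled URLs from your logs.

```python
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def uncrawled_urls(sitemap_xml, crawled_urls):
    """Return sitemap <loc> entries that never appear in the crawled-URL set."""
    root = ET.fromstring(sitemap_xml)
    locs = {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}
    return sorted(locs - set(crawled_urls))

# Hypothetical two-URL sitemap.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/page-2</loc></url>
</urlset>"""

# Suppose the logs show Googlebot only fetched the homepage.
print(uncrawled_urls(sitemap, {"https://example.com/"}))
```

Remember the caveat from the answer: a URL being crawled still doesn't guarantee it gets indexed.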

  • Both of these folks are spot on with their recommendations; just remember to keep an eye on Search Console so you can verify the progress!

    | NickSamuel
    0

  • Hi Sam, Even with that random bombshell announcement from Google about rel=next/prev, I still wouldn't worry about this too much. It will be interesting to see if anyone else has a different opinion though! I don't suppose you have explored server logs to validate the claim that Google is being inefficient in its crawling? I don't personally see how having a page 2 indexed equates to this, but understand in general where you are coming from with this! Kind regards, Nick

    | NickSamuel
    0

  • Hey OP, Please could you potentially reword your question to make it a bit simpler to answer? You = BetterRX.com (an app) Brand Competitor = BetterRxCard.com (prescription card) Where does Good RX come into it, sorry? Cheers, Nick P.S. Do BetterRX.com and BetterRXcard.com compete in the same industry? I think this is the case from Googling, but just wanted to be sure!

    | NickSamuel
    0

  • Hi there, Have the domain migration issues now resolved themselves? Hopefully it was only temporary! Nick

    | NickSamuel
    0

  • This all sounds good; just make sure, before you proceed, that you use GA to check what % of your SEO traffic (segment: "Organic") comes from these URLs. Don't act on a hunch, act on data!

    | effectdigital
    0

  • Google considers this to be spam. Sometimes pages get away with doing this, but generally you're going to eventually get a manual action reported in Search Console.

    | MichaelC-15022
    0

  • Are there any downsides? My answer to that is "I can't think of any". On the other hand... are there any upsides? My answer to that is... Google can be very slow to find, and then begin to use, canonical tag instructions. On a website that gets a few hundred thousand visits per month, I have seen Google take several weeks to a few months to begin using canonical tag instructions. On a website that gets millions of visits per month, I have seen Google take a few weeks to a month to follow them. In reverse, it takes them even longer - sometimes several months - to forget canonical instructions after the tags have been removed. If I were in your situation, I would take a "coming soon" approach with the website. On the /success-stories/ page I would simply place an announcement in a box that "success stories for nonprofits are coming soon"... then on the /success-stories/nonprofits/ page, I would give an enthusiastic description of "what will be here"... and much of that might be useful for when the full page is finally up. As for the /year/ pages, I would not make them until you have viable content to populate them. You can make them in a sandbox area, but just not upload or link to them until you have ready-for-visitors content. I find that sometimes a hard part of being a good webmaster is waiting until good content is ready.

    | EGOL
    0
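If you do deploy canonical tags as discussed above, it helps to spot-check that they are actually in place while you wait for Google to pick them up. A minimal, hypothetical sketch (a real audit tool should use a proper HTML parser; a regex is only fine for a quick check, and the example URL here is made up):

```python
import re

# Match a <link rel="canonical" href="..."> tag in page HTML.
CANONICAL = re.compile(r'<link\s+rel="canonical"\s+href="([^"]+)"', re.IGNORECASE)

def canonical_url(html):
    """Return the canonical href, or None if the page has no canonical tag."""
    m = CANONICAL.search(html)
    return m.group(1) if m else None

# Hypothetical page head with a canonical tag.
page = '<head><link rel="canonical" href="https://example.com/success-stories/"></head>'
print(canonical_url(page))
```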

  • Big sites with lots of linking root domains do get beaten in the SERPs by tiny websites with few linking root domains.  What it takes to beat them is great optimization, great content, external links connecting to the competing page, and a site that is highly respected by its visitors. This does not happen a lot.  But tiny sites that punch above their weight class often have the best information for their topic on the web, deeper information for their topic, and authors that have experience and reputation in their field.  They often receive much of their traffic by domain type-ins.  When Google sees people asking for them by name they are rewarded in the SERPs.

    | EGOL
    0

  • I can see cases for using the .edu domain. If you are in the business of educating people at your facility, at their facility, or on a website, then I would definitely start using the .edu domain. If you are a publisher of really good academic content that is recommended by professors and used by students, then I would use the .edu domain. There are many domains that clearly communicate the business of the organization. They bring "credibility even if undeserved"... and if you really deserve it, then the .edu domain could be like throwing gasoline onto a fire in terms of attracting natural links and pulling clicks in the SERPs. Which domain would you click (or type in) if you were looking for an educational organization named "Wilson"? Which one would you be more inclined to link to? Wilson.com or Wilson.edu? There are .edu domains being used by organizations that are in the business of education, but whose activities would mostly be considered something other than teaching students: Smithsonian.edu, Getty.edu, GIA.edu. Have you ever sent a link request to a website suggesting that they link to the most valuable page on the internet for a topic, and they write back... "Wow! That's a fantastic article, but we don't link to commercial websites"? If you have a website on wilson.com and send a link recommendation to loc.gov or nasa.gov, what are your chances of getting a link? Does that change if your website is on wilson.edu?

    | EGOL
    1

  • I don't know the answer to your question but can share my personal experience. I have waited as long as 10 months for old, redirected pages to be removed from Google's index.

    | DonnaDuncan
    0

  • Hi EGOL: Thanks so much for your response!! In 2018 we focused on UX; now the plan for 2019 is to improve content and build backlinks. After reading your answer, I am not sure about our SEO's plan for 6 blog posts per month. The topics would be highly researched and pertinent for visitors, answering potential real estate questions. From what you are saying, we should focus more on developing and improving internal content. Assuming the quality of the content is excellent, how many blog posts per month are optimal? Thanks, Alan

    | Kingalan1
    0

  • Hey Jeff, thank you for your input. So you just globally changed the permalink structure, put global redirects in place, and you didn't see a permanent loss in traffic? And you did that on multiple sites? If so, I'll most probably follow your path. Thanks again, Julien

    | julienraby
    0

  • It may potentially affect the rankings of: (1) pages without SSL, and (2) pages linking to pages without SSL. At first, not drastically - but you'll find that you get more and more behind until you wish you had just embraced HTTPS. The exception to this, of course, is if no one competing over the same keywords is fully embracing SSL. If the majority of the query-space's ranking sites are insecure, even though Google frowns upon that, there's not much they can do (they can't just rank no one!) So you need to do some legwork. See if your competitors suffer from the same issue. If they all do, maybe don't be so concerned at this point. If they're all showing signs of fully moving over to HTTPS, be more worried.

    | effectdigital
    1
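The second case above - pages linking to pages without SSL - can be spot-checked by scanning a page's HTML for insecure references. A minimal, hypothetical sketch (the sample markup is made up, and a real crawler would use an HTML parser rather than a regex):

```python
import re

# Match href/src attributes that point at plain-http URLs.
INSECURE = re.compile(r'(?:href|src)="(http://[^"]+)"')

def insecure_references(html):
    """Return all http:// links and embedded resources found in the HTML."""
    return INSECURE.findall(html)

# One insecure link and one secure image reference.
page = '<a href="http://example.com/old">old</a> <img src="https://cdn.example.com/x.png">'
print(insecure_references(page))  # only the http:// link should appear
```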

  • Hi Bhaskaran, Someone with more knowledge than me may be around soon to give you a more in-depth answer, but this should get you going. Google has a limited amount of resources to crawl the web, so to be efficient it schedules pages to be crawled based on a few different things. The one I suspect is affecting your crawl times is how often your content is updated. If Google does not believe that your site will be updated frequently, it sees no reason to crawl it that often. If you want Google to crawl more often, you need to give it a reason to do so: ensure your site is easily crawled by bots, has good internal linking, and keeps its content fresh and up to date. On a side note, as you are carrying out a domain migration it might be easier to redirect each page to the equivalent page on the new domain (301 redirect) and ensure your new domain has the points I raised above. I hope this helps!

    | MrWhippy
    0
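The 301 redirect mapping mentioned in the answer above - sending each old-domain URL to the same path on the new domain - can be sketched like this. The host names here are placeholders, not from the thread; in production this logic would live in your web server or CDN config rather than application code.

```python
# Assumed placeholder host names for illustration.
OLD_HOST = "www.old-domain.com"
NEW_HOST = "www.new-domain.com"

def redirect_target(url):
    """Return the 301 Location for an old-domain URL (same path on the
    new domain), or None if the URL is not on the old domain."""
    prefix = "https://" + OLD_HOST
    if url.startswith(prefix):
        return "https://" + NEW_HOST + url[len(prefix):]
    return None

print(redirect_target("https://www.old-domain.com/blog/post-1"))
```

Mapping each URL to its direct equivalent (rather than redirecting everything to the new homepage) is what preserves the most link equity during a migration.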