The Moz Q&A Forum

    Insane traffic loss and indexed pages after June Core Update, what can i do to bring it back?

    Intermediate & Advanced SEO
muriloacct:

      Hello Everybody!

After the June Core Update rolled out, we saw a huge drop in traffic/revenue and in indexed pages in GSC (screenshot attached below).

The biggest problem: the pages that dropped out of the index are reported as "Blocked by robots.txt", and when we run the "Fetch as Google" tool it returns "Crawl Anomaly". Yet our robots.txt is completely clean, without any Disallow rules (and noindex isn't a robots.txt directive in any case), so I strongly suspect this error pattern started with the June Core Update.
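When GSC says "Blocked by robots.txt" despite a clean file, it can help to check programmatically what a robots.txt parser actually sees, since Google has historically treated a robots.txt that consistently returns a server error as disallow-all. A minimal sketch with Python's standard library; the file body and URL below are placeholders, not the poster's real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical "completely clean" robots.txt, as described in the post.
# In practice, fetch your live file (e.g. https://example.com/robots.txt)
# and also confirm it returns HTTP 200, not a 4xx/5xx.
robots_txt = """User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# If this reports False for Googlebot, something in the file really is
# blocking the crawl, regardless of how clean it looks at a glance.
for agent in ("Googlebot", "*"):
    print(agent, parser.can_fetch(agent, "https://example.com/some-page"))
```

If the parser allows everything but GSC still reports a block, the mismatch is usually on the serving side (a CDN or firewall returning a different robots.txt, or an error status, to Googlebot specifically).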

I've come up with some solutions, but none of them has worked so far:

1- Add hreflang to the domain:

We run sites in other countries, and ours seems to be the only one without hreflang tags. The June update reportedly aimed to limit results to two per domain in the SERPs (or more if Google deems them relevant), so the other country sites may have "taken our spot" in the SERPs; our domain is considerably newer than the others.
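If hreflang annotations do get added, one way to sanity-check that they actually appear in the served HTML is a small stdlib parser. This is only a sketch; the markup and domains below are illustrative placeholders, not the poster's real sites:

```python
from html.parser import HTMLParser

# Example <head> markup with the kind of hreflang annotations the post
# proposes adding (one entry per country site, plus an x-default).
sample_head = """
<link rel="alternate" hreflang="en-us" href="https://example.com/page" />
<link rel="alternate" hreflang="pt-br" href="https://example.com.br/page" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />
"""

class HreflangCollector(HTMLParser):
    """Collects rel="alternate" hreflang links from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.links[a["hreflang"]] = a["href"]

collector = HreflangCollector()
collector.feed(sample_head)
print(collector.links)
```

Note that hreflang must be reciprocal: each country site has to link back to the others, so every page in the cluster should yield the same set when checked this way.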

2- Manually request indexing for all the important pages that were lost

The idea was to refresh the content on each page (title, meta description, paragraphs and so on) and then use GSC's manual indexing request. None of that has worked either; the tool still reports "Crawl Anomaly".

3- Move to a new domain

If nothing else works, this should. We would register a new domain name and treat it as an entirely new site. (Frankly, there should be some other way out; this is for an EXTREME case, only if nobody can help us.)

I'm open to ideas. As the days go by, our organic revenue and traffic show no sign of recovering. I'm desperate for a solution.

Any ideas?

[screenshot: GSC drop in traffic and indexed pages]

Nozzle:

        What's your website?

muriloacct:

I can't share that information, @Nozzle.

          : /

          I don't think my company allows that!


          © 2021 - 2026 SEOMoz, Inc., a Ziff Davis company. All rights reserved. Moz is a registered trademark of SEOMoz, Inc.