The Moz Q&A Forum

    Should you use robots.txt for pages within your site which do not have high-quality content or are not contributing a great deal, so that when Google crawls your site the best-performing content has a higher chance of being indexed?

    Technical SEO Issues
    • Jacksons_Fencing

      I'm really not sure what best practice is for this.

      • JordanLowry

        Is it possible to beef up those lower quality pages with better content? If they are important main content pages I would imagine you would want to improve those pages.

        However, if you were going to block them, I would recommend a noindex meta tag within the header of those pages.

        Hope that helps some.
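        For reference, the kind of tag described above is a meta robots directive placed in the page's head (a generic sketch, not tied to any particular site):

        ```html
        <head>
          <!-- Ask search engines not to index this page, but still follow its links -->
          <meta name="robots" content="noindex, follow">
        </head>
        ```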

        • Alick300

          Hi,

          Yes, you can block such pages in robots.txt. I would also like to let you know that if you don't want some pages indexed, you can use a noindex meta tag.

          I would go for noindex in your case.

          Hope this helps.

          Thanks
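          For what it's worth, blocking sections via robots.txt looks like this (the paths below are hypothetical examples):

          ```
          User-agent: *
          # Block crawling of thin or low-value sections (example paths)
          Disallow: /tag/
          Disallow: /print/
          ```

          Note that robots.txt blocks crawling, not indexing: a page blocked in robots.txt cannot have its noindex tag seen by Googlebot, so pick one approach per page rather than combining them.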

          • johnnybgunn

            I would definitely not block these pages.  You want to block as few pages as possible.

            1. These pages can be used to boost internal links by linking to your important pages.

            2. Google crawls thousands of pages; it will likely crawl all of your important and unimportant pages anyway.

            3. You can de-prioritize these pages in the XML sitemap, telling the spiders that there are more important pages to crawl.

            4. If these are similar pages, then use the URL parameter tool in Search Console to indicate a page might be a filtered version of a more important page.
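            Point 3 above can be sketched as a priority hint in the XML sitemap (URLs are hypothetical, and Google treats priority as a hint at most):

            ```xml
            <?xml version="1.0" encoding="UTF-8"?>
            <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
              <!-- Important page: high priority hint -->
              <url>
                <loc>https://www.example.com/key-product/</loc>
                <priority>1.0</priority>
              </url>
              <!-- Thin page: low priority hint -->
              <url>
                <loc>https://www.example.com/old-news/</loc>
                <priority>0.2</priority>
              </url>
            </urlset>
            ```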

            • Jacksons_Fencing @johnnybgunn

              Thank you for your answer John!


              © 2021 - 2026 SEOMoz, Inc., a Ziff Davis company. All rights reserved. Moz is a registered trademark of SEOMoz, Inc.