The Moz Q&A Forum


    Does It Really Matter to Restrict Dynamic URLs by Robots.txt?

    Intermediate & Advanced SEO
• CommercePundit

Today, I was checking Google Webmaster Tools and found that 117 dynamic URLs are restricted by robots.txt. I have added the following directives to my robots.txt (there is more detail in the attached Excel sheet):

#Dynamic URLs

Disallow: /?osCsid
Disallow: /?q=
Disallow: /?dir=
Disallow: /?p=
Disallow: /*?limit=
Disallow: /*review-form

I am concerned about the following kinds of pages.

Sorting by specification:

      http://www.vistastores.com/table-lamps?dir=asc&order=name

Items per page:

      http://www.vistastores.com/table-lamps?dir=asc&limit=60&order=name

Product pagination:

      http://www.vistastores.com/table-lamps?p=2

Will this hurt the organic performance of my category pages?
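As an aside, it is worth checking which of these example URLs the Disallow patterns above actually match under the standard robots.txt rules (a directive is a prefix match, `*` matches any character sequence, and a trailing `$` anchors the end of the URL). A minimal, hypothetical sketch in Python — `rule_matches` is illustrative, not a library function:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Google-style robots.txt matching: '*' is a wildcard, a trailing
    '$' anchors the end of the URL, otherwise it is a prefix match.
    Hypothetical helper for illustration only."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.search("^" + regex + ("$" if anchored else ""), path) is not None

# The rules from the robots.txt above, and the URLs the poster is worried about.
rules = ["/?osCsid", "/?q=", "/?dir=", "/?p=", "/*?limit=", "/*review-form"]
urls = [
    "/table-lamps?dir=asc&order=name",
    "/table-lamps?dir=asc&limit=60&order=name",
    "/table-lamps?p=2",
]

for url in urls:
    blocked = [r for r in rules if rule_matches(r, url)]
    print(url, "->", blocked or "not matched")
```

Under standard prefix matching, none of the three category URLs above is matched: `/?dir=` and `/?p=` only match at the site root, and `/*?limit=` requires the literal `?limit=`, which does not occur when `limit` is not the first query parameter. A leading wildcard (e.g. `/*?dir=`) would be needed to match parameters on category pages.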

• SanketPatel

        Hi,

Instead of blocking those URLs, you can use the "URL Parameters" setting in Google Webmaster Tools. You will find parameters like "dir" and "p" listed there; select the appropriate option for each one to tell Google what actually happens when that parameter is present.

• Cyrus-Shepard

          Robots.txt isn't the best solution for dynamic URLs. Depending on the type of URL, there are a number of other solutions available.

1. As blurbpoint mentions, Google Webmaster Tools allows you to specify how URL parameters are handled. Google actually does a decent job of this automatically, but it also gives you the option to change the settings yourself.

          http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687

          2. Identical pages with different parameters can create duplicate content, which is often best handled with canonical tags.
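For instance (using the poster's URLs purely as an illustration), a sort-order variant of a category page can point back to the base category with a single tag in its `<head>`:

```html
<!-- On http://www.vistastores.com/table-lamps?dir=asc&order=name -->
<link rel="canonical" href="http://www.vistastores.com/table-lamps" />
```

Google then treats the parameterised variants as alternates of the base URL and consolidates their ranking signals there.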

          3. Parameters that result in pagination may require slightly nuanced solutions. I won't get into them all here but Adam Audette gives a good overview of pagination solutions here: http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284

          Hope this helps. Best of luck with your SEO!

• CommercePundit @Cyrus-Shepard

I am quite late in adding my reply to this question, because I was busy fixing the dynamic URL issues.

I have made the following changes to my website.

1. I have rewritten all dynamic URLs as static ones, except for session IDs and the internal search option, because I have restricted both of those via robots.txt.
2. I have set canonical tags on near-duplicate pages, as Dr. Pete described in "Duplicate Content in a Post-Panda World".

Here is a live example to show what I mean.

            Base URL: http://www.vistastores.com/patio-umbrellas

Rewritten URLs: These were dynamic, but I have rewritten them as static URLs. Each of these near-duplicate pages carries a canonical tag pointing to the base URL:

            http://www.vistastores.com/patio-umbrellas/shopby/limit-100
            http://www.vistastores.com/patio-umbrellas/shopby/lift-method-search-manual-lift
            http://www.vistastores.com/patio-umbrellas/shopby/manufacturer-fiberbuilt-umbrellas-llc
            http://www.vistastores.com/patio-umbrellas/shopby/price-2,100
            http://www.vistastores.com/patio-umbrellas/shopby/canopy-fabric-search-sunbrella
            http://www.vistastores.com/patio-umbrellas/shopby/canopy-shape-search-hexagonal
            http://www.vistastores.com/patio-umbrellas/shopby/canopy-size-search-7-ft-to-8-ft
            http://www.vistastores.com/patio-umbrellas/shopby/color-search-blue
            http://www.vistastores.com/patio-umbrellas/shopby/finish-search-black
            http://www.vistastores.com/patio-umbrellas/shopby/p-2
            http://www.vistastores.com/patio-umbrellas/shopby/dir-desc/order-position

Now I am waiting to see how Google crawls and treats all the canonical pages. I am excited to see how organic rankings change as PageRank is distributed across the site. Thanks for your insightful reply.
