The Moz Q&A Forum


    Block an entire subdomain with robots.txt?

    Intermediate & Advanced SEO
    • AdoptionHelp

      Do they both point to the same location on the server? So there's not a different folder for the subdomain?

      If that's the case, then I suggest adding a rule to your .htaccess file to 301 the subdomain back to the main domain, in exactly the same way people redirect from non-www to www or vice versa. However, you should ask why the server is configured to serve a duplicate subdomain in the first place. You might just edit your Apache settings to get rid of that subdomain (usually done through a cPanel interface).

      Here is what your .htaccess might look like:

      <IfModule mod_rewrite.c>
        RewriteEngine on
        # Redirect non-www to www
        RewriteCond %{HTTP_HOST} !^www\.mydomain\.org [NC]
        RewriteRule ^(.*)$ http://www.mydomain.org/$1 [R=301,L]
      </IfModule>

      • kylesuss @AdoptionHelp

        Hey Ryan,

        I wasn't directly involved in the decision to create the subdomain, but I'm told it was necessary in order to bypass certain elements that were affecting the root domain.

        Nevertheless, it is a blog, and users now need to log in to the subdomain in order to access the WordPress backend to bypass those elements. Traffic for the site still goes to the root domain.

        • john4math

          Placing canonical tags isn't an option?  Detect that the page is being viewed through the subdomain, and if so, write the canonical tag on the page back to the root domain?

          Or, just place a canonical tag on every page pointing back to the root domain (so the subdomain and root domain pages would both have them).  Apparently, it's ok to have a canonical tag on a page pointing to itself.  I haven't tried this, but if Matt Cutts says it's ok...
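          To make the second idea concrete, a self-referencing canonical means both copies of a page carry the same tag pointing at the root domain. A sketch, where "www.mydomain.org" and the post path are hypothetical stand-ins:

          ```html
          <!-- Served identically in the <head> of both (hypothetical URLs):
               http://www.mydomain.org/example-post/
               http://subdomain.mydomain.org/example-post/ -->
          <link rel="canonical" href="http://www.mydomain.org/example-post/" />
          ```

          The root-domain copy then canonicalizes to itself, and the subdomain copy points back at the root.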

          • sprynewmedia

            Sounds like (from other discussions) you may be stuck requiring a dynamic robots.txt file which detects what domain the bot is on and changes the content accordingly. This means the server has to run all .txt files as (I presume) PHP.
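            A minimal sketch of that dynamic robots.txt, assuming a PHP-capable server and the hypothetical host subdomain.website.com (the server would need to route /robots.txt to this script, or execute .txt files as PHP as noted above):

            ```php
            <?php
            // Sketch (untested): choose the robots.txt content based on which
            // host the bot requested. "subdomain.website.com" is a stand-in.
            function robots_for_host(string $host): string {
                if ($host === 'subdomain.website.com') {
                    return "User-agent: *\nDisallow: /\n";  // block everything on the subdomain
                }
                return "User-agent: *\nDisallow:\n";        // allow everything on the root domain
            }

            header('Content-Type: text/plain');
            echo robots_for_host($_SERVER['HTTP_HOST'] ?? '');
            ```
            
            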

            Or, you could conditionally rewrite the /robots.txt URL to a new file according to the sub-domain:

            RewriteEngine on
            RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$
            RewriteRule ^robots\.txt$ robots-subdomain.txt

            Then add:

            User-agent: *
            Disallow: /

            to the robots-subdomain.txt file

            (untested)

            • kylesuss @john4math

              We have a plugin right now that places canonical tags, but unfortunately, the canonical for the subdomain points to the subdomain. I'll look around to see if I can tweak the settings.

              • kylesuss @sprynewmedia

                Thanks for the suggestion. I'll definitely have to do a bit more research into this one to make sure that it doesn't have any negative side effects before implementation.

                • sprynewmedia @kylesuss

                  Option 1 could come with a small performance hit if you have a lot of .txt files being served on the server.

                  There shouldn't be any negative side effects to option 2 as long as the rewrite is clean (i.e., not accidentally a redirect) and the contents of the two files are robots-compliant.

                  Good luck

                  • kylesuss @kylesuss

                    Awesome. We used your second idea and so far it looks like it is working exactly how we want. Thanks for the idea.

                    Will report back to confirm that the subdomain has been de-indexed.

                    • sprynewmedia @kylesuss

                      You should submit a removal request in Google Webmaster Tools. You have to verify the sub-domain first, then request the removal.

                      See this post on why the robots file alone won't work...

                      http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts

                      • kylesuss @kylesuss

                        Yeah. As of yet, the site has not been de-indexed. We placed the conditional rule in .htaccess and are getting different robots.txt files for the domain and subdomain, so that works. But I've never done this before, so I don't know how long it's supposed to take.

                        I'll try to verify via Webmaster Tools to speed up the process. Thanks.

                        • sprynewmedia @kylesuss

                          Fact is, the robots file alone will never work (the link has a good explanation why; short form: it only stops the bots from crawling the pages again, it doesn't remove what's already in the index).

                          Best to request removal then wait a few days.

                          • kylesuss @kylesuss

                            Awesome! That did the trick -- thanks for your help. The site is no longer listed 🙂
