The Moz Q&A Forum


    Help with Robots.txt On a Shared Root

    Intermediate & Advanced SEO
    • Whittie

      Hi,

      I posted a similar question last week asking about subdomains, but a couple of complications have arisen.

      Two different websites I am looking after share the same root domain, which means they have to share the same robots.txt. Does anybody have suggestions for separating the two within the same file without complications? It's a tricky one.

      Thank you in advance.

      • donford

        The subdomain has to be separated from the root in some fashion. Depending on your host, I would assume there is a separate folder for the subdomain's files; otherwise it would be chaos. Say you installed a forum on your forum subdomain and an e-commerce store on your shop subdomain: which index.php page would be served?

        There has to be some separation. Review your file manager and look for the subdomain folders. Once you find them, simply put a robots.txt into each of those folders.
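        As a rough sketch (the folder and domain names here are hypothetical), a typical shared-host layout where each subdomain has its own document root might look like this:

        ```
        public_html/                  <- document root for domain1.com
        |-- robots.txt                <- served at domain1.com/robots.txt
        |-- forums/                   <- document root for forums.domain1.com
        |   `-- robots.txt            <- served at forums.domain1.com/robots.txt
        `-- shop/                     <- document root for shop.domain1.com
            `-- robots.txt            <- served at shop.domain1.com/robots.txt
        ```

        Crawlers only ever request /robots.txt from the root of each hostname, so whichever file sits in a subdomain's document root is the one that subdomain serves.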

        Hope this helps,

        Don

        • Whittie @donford

          The developer of the website insists that they have to share the same robots.txt. I am really not sure why he has set it up this way; I am beyond befuddled!

          • donford

            What host are you using?

            • Whittie

              It's Fasthosts. The developer is certain that we can't use two separate robots.txt files. The second website has been set up with a 303 redirect.

              • donford

                Okay, so if you're using a 303, you're saying the content you want for site X is actually located at site Y, which means you do not have two different subdomains. So there is no need for two robots.txt files, and your developer is correct that you can't use two robots.txt files: since one site points to the other, you only have one subdomain.

                However, a 303 is generally a poor choice of redirect and should likely be a 301, though I would have to understand why the 303 is being used to say that with 100% certainty.
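                If the host runs Apache and allows .htaccess overrides (an assumption; the domain below is a placeholder), swapping the 303 for a 301 can be a one-line sketch using mod_alias:

                ```apache
                # Hypothetical .htaccess in the document root of the redirecting site.
                # Replaces the temporary 303 with a permanent 301 so search engines
                # consolidate signals onto the destination domain.
                Redirect permanent / https://thatdomain.com/
                ```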

                Hope this answers the question,

                Don

                • Whittie @donford

                  Thanks for your help so far.

                  The two websites are on different domain names but share the same root, as it's been built this way on Typo3. I don't know the developer's justification for the 303; it's something I wish we could change.

                  I'm not sure if there are specific rules you can put in the single robots.txt to differentiate the two; I have read a few conflicting arguments about how to do it.

                  • donford

                    Can you provide an example of the way the domains look, specifically where the root pages are?

                    Domain1.com

                    forums.domain1.com?

                    Additionally, if you are 303-redirecting one of the domains to the other, why do you want two different robots.txt files? The one being redirected will always end up at the other.

                    Depending on the structure, you can create one robots.txt file that handles two different domains, provided there is something unique about the root folders.

                    • Whittie @donford

                      It's not so much that one is a subdomain; it's that they are as different as Google and Yahoo, yet they share the same root. I wish I could show you, but I can't because of confidentiality.

                      The 303 wasn't put in place by me; I would have strongly suggested another method. I think it was set up so that both websites could be controlled from the same login, but it's opened a can of worms for SEO.

                      It's not that I want the two separate robots.txt files; the developer insists it has to be this way.

                      • donford

                        Okay, so if you have one root domain, you can only have one robots.txt file.

                        The reason I asked for an example is in case there was something you could put in the robots.txt to differentiate the two.

                        For example, if you have

                        thisdomain.com and thatdomain.com

                        If "thatdomain.com" uses a folder called shop ("thatdomain.com/shop"), then you could prefix all of that site's robots.txt entries with /shop, provided that "thisdomain.com" doesn't also use a shop folder. All the /shop entries would then apply only to "thatdomain.com". Does this make sense?
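                        The /shop-prefix approach above can be sanity-checked with Python's standard urllib.robotparser (the rules and paths below are hypothetical examples, not the actual sites' files):

                        ```python
                        from urllib.robotparser import RobotFileParser

                        # Hypothetical shared robots.txt: entries prefixed with /shop
                        # apply only to the site whose content lives under that folder.
                        robots_txt = """\
                        User-agent: *
                        Disallow: /shop/checkout/
                        Disallow: /shop/cart/
                        """

                        parser = RobotFileParser()
                        parser.parse(robots_txt.splitlines())

                        # Paths under /shop obey the /shop rules...
                        print(parser.can_fetch("*", "/shop/checkout/"))  # False
                        # ...while paths outside /shop are untouched.
                        print(parser.can_fetch("*", "/about/"))          # True
                        ```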

                        Don

                        © 2021 - 2026 SEOMoz, Inc., a Ziff Davis company. All rights reserved. Moz is a registered trademark of SEOMoz, Inc.