The Moz Q&A Forum

    Subdomains - duplicate content - robots.txt

    Intermediate & Advanced SEO
    • EasyStreet

      Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in round-robin fashion.

      However, we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately redirected to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Any leads generated from agentsmith.corporatedomain.com are then always assigned to Agent Smith instead of the agent pool (we determine this by parsing the current host name).

      To avoid being penalized for duplicate content, any page viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and, where applicable, the contact email address.
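      As a sketch, the host-based lead routing described above amounts to a lookup on the request's host name. All names here are hypothetical and illustrative, not EasyStreet's actual implementation:

      ```python
      # Hypothetical sketch of host-based lead routing. The mapping and
      # host names are illustrative, not EasyStreet's actual code.
      AGENT_BY_SUBDOMAIN = {
          "agentsmith.corporatedomain.com": "Agent Smith",
      }

      ROUND_ROBIN_POOL = "round-robin pool"

      def assign_lead(host: str) -> str:
          """Assign a lead to the agent whose subdomain served the request,
          falling back to the shared pool for any other host (including
          the corporate domain itself)."""
          return AGENT_BY_SUBDOMAIN.get(host.lower(), ROUND_ROBIN_POOL)
      ```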

      Two questions:

      1. Can/should we use robots.txt or robots meta tags to tell crawlers to ignore these subdomains (but obviously not the corporate domain)?

      2. If the answer to question 1 is yes, would it be better for SEO to do that, or to leave things as they are?
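      For reference, if question 1 were answered with robots.txt, the file served from the agent subdomains (and only from them; the corporate domain would keep its normal robots.txt) would be a blanket disallow:

      ```
      User-agent: *
      Disallow: /
      ```

      Note that robots.txt blocks crawling, not indexing: a blocked URL can still appear in results if it is linked to, which is one reason the canonical-tag approach discussed below is often preferred for duplicate content.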

      • SeoStallion

        I would personally suggest using the canonical tag to identify the original content. For example, place a tag like this (the URL is illustrative) into the head of each page that carries the duplicate content:

        <link rel="canonical" href="https://www.corporatedomain.com/page.html" />

        This will ensure that the search engines know that the page is not the original content and that the URL in the link is where the original content is found.

        • EasyStreet

          Thanks SeoStallion.

          That is how we are handling it currently.

          • SeoStallion

            Sorry, God only knows how I missed that.

            Well, in that case I think you are doing what is recommended. I generally think of the canonical tag as similar to a 301 redirect: you are telling the search engines that the two pages should be treated as one, and then specifying which page is to be the front-man of the two.

            I think the normal procedure is to use robots.txt for private/personal information, and nofollow/noindex for duplicate content; however, the canonical tag is an easy solution to duplicate content, as it is simply one line in the header.
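            For completeness, the noindex alternative mentioned above would also be a single line, placed in the head of the agent-subdomain pages only (a sketch; the corporate pages would omit it):

            ```html
            <!-- Agent-subdomain pages only: keep the duplicate copy out of
                 the index, but let crawlers still follow its links. -->
            <meta name="robots" content="noindex, follow">
            ```

            Unlike robots.txt, this requires the page to be crawled to take effect, but it reliably keeps the duplicate out of the index.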


            © 2021 - 2026 SEOMoz, Inc., a Ziff Davis company. All rights reserved. Moz is a registered trademark of SEOMoz, Inc.