The Moz Q&A Forum



    I am trying to better understand solving the duplicate content issues highlighted in your recent crawl report of our site - www.thehomesites.com.

    On-Page / Site Optimization
    • urahul last edited by

      Below are some of the URLs highlighted as having duplicate content:
      http://www.thehomesites.com/zip_details/76105 
      http://www.thehomesites.com/zip_details/44135 
      http://www.thehomesites.com/zip_details/75227 
      http://www.thehomesites.com/zip_details/94501

      These are neighborhood reports generated for 4 different zip codes. We use a standard template to create these reports. What are some of the steps we can take to avoid these pages being categorized as duplicate content?

      • hectormainar last edited by

        Basically, all the text on your pages is identical except for a few numbers, which in proportion amount to very little text, and some meta tags.

        You should mix the standard template with some kind of database-driven information for each neighbourhood, for example a short description of the area or visitor comments. If you want something more automatic than a written description, you could query some kind of web service that returns the most important streets in each district: that would create different text on every page without manual work.

        The only way to avoid that duplicate content is, in fact, to have different content :(.
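        hectormainar's suggestion can be sketched as a template merge, assuming a small per-zip lookup table stands in for the real database or web service. The descriptions and street names below are invented placeholders, not facts about these areas:

        ```python
        # Sketch of mixing a shared template with per-zip data so each page
        # carries unique text. AREA_DATA stands in for a real database or web
        # service; descriptions and street names are invented placeholders.

        AREA_DATA = {
            "76105": {"description": "Placeholder description for area 76105.",
                      "streets": ["Example Ave", "Sample Blvd"]},
            "44135": {"description": "Placeholder description for area 44135.",
                      "streets": ["Demo St", "Test Rd"]},
        }

        SHARED_TEMPLATE = (
            "<h1>Neighborhood report for {zip}</h1>"
            "{unique}"
            "<p>...shared report boilerplate...</p>"
        )

        def render_zip_page(zip_code: str) -> str:
            """Merge the shared template with area-specific text for one zip code."""
            data = AREA_DATA[zip_code]
            streets = "".join(f"<li>{s}</li>" for s in data["streets"])
            unique = f"<p>{data['description']}</p><ul>{streets}</ul>"
            return SHARED_TEMPLATE.format(zip=zip_code, unique=unique)
        ```

        With this shape, pages differ by a full paragraph and a list per zip code rather than by a handful of numbers.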

        • Stramark last edited by

          If you want to rank with a unique page for each zip code, you will have to do what hectormainar said.

          Alternatively, you could use a canonical tag to point each zip page to a page for a larger, more general postal area, so you do not create as much duplicate content.
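          Stramark's canonical option might look like the sketch below. The `/region_details/` URL pattern and the zip-to-region mapping are assumptions for illustration, not the site's actual structure:

          ```python
          # Sketch of pointing each narrow zip page's rel=canonical at a broader
          # region page, so crawlers consolidate the near-duplicates. The URL
          # path and the zip-to-region mapping are assumed for illustration.

          REGION_OF_ZIP = {
              "76105": "fort-worth",
              "75227": "dallas",
          }

          def canonical_tag(zip_code: str) -> str:
              """Return a <link rel="canonical"> tag for a zip page, or '' if unmapped."""
              region = REGION_OF_ZIP.get(zip_code)
              if region is None:
                  return ""  # no broader page known; emit no canonical tag
              return (f'<link rel="canonical" '
                      f'href="http://www.thehomesites.com/region_details/{region}" />')
          ```

          The tag goes in each zip page's head; search engines then treat the region page as the one to index, which trades per-zip rankings for fewer duplicate-content flags.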

          • urahul @hectormainar last edited by

            Thanks for the suggestions, hectormainar.

            © 2021 - 2026 SEOMoz, Inc., a Ziff Davis company. All rights reserved. Moz is a registered trademark of SEOMoz, Inc.