The Moz Q&A Forum


    How to compete with duplicate content in post panda world?

    Intermediate & Advanced SEO
    • CommercePundit

      I want to fix duplicate content issues on my eCommerce website.

      I have read a very valuable blog post on SEOmoz about duplicate content in a post-Panda world and applied all of its strategies to my website.

      Here is one example to illustrate:

      http://www.vistastores.com/outdoor-umbrellas

      Non-www version:

      http://vistastores.com/outdoor-umbrellas redirects to the home page.

      For HTTPS pages:

      https://www.vistastores.com/outdoor-umbrellas

      I have created a robots.txt file for all HTTPS pages, as follows:

      https://www.vistastores.com/robots.txt

      And I have set rel=canonical to the HTTP page, as follows:

      http://www.vistastores.com/outdoor-umbrellas
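A minimal sketch of what that canonical tag would look like, assuming it sits in the head of the HTTPS (and any other duplicate) rendering and points at the HTTP URL above:

```html
<!-- In the <head> of every duplicate rendering of the category page -->
<link rel="canonical" href="http://www.vistastores.com/outdoor-umbrellas">
```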

      Narrow-by search:

      My website has narrow-by-search navigation, which generates pages with the same meta info, as follows:

      http://www.vistastores.com/outdoor-umbrellas?cat=7

      http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG

      http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum

      I have restricted via robots.txt all dynamic pages generated by narrow-by search:

      http://www.vistastores.com/robots.txt

      And I have set rel=canonical to the base URL on each dynamic page.

      Order-by pages:

      http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name

      I have restricted all of these pages with robots.txt and set rel=canonical to the base URL.

      For pagination pages:

      http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2

      I have restricted all of these pages with robots.txt and set rel=next and rel=prev on all paginated pages.

      I have also set rel=canonical to the base URL.
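A minimal sketch of that pagination markup as it would appear on page 2, using the example URL above (the `p=3` next-page parameter is an assumption based on the `p=2` pattern, not verified against the live site):

```html
<!-- In the <head> of page 2 of the paginated category listing -->
<link rel="prev" href="http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name">
<link rel="next" href="http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=3">
<link rel="canonical" href="http://www.vistastores.com/outdoor-umbrellas">
```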

      I have applied all of these SEO suggestions to my website, but Google is still crawling and indexing 21K+ pages. My website has only 9K product pages.

      Google search result:

      https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520

      Over the last 7 days, my website's impressions and CTR have dropped by 75%.

      I want to recover and perform as well as before.

      I have explained my question at length because I want to recover my traffic as soon as possible.

      • KrisRoadruck

        Not a complete answer, but instead of rel=canonical-ing your dynamic pages you may just want to block them in robots.txt with something like:

        Disallow: /*?

        This will prevent Google from crawling any version of the page that includes a ? in the URL. rel=canonical is a suggestion, whereas robots.txt is more of a command.
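As a complete file, that rule would look something like the sketch below (the `User-agent: *` line is an assumption; the `*` wildcard in paths is an extension that Googlebot supports, not part of the original robots.txt standard):

```text
User-agent: *
# Block every URL that contains a query string
# (narrow-by filters, order-by sorting, pagination).
Disallow: /*?
```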

        as you can see from this query:

        https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520#sclient=psy-ab&hl=en&safe=off&pws=0&gl=US&source=hp&q=site:www.vistastores.com%2Fpatio-living-concepts%3F&pbx=1&oq=site:www.vistastores.com%2Fpatio-living-concepts%3F&aq=f&aqi=&aql=&gs_sm=e&gs_upl=8408l8408l1l8644l1l1l0l0l0l0l65l65l1l1l0&bav=on.2,or.r_gc.r_pw.r_cp.,cf.osb&fp=b03d3d8a434daa&biw=1599&bih=795

        Google has indexed 132 versions of that single page rather than following your rel=canonical suggestion.

        To further enforce this, you may be able to use a bit of PHP code to detect whether the URL is dynamic and apply a robots noindex, noarchive only to the dynamic renderings of the page.
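A minimal sketch of that PHP idea — the helper name and its placement in the head template are assumptions for illustration; it simply keys off whether the request carries a query string:

```php
<?php
// Hypothetical sketch: mark only dynamic renderings (URLs with a query
// string) as noindex, noarchive. The helper name is made up for illustration.

function is_dynamic_url(string $query_string): bool
{
    // Any non-empty query string (?cat=7, ?dir=asc&order=name, ...) means
    // this is a filtered/sorted/paginated rendering, not the base page.
    return $query_string !== '';
}

// Somewhere in the <head> template:
if (is_dynamic_url($_SERVER['QUERY_STRING'] ?? '')) {
    echo '<meta name="robots" content="noindex, noarchive">' . "\n";
}
```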

        I also believe there are some filtering tools for this right within Webmaster Tools; worth a peek if your site is registered.

        Additionally, where you are redirecting non-www subpages to the home page, you may instead want to redirect them to their www versions.

        This can be done in .htaccess like this:

        Redirect non-www to www:

            RewriteEngine On
            RewriteBase /
            RewriteCond %{HTTP_HOST} ^yourdomain\.com [NC]
            RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]

        This will likely provide both a better user experience and a better solution in Google's eyes.

        I'm sure some other folks will come in with some other great suggestions for you as well 🙂
