The Moz Q&A Forum

    Noindexing Duplicate (non-unique) Content

    Intermediate & Advanced SEO
    • Philip-DiPatrizio @khi5

      Yes. It will remove /page-52 and EVERYTHING that exists in /oahu/honolulu/metro/waikiki-condos/. It will also remove everything that exists inside /page-52/ (if anything). The removal trickles down as far as the folders in that directory go.

      **Go to Google search and type this in:** site:honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/

      That will show you everything that's going to be removed from the index.

      • khi5 @Philip-DiPatrizio

        Thx, Philip. Most helpful. I will get on it.

        • khi5 @Philip-DiPatrizio

          Based on Google's own guidelines, it appears to be a bad idea to use the removal tool under normal circumstances (which I believe my site falls under): https://support.google.com/webmasters/answer/1269119

          It starts with: "The URL removal tool is intended for pages that urgently need to be removed—for example, if they contain confidential data that was accidentally exposed. Using the tool for other purposes may cause problems for your site."

          • Philip-DiPatrizio @khi5

            Good find. I've never seen this part of the help section. The recurring reason behind all of their examples seems to be: "You don’t need to manually remove URLs; they will drop out naturally over time."

            I have never had an issue, nor have I ever heard of anyone having an issue, removing URLs with the Removal Tool.  I guess if you don't feel safe doing it, you can wait for Google's crawler to catch up, although it could take over a month.  If you're comfortable waiting it out, have no reasons to rush it, AND feel like playing it super safe... you can disregard everything I've said 🙂

            We all learn something new every day!

            • khi5 @Philip-DiPatrizio

              lol - good answer, Philip. I hear you. What makes it difficult is the lack of crystal-clear guidelines from search engines... it is almost like they don't know themselves and each case is decided on a "what feels right" basis...

              • AlanMosley

                Remember that if you no-index pages, any link you have on your site pointing to those pages is wasting its link juice.

                This looks like a job for the canonical tag.

                • khi5 @AlanMosley

                  Hi Alan, thx for your comment. Let me give you an example, and if you have a thought that'd be great:

                  1. Condos on Island: http://www.honoluluhi5.com/oahu-condos/
                  2. Condos in City: http://www.honoluluhi5.com/oahu/honolulu-condos/
                  3. Condos in Region: http://www.honoluluhi5.com/oahu/honolulu/metro-condos/

                  Properties on the result page for 3) are all in 2), and all properties within 2) are within 1). Furthermore, for each of those URLs, the paginated pages (2 to n) are all different, since each property is different, so using canonical tags would not be accurate. 1 + 2 + 3 are all important keywords.

                  Here is what I am planning: add some unique content to the first page in the series for each of those URLs and include just the 1st page in the series in the index, but keep "noindex, follow" on pages 2 to n. The argument against could be "your MLS result pages will look too thin and not rank", but the other way of looking at it is "with potentially 500 or more properties on each URL, a bit of stats on page 1 will not offset all the MLS duplicate data, so even though the page may look thin, only indexing page 1 is the best way forward".
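
                  A rough sketch of what that plan looks like in markup (the page-2 path is illustrative, based on the URL patterns mentioned in this thread):

                  ```html
                  <!-- <head> of page 1, e.g. /oahu-condos/ : indexable, carries the unique content.
                       No robots meta is strictly needed here, since "index, follow" is the default. -->
                  <meta name="robots" content="index, follow">

                  <!-- <head> of pages 2 to n, e.g. /oahu-condos/page-2/ : kept out of the
                       index, but links to the individual properties are still followed -->
                  <meta name="robots" content="noindex, follow">
                  ```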

                  • Philip-DiPatrizio @khi5

                    Sounds like you should actually be using rel=next and rel=prev.

                    More info here: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
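
                    For reference, the markup that post describes looks like this on a middle page of a series (the paths are illustrative, following the example URLs in this thread):

                    ```html
                    <!-- <head> of /oahu-condos/page-2/ : a middle page in a paginated series -->
                    <link rel="prev" href="http://www.honoluluhi5.com/oahu-condos/">
                    <link rel="next" href="http://www.honoluluhi5.com/oahu-condos/page-3/">
                    ```

                    The first page in the series carries only rel="next", and the last page only rel="prev".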

                    • khi5 @Philip-DiPatrizio

                      Thx, Philip. I am using it already, but I thought adding "noindex, follow" to those paginated pages (on top of rel=next/prev) will increase the likelihood that G will NOT see all those MLS result pages as a bunch of duplicate content. Page 1 may look thin, but with some statistical data I will soon include, it is unique, and that uniqueness may offset the lack of indexed MLS result pages... not sure if my reasoning is sound. Would be happy to hear if you feel differently.

                      • AlanMosley

                        If you noindex, I don't think next/previous will have any effect.

                        If they are different, and the keywords are all important, why noindex?

                        • khi5 @AlanMosley

                          http://www.honoluluhi5.com/oahu-condos/ - this is an "MLS result page". That URL will soon have some statistics, and it will be unique (I will include it in the index). All the paginated pages (2 to n) hardly have any unique content. It is a great layout and users love it (AdWords campaign: the average user spends 9 min and views 16 pages on the site), but since it is MLS listings (shared amongst thousands of Realtors), Google will see "ah, these are duplicate pages, nothing unique". That is why I plan to index page 1 (the URL I listed) but keep all paginated pages, like http://www.honoluluhi5.com/oahu-condos/page-2, as "noindex, follow". Also, I want to rank for this URL: http://www.honoluluhi5.com/oahu/honolulu-condos/, which is a sub-category of the first URL where 100% of the content is exactly the same as on the 1st URL. So, I will focus on indexing just the 1st page and not the paginated pages. Unfortunately, G cannot see value in layout and design, and I can see how keeping all pages indexed could hurt my site.

                          Would be happy to hear your thoughts on this. I launched the site 4 months ago, with more unique and quality content than 99% of the other firms I am up against, yet nothing happens ranking-wise yet. I suspect all these MLS pages are the issue. Time will show!

                          • AlanMosley

                            There is nothing wrong with having duplicate content. It becomes a problem when you have a site that is all or almost all duplicate or thin content.

                            Having a page that is on every other competitor's site will not harm you; you just may not rank for it.

                            But noindexing can cause loss of link juice, as all links pointing to non-indexed pages waste their link juice. Using "noindex, follow" will return most of this, but still, there is no need to noindex.

                            • khi5 @AlanMosley

                              http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls - that is Rand's Whiteboard Friday from a few weeks ago, and I quote from the transcript:

                              "So what happens, basically, is you get a page like this. I'm at BMO's Travel Gadgets. It's a great website where I can pick up all sorts of travel supplies and gear. The BMO camera 9000 is an interesting one because the camera's manufacturer requires that all websites which display the camera contain a lot of the same information. They want the manufacturer's description. They have specific photographs that they'd like you to use of the product. They might even have user reviews that come with those.

                              Because of this, a lot of the folks, a lot of the e-commerce sites who post this content find that they're getting trapped in duplicate content filters. Google is not identifying their content as being particularly unique. So they're sort of getting relegated to the back of the index, not ranking particularly well. They may even experience problems like Google Panda, which identifies a lot of this content and says, "Gosh, we've seen this all over the web and thousands of their pages, because they have thousands of products, are all exactly the same as thousands of other websites' other products."

                              • AlanMosley

                                That's correct.

                                You won't rank for duplicate pages, but unless most of your site is duplicate you won't be penalized.

                                • khi5 @AlanMosley

                                  I am trying to rank for those duplicate-looking MLS pages, since that is what users want (they don't want my guide pages with lots of unique data when they are searching "....for sale"). I will add unique data to page 1 of these MLS result pages. However, pages 2-50 will NOT change (they stay duplicate-looking). If I have pages 1-50 indexed, the unique content on page 1 may look like a drop in the ocean to G, and that is why I feel including "noindex, follow" on pages 2-50 may make sense.

                                  • AlanMosley

                                    OK, if you use follow, that will be OK, but I would look at canonical or next/previous first.

                                    • khi5 @AlanMosley

                                      Thx, Alan. I am already using rel=next/prev. However, that means all those paginated pages will still be indexed. I am adding "noindex, follow" to pages 2-n and only leaving page 1 indexed. Canonical: I don't think that will work. Each page in the series shows different properties, which means pages 1-n are all different...

                                      • AlanMosley

                                        Canonical pages don't have to be the same.

                                        It will merge the content to look like one page.

                                        Good luck
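
                                        For anyone reading along, the canonical approach Alan suggests would look something like this on each paginated page (a sketch only, with illustrative paths; whether consolidating paginated pages onto page 1 is appropriate is exactly what is debated above):

                                        ```html
                                        <!-- <head> of /oahu-condos/page-2/ (and every other paginated page),
                                             pointing search engines at the first page as the canonical version -->
                                        <link rel="canonical" href="http://www.honoluluhi5.com/oahu-condos/">
                                        ```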
