Should I robots block site directories with primarily duplicate content?
-
Our site, CareerBliss.com, primarily offers unique content in the form of company reviews and exclusive salary information. As a means of driving revenue, we also have a lot of job listings in our /jobs/ directory, as well as educational resources in our /career-tools/education/ directory. The bulk of this content comes from feeds that also appear on other websites (i.e., it's duplicate content). Does it make sense to block these portions of our site via robots.txt?
My thinking is that doing so would reallocate our site authority, helping the /salary/ and /company-reviews/ pages rank higher, and those are the pages where most people find our site via search anyway.
e.g.:
http://www.careerbliss.com/jobs/cisco-systems-jobs-812156/
http://www.careerbliss.com/jobs/jobs-near-you/?l=irvine%2c+ca&landing=true
http://www.careerbliss.com/career-tools/education/education-teaching-category-5/
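For reference, blocking those two sections would look something like this in robots.txt (a sketch based on the directories named above):

```
# Prevent crawling of duplicate feed-based content
User-agent: *
Disallow: /jobs/
Disallow: /career-tools/education/
```

Note that this only prevents crawling; as discussed below, URLs already in the index can remain indexed even after being blocked this way.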
-
Personally, I'm not a fan of blocking them via robots.txt, because if you do and don't also remove the URLs from the index (you can remove the whole directory, as Matt said here), they will still be indexed.
There was a good post a while ago on the SEOmoz blog showing exactly that: your pages won't leave the index.
I think you should set these pages to "noindex, follow" instead. It has worked well for me.
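For reference, that would mean adding a robots meta tag to the <head> of each page in /jobs/ and /career-tools/education/ (a minimal sketch):

```
<!-- Tells search engines not to index this page, but still to follow
     its links, so link equity keeps flowing to /salary/ and
     /company-reviews/ pages -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt block, this lets crawlers fetch the page and see the directive, so already-indexed URLs can actually drop out of the index over time.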
Patrick