Questions
Duplicate content issue with dynamically generated URLs
When I looked at your site, changing the search criteria changed the listings on the page, so each page was unique. However, I'm guessing all of the listings can also be reached just by clicking through the pages of results without changing the criteria? Rather than blocking the different versions of the search results pages outright, I would consider using the canonical link tag (rel="canonical") to point search engines at the main version of the page.
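As a rough sketch, the canonical tag goes in the `<head>` of each filtered or paginated variant of the results page; the URL below is a hypothetical example, not the actual site in question:

```
<!-- Placed in the <head> of every variant of the results page -->
<!-- example.com/listings/ is a stand-in for the real main results URL -->
<link rel="canonical" href="https://www.example.com/listings/" />
```

This way the variants can still be crawled and pass link equity, while search engines consolidate ranking signals onto the one canonical URL.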
On-Page / Site Optimization | AdamThompson | 0
Duplicate content question with PDF
Having duplicate content within your own site is not as big a deal as duplicate content copied from another site. Since you can't add a robots meta tag to a PDF, you'd have to use robots.txt (or an X-Robots-Tag HTTP header) to keep Google from indexing the PDFs. Nofollowing the links to them won't necessarily get, or keep, them out of the index. However, if people are linking to your PDFs, blocking them with robots.txt means you'll lose all the link juice pointed at them. Something to consider, at least.
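A minimal robots.txt sketch for this, assuming a hypothetical /pdfs/ directory (the paths are illustrative, not the asker's actual structure):

```
# Block crawling of a PDF directory for all crawlers
User-agent: *
Disallow: /pdfs/

# Or block by extension; Googlebot supports the * and $ wildcards,
# though not every crawler does
Disallow: /*.pdf$
```

Note the trade-off mentioned above: robots.txt prevents crawling, so blocked PDFs can't pass link equity, and a URL that is blocked but linked to externally can still appear in the index without a snippet. If the goal is strictly de-indexing while keeping link value, an `X-Robots-Tag: noindex` response header on the PDF files is the more precise tool.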
Technical SEO Issues | AdamThompson | 0