Questions
Site hit by algorithmic update in October 2014 - filters and thin-content queries.
If the filters work via URL parameters, you can go into GWT (Google Webmaster Tools) and add them to the URL parameter exclusion list. Robots.txt could also apply if you were able to serve the filters from a dedicated folder through a bit of URL rewriting. Mostly it's a question of getting Google to ignore the duplicate content, so research specific to your client's situation will be a good start.
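The folder-vs-parameter approach above can be sketched in robots.txt. This is a hedged illustration only: `/filter/` and `?color=` are placeholder names, not paths from the original question.

```
# Hypothetical sketch: stop Googlebot crawling duplicate filter URLs.
# /filter/ and ?color= are placeholders for illustration.
User-agent: *

# If filters have been rewritten into a dedicated folder:
Disallow: /filter/

# If filters are still served as query parameters:
Disallow: /*?color=
```

Keep in mind robots.txt only blocks crawling; URLs Google already knows about can stay indexed, which is why the GWT parameter tool mentioned above is usually the safer first step.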
Technical SEO Issues | RyanPurkey
Is this 404 page indexed?
You do often see that when the engine knows about the URL but isn't able to crawl the page. It should eventually drop out of the index. If you have control of the site, you can also go into GWT and request removal of that URL from the index, since it already returns a 404.
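Before requesting removal, it's worth confirming the URL genuinely returns a 404 status code rather than a "soft 404" (a 200 page that merely looks like an error). A minimal, self-contained sketch in Python, using a throwaway local server so it runs anywhere (`/old-page` is a placeholder path, not from the original question):

```python
# Sketch: verify a URL returns HTTP 404 before requesting removal in GWT.
# Uses a local test server that 404s every path, standing in for the real site.
import http.server
import threading
import urllib.error
import urllib.request

class NotFoundHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_error(404)  # every path 404s, like a removed page

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), NotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/old-page"
try:
    urllib.request.urlopen(url)
    status = 200
except urllib.error.HTTPError as e:
    status = e.code  # the status Google's crawler would see

print(status)  # 404
server.shutdown()
```

If this prints 200 instead, the page is a soft 404 and a removal request is unlikely to stick.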
Technical SEO Issues | KeriMorgret