Questions
Is it hurting my SEO ranking if pages are forbidden in robots.txt?
Yes and no: excluding certain pages can actually benefit your rankings if the excluded pages would be considered duplicate content, either of your marketing pages or of each other. This is usually the case for blogs (think WordPress categories) or webshops (pagination, as well as single product pages reachable by different paths and therefore having different URLs).

As Ryan pointed out, control that at the page level via noindex,follow to allow PageRank to flow. Use noindex,nofollow for "internal" pages you don't want crawled.

I am not sure, but having 9,950 pages indexed that are considered duplicate content might hurt rankings for the other pages on that domain; Google might consider the whole domain spammy. If you need a specific hint for your domain, send me a PM and I'll have a look if time permits.
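To illustrate the page-level approach (a sketch; the page types mentioned are examples, not your actual URLs), you would put a robots meta tag in the `<head>` of the affected pages instead of blocking them in robots.txt. A robots.txt Disallow only stops crawling, so a blocked URL can still end up indexed via external links, and no PageRank flows through it; the meta tag avoids both problems:

```html
<!-- On a duplicate-prone page (e.g. a WordPress category or a paginated listing): -->
<!-- keep it out of the index, but let crawlers follow its links so PageRank flows -->
<meta name="robots" content="noindex,follow">

<!-- On an "internal" page whose outgoing links you don't want crawled either: -->
<meta name="robots" content="noindex,nofollow">
```

Note that for the meta tag to work, the page must remain crawlable: if you also Disallow it in robots.txt, the crawler never sees the tag.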
Search Engine Trends | Sebes0