"Search Engine Blocked by Robots.txt" warnings for filtered search result pages: why?
-
Hi,
We're getting yellow 'Search Engine Blocked by Robots.txt' warnings for URLs that are, in effect, product search filter result pages (see the example below) on our Magento ecommerce shop. To my mind our robots.txt file is correctly set up, i.e. we would not want Google to index these pages. So why does SEOmoz flag this type of page with a warning? Is there any implication for our ranking? Is there anything we need to do about it? Thanks.
Here is an example URL that SEOmoz thinks the search engines can't see.
http://www.site.com/audio-books/audio-books-in-english?audiobook_genre=132
Below are the current entries in our robots.txt file; a quick way to sanity-check them against a URL is sketched after the listing.
User-agent: Googlebot
Disallow: /index.php/
Disallow: /?
Disallow: /.js$
Disallow: /.css$
Disallow: /checkout/
Disallow: /tag/
Disallow: /catalogsearch/
Disallow: /review/
Disallow: /app/
Disallow: /downloader/
Disallow: /js/
Disallow: /lib/
Disallow: /media/
Disallow: /.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /skin/
Disallow: /utm
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Sitemap: -
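For what it's worth, here is a minimal sketch of how to test rules like these against a URL with Python's standard-library parser. A caveat: urllib.robotparser does plain prefix matching and does not implement the * and $ wildcard extensions that Googlebot (and some other crawlers) support, so its verdict can differ from what Googlebot or SEOmoz's crawler would conclude on wildcard rules. Only the unambiguous /catalogsearch/ rule is used here to keep the result deterministic, and the URLs reuse the placeholder domain from the question.

from urllib import robotparser

# Parse the Disallow rules the same way a plain prefix-matching crawler would.
rules = """\
User-agent: Googlebot
Disallow: /catalogsearch/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A blocked internal-search URL versus an ordinary category URL.
print(rp.can_fetch("Googlebot", "http://www.site.com/catalogsearch/result/?q=audio"))       # False: blocked
print(rp.can_fetch("Googlebot", "http://www.site.com/audio-books/audio-books-in-english"))  # True: allowed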
-
So what you're saying is:
1. SEOmoz says these pages can't get indexed by search engines because of our robots.txt.
2. We don't want these pages indexed, so we blocked them using robots.txt.
My initial reaction is: no problem. SEOmoz is just showing you a 'confirmation warning' that these pages are not indexed, and since you did that on purpose, it's okay.
Hope this helps!
-
Like Rick said, it's just a "hey, make sure you really wanted to do this" type of warning, since it's easy to write a robots.txt that blocks things you never meant to block. Someone else can also modify the robots.txt without telling you, and this warning is a cue to go find that person and get it fixed.
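A minimal sketch of the kind of check that catches an unannounced edit: fetch the live robots.txt and compare it against a saved copy. The URL reuses the placeholder domain from the question, and the snapshot filename is arbitrary.

import hashlib
import urllib.request
from pathlib import Path

ROBOTS_URL = "http://www.site.com/robots.txt"  # placeholder domain
SNAPSHOT = Path("robots_snapshot.txt")         # last known-good copy

with urllib.request.urlopen(ROBOTS_URL) as resp:
    live = resp.read()

if SNAPSHOT.exists() and hashlib.sha256(live).hexdigest() != hashlib.sha256(SNAPSHOT.read_bytes()).hexdigest():
    print("robots.txt has changed since the last check: review it before the next crawl")
else:
    SNAPSHOT.write_bytes(live)  # first run, or unchanged: refresh the snapshot

Run it from cron (or any scheduler) and route the print statement to whatever alerting you already use.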
-
Thanks, Rick, for your advice.
-
Thanks, Keri, for your advice.