Blocking external links in Robots.txt - need advice on Best Practice
-
I look after an affiliate site that is doing quite well in the search rankings. We've been doing a review of our practices and one thing that has cropped up is our robots.txt.
In it, we disallow Google from crawling external links. This used to be best practice in the affiliate industry a couple of years ago, but I wanted to know if this is still the case, and what the implications are if we were to:
a/ leave it as is?
b/ allow crawling?
Thanks in advance.
-
I'm not sure how you're blocking Google from crawling external links in your robots.txt file, since robots.txt only controls crawling of URLs on your own host; typically you only block crawlers from internal paths on your site, not other people's domains.
If you're using a redirect script hosted on your own site to track clicks on external links, though, and you're blocking that script's path in robots.txt, then that's still fine. You may also want to add a rel="nofollow" attribute to those links so you don't pass link equity ("link juice") to those affiliates (unless you want to do that?).
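For reference, a common setup looks something like this (the /out/ path is just an example; substitute whatever directory your redirect script actually lives under):

```
# robots.txt — keep all crawlers out of the click-tracking redirect path
User-agent: *
Disallow: /out/
```

Then the affiliate links on your pages point through that blocked path, e.g. `<a href="/out/merchant-name" rel="nofollow">Merchant</a>`, so the crawler never follows the redirect and the nofollow keeps link equity from flowing to the affiliate.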
As far as external links go, though, it's typically okay (and expected) to link out to other sites.