Adding non-important folders to the disallow list in robots.txt
-
Hi all,
If we have many non-important folders like /category/ in the blog, these multiply the links. They are strictly for users, who access them very rarely, not for bots. Can we add such folders to the disallow list in robots.txt to stop link juice passing through them, so that internal linking is minimised to an extent? Can we add any such paths or pages to the disallow list? Will this work purely technically, or is there any penalty?
Thanks,
Satish
-
Hi,
Yes, you can block them in robots.txt. You can also add rel="nofollow" to individual links if you don't want to pass link juice through them.
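For example, assuming the folders you want blocked sit under /category/ (a hypothetical path), a minimal robots.txt entry might look like:

```
User-agent: *
Disallow: /category/
```

This tells all compliant crawlers not to fetch URLs under that path. Note that a disallowed URL can still appear in search results if other pages link to it, since robots.txt controls crawling, not indexing.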
A nofollowed link would look like `<a href="https://www.example.com" rel="nofollow">No Link Juice</a>`. Hope this helps. Thanks -
This is a great resource for all things robots.txt related: [http://www.robotstxt.org/robotstxt.html](http://www.robotstxt.org/robotstxt.html) -
But as per the current SEO buzz, internal nofollow leads to waste of link juice rather than preserving it. Moreover, some suggest not using nofollow internally at all.