I need the ability to exclude directories, URL parameters, etc. from within some sort of crawl settings; other scanning programs provide this flexibility.
I realize that one can simply edit the robots.txt file, but many SaaS website platforms do not allow access to the robots file (if there is one), or access is limited, meaning the client must pay for customizations just to edit it.
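For context, the robots.txt workaround I'm referring to is just a couple of Disallow lines like the following (the paths here are made-up examples, not from a real site):

```
User-agent: *
Disallow: /admin/
Disallow: /*?sessionid=
```

The problem is that on locked-down SaaS platforms you often can't add even these two lines yourself, which is why an exclusion option inside the crawl settings would be so useful.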
Unfortunately, for many clients, I cannot use the scanning tools at Moz because of this. Does anyone know if there are plans to add exclusion options directly within the scanning tools?
Thanks!