Hi,
You can use a robots.txt file to stop crawlers from indexing unwanted subdomains and files.
For example, if I have a website at www.website.com and a subdomain such as sub.website.com, search engines treat the subdomain as a separate site, so the robots.txt file must be placed in the root of the subdomain itself (i.e. at sub.website.com/robots.txt). That file would read:
User-agent: *
Disallow: /
This tells all well-behaved crawlers not to crawl any page on sub.website.com, which will stop the subdomain from being indexed. Note that robots.txt is only a request: malicious bots can ignore it, and a page that is linked from elsewhere may still appear in results without its content.
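If you want to confirm that the rules behave as expected, Python's standard-library urllib.robotparser can parse the file and report whether a given URL would be allowed. This is a minimal sketch; the sub.website.com URL is just the example domain from above.

```python
from urllib.robotparser import RobotFileParser

# The two-line robots.txt served from the subdomain's root
rules = "User-agent: *\nDisallow: /"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Any URL on the subdomain is disallowed for every user agent
print(parser.can_fetch("*", "https://sub.website.com/page.html"))  # False
```

In a live setup you would point the parser at the real file with parser.set_url("https://sub.website.com/robots.txt") followed by parser.read().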