> In the home directory's public area (e.g. ~/public_html) of all Web sites,
> add the file robots.txt containing the text:
> User-agent: *
> Disallow: /
Each domain has a different document root, so search bots look for robots.txt under
each directory (or maybe I am wrong).
Every day we add new domains, because this is our testing environment.
I am looking for a more universal trick.
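One common approach, assuming the sites are served by Apache, is to map /robots.txt for every virtual host to a single shared file from the global server config, so newly added domains pick it up automatically without touching their document roots. The file path below is hypothetical; the Alias and Files directives are standard mod_alias/core features:

```apache
# Global Apache config (outside any <VirtualHost> block).
# Virtual hosts that do not define their own alias inherit this,
# so every domain serves the same deny-all robots.txt.
Alias /robots.txt /etc/apache2/robots-deny-all.txt

<Directory /etc/apache2>
    <Files robots-deny-all.txt>
        Require all granted
    </Files>
</Directory>
```

Where /etc/apache2/robots-deny-all.txt contains the two lines from the quoted suggestion (User-agent: * / Disallow: /). As a belt-and-braces measure, mod_headers can additionally send an `X-Robots-Tag: noindex, nofollow` header globally, which covers crawlers that honor the header even on resources other than HTML pages.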