robots.txt is a mechanism that tells well-behaved bots which parts of a site they may crawl, or to stay away from the site entirely. It's purely advisory: compliant crawlers honor it, but nothing enforces it.
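For example, a minimal robots.txt for a forum might look like the sketch below (the paths are hypothetical; adjust them to your forum's layout):

```
# Allow everything except the admin and private-message areas
User-agent: *
Disallow: /admin/
Disallow: /private/

# Block one specific misbehaving crawler entirely
User-agent: BadBot
Disallow: /
```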
Running the forum's log files through a relatively modern log file analyzer will quickly show which visitors are bots and which are humans. Well-behaved bots usually identify themselves through the User-Agent HTTP header.
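If you want a quick impression before reaching for a full analyzer, a few lines of Python give a rough split. This is only a sketch, assuming the common Apache/nginx "combined" log format (User-Agent as the last double-quoted field on each line); the log path and the bot-keyword list are placeholders you'd adjust for your setup.

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path; point this at your server's log
BOT_HINTS = ("bot", "crawler", "spider", "slurp")  # common substrings in bot UAs

# In combined log format, the User-Agent is the last quoted field on the line.
ua_pattern = re.compile(r'"([^"]*)"\s*$')

bots, humans = Counter(), Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = ua_pattern.search(line)
        if not match:
            continue
        ua = match.group(1)
        # Crude classification: any bot keyword in the UA string counts as a bot.
        bucket = bots if any(h in ua.lower() for h in BOT_HINTS) else humans
        bucket[ua] += 1

print(f"bot requests:   {sum(bots.values())}")
print(f"human requests: {sum(humans.values())}")
for ua, n in bots.most_common(10):
    print(f"{n:8d}  {ua}")
```

Keyword matching like this only catches bots that identify themselves honestly; impolite scrapers faking a browser User-Agent will land in the "human" bucket, which is exactly why a real log analyzer (which also looks at request patterns) does a better job.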
Having the forum indexed by Google and the other major search engines is a good thing: it brings more people into the community.