Greetings, ACT-R users,

I'm new to this group and am starting to use ACT-R for research on human error in Brazil (Escola Politécnica, São Paulo University - USP).

One thing I'd like to know about is the perception and motor modules. Is there any implementation that allows ACT-R to interact not only with the listener or the experiment window, but with the entire environment of the operating system, seeing what is presented in other programs and generating keyboard input/output to any window or program, for example?
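To make the idea concrete, here is a rough sketch of the kind of OS-level bridge I have in mind (illustrative only, not something I assume already exists in ACT-R): capture the screen contents of arbitrary applications for the perception side, and synthesize keystrokes to whichever window has focus for the motor side, using a generic automation library such as Python's pyautogui.

    # Illustrative sketch only; pyautogui is a generic OS automation
    # library, not part of ACT-R. The function names below are my own
    # hypothetical placeholders for the two halves of such a bridge.
    import pyautogui

    def capture_environment():
        """Grab whatever is currently on screen, regardless of which
        program drew it, so a vision module could process it."""
        screenshot = pyautogui.screenshot()  # PIL Image of the full desktop
        return screenshot

    def send_keystrokes(text):
        """Synthesize keyboard events to whichever window has focus,
        standing in for motor-module output."""
        pyautogui.typewrite(text, interval=0.05)
        pyautogui.press('enter')

    if __name__ == '__main__':
        img = capture_environment()
        img.save('desktop.png')          # could be fed to a visual-feature extractor
        send_keystrokes('hello world')   # keystrokes go to the focused application

Is there an existing ACT-R device or extension that plays a role like this, or would it have to be built from scratch?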
Thank you.

Fábio Gutiyama
EPPCS - Departamento de Engenharia da Computação, Escola Politécnica.