[ACT-R-users] ACT-R robot
Greg Trafton
greg.trafton at nrl.navy.mil
Mon Jun 23 14:55:13 EDT 2008
We've been hooking up ACT-R to our robots for several years now.
We have a new version (OK, a couple of new versions) that basically
takes information from the robot's sensors, puts it into ACT-R's
audicon or visicon (as appropriate), and goes from there. We're
calling our modified version ACT-R/E (for Embodied).
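For anyone who wants to try something similar with a stock ACT-R 6
distribution, a rough sketch of one way to push sensor data into the
visicon and audicon is below. It relies on the standard device
interface (build-vis-locs-for, vis-loc-to-obj, install-device,
proc-display) and new-word-sound; it is only an illustration under
those assumptions, not the actual ACT-R/E code, and the robot-percepts
class with its faces and heard-words slots is a hypothetical stand-in
for whatever your robot software provides.

(defclass robot-percepts ()
  ((faces :accessor faces :initform nil)               ; list of (x y) pixel pairs from the camera
   (heard-words :accessor heard-words :initform nil))) ; strings from the speech recognizer

;; Visicon: the vision module asks the installed device for the current
;; set of visual locations whenever proc-display is called.
(defmethod build-vis-locs-for ((device robot-percepts) vis-mod)
  (declare (ignore vis-mod))
  (mapcar (lambda (xy)
            (car (define-chunks-fct
                     `((isa visual-location
                            kind face
                            value face
                            screen-x ,(first xy)
                            screen-y ,(second xy))))))
          (faces device)))

;; When attention moves to one of those locations, hand back a visual object.
(defmethod vis-loc-to-obj ((device robot-percepts) vis-loc)
  (car (define-chunks-fct
           `((isa visual-object
                  value face
                  screen-pos ,vis-loc)))))

;; Audicon: each word from the speech recognizer becomes a sound event
;; the aural module can detect and encode.
(defun robot-heard (device word)
  (push word (heard-words device))
  (new-word-sound word))

;; With a model defined and loaded, create the face chunk, install the
;; device, and flag a display change:
;;   (define-chunks (face isa chunk))
;;   (defparameter *robot* (make-instance 'robot-percepts))
;;   (install-device *robot*)
;;   (push '(250 120) (faces *robot*))
;;   (proc-display)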
We have several papers that describe some of the capabilities we use;
most are available at http://www.nrl.navy.mil/aic/iss/aas/CognitiveRobots.php
thanks,
greg
On Jun 10, 2008, at 11:37 AM, Bruce J Weimer MD wrote:
> I'm using LispWorks for Windows to program an open-source
> laptop-on-wheels-style hobby robot. I'm looking into ACT-R as a
> possible "cognitive engine". My robot, Leaf, has emotion-driven
> behaviors, OpenCV face detection/recognition, SAPI5 speech
> recognition/TTS, CU Animate face animation, WiFi robot-to-robot
> "telepathy", internet telepresence, X10 home automation, and more.
> Anyway, I'm intrigued by ACT-R's "conflict resolution" scheme, and
> I'm also interested in using WN Lexical to interface with WordNet
> for natural language processing. So I'm looking for some help
> interfacing ACT-R to the robot - sending data from the robot to
> ACT-R about people the robot has seen, phrases it has heard, etc.,
> and then sending data from ACT-R back to the robot to give verbal
> responses, actuate motor sequences, and so on.
>
> So, if anyone's interested in helping or can provide links to
> references/papers, please contact me at:
>
> bjweimer at charter.net
>
> Thanks in advance!
>
> Bruce.
>
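On the output side that Bruce asks about (verbal responses and motor
sequences), one hedged sketch with stock ACT-R 6, reusing the
robot-percepts device from the earlier sketch: the vocal module calls
device-speak-string on the installed device whenever a production
issues a +vocal> speak request, so that method can forward the text to
the robot, and a production can trigger a motor sequence through
!eval!. robot-tts-say and robot-run-gesture below are hypothetical
stand-ins for whatever calls drive the robot's own TTS and actuators,
and the interact goal chunk-type is invented for the example.

(chunk-type interact state)

;; Forward model speech to the robot.  ACT-R 6's vocal module calls
;; device-speak-string on the current device when a production makes a
;; +vocal> isa speak request.
(defmethod device-speak-string ((device robot-percepts) string)
  (robot-tts-say string))                ; hypothetical robot TTS call

;; A production that reacts to an attended face by speaking and gesturing.
(p greet-person
   =goal>
      isa     interact
      state   greet
   =visual>
      isa     visual-object
      value   face
  ==>
   =goal>
      state   done
   +vocal>
      isa     speak
      string  "Hello there"
   !eval!    (robot-run-gesture :wave))  ; hypothetical robot motor call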