new technical report
jacobs@gluttony.cs.umass.edu
Wed May 30 10:26:52 EDT 1990
The following technical report is now available:
Task Decomposition Through Competition
in a Modular Connectionist Architecture
Robert A. Jacobs
Department of Computer & Information Science
University of Massachusetts
Amherst, MA 01003
COINS Technical Report 90-44
Abstract
--------
A novel modular connectionist architecture is presented in which
the networks composing the architecture compete to learn the
training patterns. As a result of the competition, different
networks learn different training patterns and, thus, learn to
compute different functions. The architecture performs task
decomposition in the sense that it learns to partition a task into
two or more functionally independent tasks and allocates distinct
networks to learn each task. In addition, the architecture tends
to allocate to each task the network whose topology is most
appropriate to that task, and tends to allocate the same network
to similar tasks and distinct networks to dissimilar tasks.
Furthermore, it can easily be modified to learn a family of tasks
by using one network to learn a shared strategy that is applied in
all contexts, along with other networks that learn context-sensitive
modifications to this strategy. These properties are demonstrated by training
the architecture to perform object recognition and spatial
localization from simulated retinal images, and to control a
simulated robot arm to move a variety of payloads, each of a
different mass, along a specified trajectory. Finally, it is
noted that function decomposition is an underconstrained problem
and, thus, different modular architectures may decompose a function
in different ways. A desirable decomposition can be achieved if the
architecture is suitably restricted in the types of functions that
it can compute. Finding appropriate restrictions is possible through
the application of domain knowledge. A strength of the modular
architecture is that its structure is well-suited for incorporating
domain knowledge.
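For readers who want a concrete picture of the competition mechanism, below
is a minimal, hypothetical sketch in Python/NumPy (an illustration only, not
the notation or learning rules used in the report): several small "expert"
networks each map an input to an output, every expert's weight update is
scaled by a responsibility computed from its current error, and, because the
best-fitting expert receives the largest update, different experts come to
specialize on different training patterns.

import numpy as np

rng = np.random.default_rng(0)

n_experts, n_in, n_out = 3, 4, 2
experts = [rng.normal(scale=0.1, size=(n_out, n_in)) for _ in range(n_experts)]
lr = 0.1

def train_step(x, target):
    # Each expert proposes an output; its squared error measures its fit.
    outputs = [W @ x for W in experts]
    errors = np.array([np.sum((target - y) ** 2) for y in outputs])
    # Soft competition: the currently best expert gets most of the credit.
    resp = np.exp(-(errors - errors.min()))
    resp /= resp.sum()
    for k, W in enumerate(experts):
        # Delta-rule update, weighted by the expert's responsibility.
        experts[k] = W + lr * resp[k] * np.outer(target - outputs[k], x)

# Two underlying linear "tasks"; patterns from each are interleaved.
A = rng.normal(size=(n_out, n_in))
B = rng.normal(size=(n_out, n_in))
for t in range(2000):
    x = rng.normal(size=n_in)
    train_step(x, (A if t % 2 == 0 else B) @ x)

# After training, distinct experts typically claim the two tasks.
for label, M in (("task A", A), ("task B", B)):
    x = rng.normal(size=n_in)
    print(label, "-> expert",
          int(np.argmin([np.sum((M @ x - W @ x) ** 2) for W in experts])))

In the report itself the competing modules are full connectionist networks,
possibly with different topologies, and the competition is what drives the
task decomposition described above; the sketch only conveys the general idea.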
Please note that this technical report is my Ph.D. thesis and, thus, is
considerably longer than the typical technical report (125 pages).
If possible, please obtain a postscript version of this technical report
from the pub/neuroprose directory at cheops.cis.ohio-state.edu.
a) Here are the directions:
unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): neuron
ftp> cd pub/neuroprose
ftp> type binary
ftp> get
(remote-file) jacobs.thesis1.ps.Z
(local-file) foo1.ps.Z
ftp> get
(remote-file) jacobs.thesis2.ps.Z
(local-file) foo2.ps.Z
ftp> get
(remote-file) jacobs.thesis3.ps.Z
(local-file) foo3.ps.Z
ftp> get
(remote-file) jacobs.thesis4.ps.Z
(local-file) foo4.ps.Z
ftp> quit
unix> uncompress foo1.ps.Z
unix> uncompress foo2.ps.Z
unix> uncompress foo3.ps.Z
unix> uncompress foo4.ps.Z
unix> lpr -P(your_local_postscript_printer) foo1.ps
unix> lpr -P(your_local_postscript_printer) foo2.ps
unix> lpr -P(your_local_postscript_printer) foo3.ps
unix> lpr -P(your_local_postscript_printer) foo4.ps
If your printer fails because the size of a file exceeds
the printer's memory capacity, try the -s option to the
lpr command (see the manual page for lpr).
b) You can also use the Getps script posted on the connectionist
mailing list a few months ago.
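c) If you prefer a scripted transfer, something along the following
lines should also work. This is only a sketch using Python's standard
ftplib module (same host, directory, login, and file names as in the
directions under (a); adjust as needed):

from ftplib import FTP

ftp = FTP('cheops.cis.ohio-state.edu')            # or FTP('128.146.8.62')
ftp.login('anonymous', 'neuron')
ftp.cwd('pub/neuroprose')
for part in (1, 2, 3, 4):
    name = 'jacobs.thesis%d.ps.Z' % part
    with open(name, 'wb') as f:
        ftp.retrbinary('RETR ' + name, f.write)   # binary-mode retrieval
ftp.quit()

The files still need to be uncompressed and printed as shown in (a).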
If you do not have access to a postscript printer, copies of this
technical report can be obtained by sending requests to Connie Smith
at smith@cs.umass.edu. Remember to ask for COINS Technical Report 90-44.