Tech Report Announcement
Barak.Pearlmutter@F.GP.CS.CMU.EDU
Tue Jun 6 06:52:25 EDT 2006
The following tech report is available. It is a substantially expanded
version of a paper of the same title that appeared in the proceedings of
the 1988 CMU Connectionist Models Summer School.
Learning State Space Trajectories
in Recurrent Neural Networks
Barak A. Pearlmutter
ABSTRACT
We describe a number of procedures for finding $\partial E/\partial
w_{ij}$ where $E$ is an error functional of the temporal trajectory
of the states of a continuous recurrent network and $w_{ij}$ are the
weights of that network. Computing these quantities allows one to
perform gradient descent in the weights to minimize $E$, so these
procedures form the kernels of connectionist learning algorithms.
We present simulations in which networks are taught to move through
limit cycles. We also describe a number of elaborations of the
basic idea, such as mutable time delays and teacher forcing, and
conclude with a complexity analysis. This type of network seems
particularly suited for temporally continuous domains, such as
signal processing, control, and speech.
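To make the abstract's setup concrete: the sketch below, in NumPy, Euler-integrates a small continuous-time recurrent network, defines $E$ as the squared deviation of the state trajectory from a target trajectory, and follows $-\partial E/\partial w_{ij}$ by gradient descent. For brevity the gradient here is taken by central finite differences, a naive stand-in for the backpropagation-through-time derivation in the report; the network size, step size, and learning rate are illustrative choices, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def run(W, y0, steps, dt=0.1, tau=1.0):
    # Euler integration of the continuous network dy/dt = (-y + sigmoid(W y)) / tau
    ys, y = [y0], y0
    for _ in range(steps):
        y = y + dt * (-y + sigmoid(W @ y)) / tau
        ys.append(y)
    return np.array(ys)

def error(W, y0, target, steps):
    # E: squared error of the state trajectory against a target trajectory
    return 0.5 * np.sum((run(W, y0, steps) - target) ** 2)

def num_grad(W, y0, target, steps, eps=1e-5):
    # dE/dw_ij by central differences -- a stand-in for the report's
    # backpropagation-through-time procedure, correct but O(n^2) slower
    g = np.zeros_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            Wp, Wm = W.copy(), W.copy()
            Wp[i, j] += eps
            Wm[i, j] -= eps
            g[i, j] = (error(Wp, y0, target, steps)
                       - error(Wm, y0, target, steps)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
n, steps = 3, 20
W = rng.normal(scale=0.5, size=(n, n))
y0 = np.full(n, 0.1)
# hypothetical teacher trajectory: generated by another random network
target = run(rng.normal(scale=0.5, size=(n, n)), y0, steps)

e_init = error(W, y0, target, steps)
for _ in range(50):                     # gradient descent in the weights
    W -= 0.1 * num_grad(W, y0, target, steps)
e_final = error(W, y0, target, steps)
```

After training, `e_final` should be below `e_init`, showing that descending the computed gradient does reduce the trajectory error.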
Overseas copies are sent first class, so there is no need to make
special arrangements for rapid delivery. Requests for copies should be
sent to
Catherine Copetas
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213
or Copetas at CS.CMU.EDU by computer mail. Ask for CMU-CS-88-191.