tr announcement: CMU-CS-90-196
Barak.Pearlmutter@F.GP.CS.CMU.EDU
Tue Jun 6 06:52:25 EDT 2006
*** Please do not forward to other mailing lists or digests. ***
The following 30-page technical report is now available. The preferred
way to obtain it is by anonymous FTP from the neuroprose archive at OSU,
under the name pearlmutter.dynets.ps.Z, as shown below. Alternatively,
it can be ordered by sending a note to
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213
USA
along with a check for $2 (domestic) or $5 (outside the USA) to help
defray the expense of reproduction and mailing.
----------------
Dynamic Recurrent Neural Networks
Barak A. Pearlmutter
December 1990
CMU-CS-90-196
(supersedes CMU-CS-88-191)
We survey learning algorithms for recurrent neural networks
with hidden units and attempt to put the various techniques into
a common framework. We discuss fixpoint learning algorithms,
namely recurrent backpropagation and deterministic Boltzmann
Machines, and non-fixpoint algorithms, namely backpropagation
through time, Elman's history cutoff nets, and Jordan's output
feedback architecture. Forward propagation, an online technique
that uses adjoint equations, is also discussed. In many cases,
the unified presentation leads to generalizations of various
sorts. Some simulations are presented, and at the end, issues
of computational complexity are addressed.
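
For readers who have not seen the non-fixpoint algorithms, here is a
minimal sketch (not taken from the report) of backpropagation through
time for a one-layer recurrent net h_t = tanh(W h_{t-1} + U x_t),
trained with a squared-error target on the hidden units; the names W,
U, xs, and targets are illustrative only.

import numpy as np

def bptt_grads(W, U, xs, targets):
    """Gradients of 0.5*sum_t ||h_t - targets[t]||^2 w.r.t. W and U."""
    T, n = len(xs), W.shape[0]
    hs = [np.zeros(n)]
    for t in range(T):                          # forward: unroll over the sequence
        hs.append(np.tanh(W @ hs[-1] + U @ xs[t]))
    dW, dU, dh = np.zeros_like(W), np.zeros_like(U), np.zeros(n)
    for t in reversed(range(T)):                # backward: last step to first
        dh = dh + (hs[t + 1] - targets[t])      # error injected at step t
        dz = dh * (1.0 - hs[t + 1] ** 2)        # through the tanh nonlinearity
        dW += np.outer(dz, hs[t])
        dU += np.outer(dz, xs[t])
        dh = W.T @ dz                           # pass error back one time step
    return dW, dU

Storing the full state trajectory (hs) is what distinguishes this
offline scheme from online methods such as forward propagation.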
----------------
FTP Instructions:
ftp cheops.cis.ohio-state.edu (or ftp 128.146.8.62)
Name: anonymous
Password: state-your-name-please
ftp> cd pub/neuroprose
ftp> get pearlmutter.dynets.ps.Z
300374 bytes sent in 9.9 seconds (26 Kbytes/s)
ftp> quit
unix> zcat pearlmutter.dynets.ps.Z | lpr
Unlike some files in the archive, this PostScript file has been tested
and will print properly even on printers without much memory.
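
If you prefer a non-interactive fetch, the following Python sketch
performs the same anonymous FTP transfer; the host, directory, and file
name are those given above, and the address used as the password is
just a placeholder.

from ftplib import FTP

HOST = "cheops.cis.ohio-state.edu"              # neuroprose archive host, as above
FILE = "pearlmutter.dynets.ps.Z"

ftp = FTP(HOST)
ftp.login("anonymous", "your-name@your.site")   # placeholder, per anonymous-FTP convention
ftp.cwd("pub/neuroprose")
with open(FILE, "wb") as f:
    ftp.retrbinary("RETR " + FILE, f.write)     # binary transfer for the compressed PostScript
ftp.quit()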