New paper: Ffwd Hebbian Learning w/ Nonlinear Outputs
todd@phy.ucsf.edu
Mon Aug 1 12:30:59 EDT 1994
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/troyer.ffwd_hebb.ps.Z
The following paper has been submitted to Neural Networks and is
available from the Ohio State neuroprose archive.
TITLE: Feedforward Hebbian Learning with Nonlinear Output Units:
A Lyapunov Approach
AUTHOR: Todd Troyer <todd@phy.ucsf.edu>
W.M. Keck Center for Integrative Neuroscience
and Department of Physiology
513 Parnassus Ave. Box 0444
University of California, San Francisco
San Francisco, CA 94143
ABSTRACT:
A Lyapunov function is constructed for the unsupervised learning
equations of a large class of two-layer networks. Units in the output
layer are recurrently connected by fixed symmetric weights; only the
feedforward connections between layers undergo learning. In contrast
to much of the previous work on self-organization in networks of this
type, the output units here have nonlinear transfer functions. The
Lyapunov function is similar in form to those derived by Cohen and
Grossberg and by Hopfield. Two theorems are proved regarding the
location of stable equilibria in the high-gain limit of the transfer
functions. The analysis is applied to
the soft competitive learning networks of Amari and Takeuchi.
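For readers who want a concrete picture of the class of networks
described above, here is a minimal numerical sketch (not the paper's
own code; the transfer function, lateral weights, learning rate, and
decay term are all illustrative assumptions). Output activities relax
under fixed symmetric lateral weights and a high-gain tanh
nonlinearity; only the feedforward weights are updated, by a Hebbian
rule with decay:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_out = 8, 4

    # Feedforward weights: the only weights that undergo learning.
    M = 0.1 * rng.standard_normal((n_out, n_in))

    # Fixed symmetric lateral weights: weak mutual inhibition (assumed).
    W = -0.2 * (np.ones((n_out, n_out)) - np.eye(n_out))

    def g(u, gain=4.0):
        # Nonlinear sigmoidal transfer function; large gain approaches
        # the high-gain limit treated in the theorems.
        return np.tanh(gain * u)

    def relax(x, steps=200, dt=0.05):
        # Additive dynamics du/dt = -u + W g(u) + M x, run toward the
        # equilibrium whose stability the Lyapunov analysis concerns.
        u = np.zeros(n_out)
        for _ in range(steps):
            u += dt * (-u + W @ g(u) + M @ x)
        return g(u)

    eta, alpha = 0.05, 0.05           # learning rate and decay (assumed)
    for _ in range(500):
        x = rng.standard_normal(n_in)
        x /= np.linalg.norm(x)        # normalized input pattern
        y = relax(x)                  # output activities at equilibrium
        M += eta * (np.outer(y, x) - alpha * M)  # Hebbian update + decay

With strong mutual inhibition and high gain, the relaxation step behaves
like a soft winner-take-all, which is the connection to the soft
competitive learning networks mentioned above.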
Retrieve this paper by anonymous ftp from:
archive.cis.ohio-state.edu (128.146.8.52)
in the /pub/neuroprose directory
The name of the paper in this archive is:
troyer.ffwd_hebb.ps.Z [15 pages]
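A typical anonymous-ftp session (the usual neuroprose recipe; exact
prompts depend on your ftp client) would be:

    unix> ftp archive.cis.ohio-state.edu
    Name: anonymous
    Password: <your e-mail address>
    ftp> cd pub/neuroprose
    ftp> binary
    ftp> get troyer.ffwd_hebb.ps.Z
    ftp> quit
    unix> uncompress troyer.ffwd_hebb.ps.Z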
No hard copies available.