tech report: benefits of gain
KRUSCHKE,JOHN,PSY
kruschke at ucs.indiana.edu
Fri Dec 7 15:48:00 EST 1990
The following paper is available via ftp from the neuroprose archive
at Ohio State (instructions for retrieval follow the abstract). This
paper was written more than two years ago, but we believe the ideas are
still interesting even if the details are a bit dated.
Benefits of Gain:
Speeded learning and minimal hidden layers
in back-propagation networks.
John K. Kruschke, Indiana University
Javier R. Movellan, Carnegie-Mellon University
ABSTRACT
The gain of a node in a connectionist network is a multiplicative
constant that amplifies or attenuates the net input to the node. The
objective of this article is to explore the benefits of adaptive gains
in back-propagation networks. First, we show that gradient descent
with respect to gain greatly increases learning speed by amplifying
those directions in weight space that are successfully chosen by
gradient descent on weights. Adaptive gains also allow normalization
of weight vectors without loss of computational capacity, and we
suggest a simple modification of the learning rule that automatically
achieves weight normalization. Finally, we describe a method for
creating small hidden layers by making hidden node gains compete
according to similarities between nodes, with the goal of improved
generalization performance. Simulations show that this competition
method is more effective than the special case of gain decay.
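As a rough illustration of the first point, here is a minimal NumPy
sketch of gradient descent on both the weights and the gain of a single
sigmoid unit, y = sigmoid(g * (w . x)). This is our own toy example,
not code from the paper: the variable names, learning rates, and the
OR-function task are all illustrative assumptions.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=2)   # weights (learned)
    g = 1.0                             # gain (also learned)
    lr_w, lr_g = 0.5, 0.5               # illustrative learning rates

    # Toy task: the OR function on two binary inputs.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([0., 1., 1., 1.])

    for epoch in range(1000):
        for x, t in zip(X, T):
            net = w @ x
            y = sigmoid(g * net)
            # dE/dz for E = (t - y)^2 / 2, where z = g * net
            delta = (y - t) * y * (1.0 - y)
            w -= lr_w * delta * g * x   # gradient descent on weights
            g -= lr_g * delta * net     # gradient descent on gain

With both updates active, the gain can grow along the net-input
directions that the weight updates are already exploiting, which is
consistent with the amplification effect described in the abstract.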
To get a copy of the paper, do the following:
unix> ftp cheops.cis.ohio-state.edu
login: anonymous
password: neuron
ftp> cd pub/neuroprose
ftp> binary
ftp> get kruschke.gain.ps.Z
ftp> bye
unix> uncompress kruschke.gain.ps.Z
unix> lpr kruschke.gain.ps