paper available: On Centering Neural Network Weight Updates
Nici Schraudolph
nic at idsia.ch
Thu Aug 7 08:46:29 EDT 1997
Dear colleagues,
the following paper is now available by anonymous ftp from the following locations:
ftp://ftp.idsia.ch/pub/nic/center.ps.gz and
ftp://ftp.cnl.salk.edu/pub/schraudo/center.ps.gz
On Centering Neural Network Weight Updates
------------------------------------------
by Nicol N. Schraudolph
Technical Report IDSIA-19-97
IDSIA, Lugano 1997
It has long been known that neural networks can learn faster when their
input and hidden unit activity is centered about zero; recently we have
extended this approach to also encompass the centering of error signals
(Schraudolph & Sejnowski, 1996). Here we generalize this notion to all
factors involved in the weight update, leading us to propose centering
the slope of hidden unit activation functions as well. Slope centering
removes the linear component of backpropagated error; this improves credit
assignment in networks with shortcut connections. Benchmark results
show that this can speed up learning significantly without adversely
affecting the trained network's generalization ability.
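The centering idea in the abstract can be sketched in a few lines: each factor entering the weight update (inputs, hidden activities, and activation-function slopes) has its batch mean subtracted. The snippet below is a minimal illustration, assuming a tanh hidden layer with hypothetical shapes; it is not the report's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy batch of inputs and a tanh hidden layer (hypothetical sizes)
X = rng.normal(size=(32, 5))            # batch of 32 five-dimensional inputs
W = rng.normal(scale=0.1, size=(5, 4))  # input-to-hidden weights
A = np.tanh(X @ W)                      # hidden unit activities
S = 1.0 - A ** 2                        # tanh slopes f'(net)

# centering: subtract the batch mean from each factor of the update
Xc = X - X.mean(axis=0)                 # centered inputs
Ac = A - A.mean(axis=0)                 # centered hidden activities
Sc = S - S.mean(axis=0)                 # centered slopes (slope centering)

# each centered quantity now has (numerically) zero mean over the batch
```

Multiplying backpropagated errors by the centered slopes `Sc` rather than `S` removes the component of the error that is linear in the net input, which is what the abstract refers to as improving credit assignment when shortcut connections carry the linear part directly.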
Best regards,
--
Dr. Nicol N. Schraudolph
IDSIA, Corso Elvezia 36
CH-6900 Lugano, Switzerland
Tel: +41-91-911-9838
Fax: +41-91-911-9839
http://www.idsia.ch/~nic/
http://www.cnl.salk.edu/~schraudo/