Preprint: Accelerated Learning in Layered Neural Networks
solla@homxb.att.com
Tue Aug 2 10:41:00 EDT 1988
The following preprint is available. If you want a copy, please
send your request to:
Sara A. Solla
AT&T Bell Laboratories, Rm 4G-336
Crawfords Corner Road
Holmdel, NJ 07733
solla@homxb.att.com
************************************************************************
ACCELERATED LEARNING IN LAYERED NEURAL NETWORKS
Sara A. Solla
AT&T Bell Laboratories, Holmdel NJ 07733
Esther Levin and Michael Fleisher
Technion Israel Institute of Technology, Haifa 32000, Israel
ABSTRACT
Learning in layered neural networks is posed as the minimization of
an error function defined over the training set. A probabilistic
interpretation of the target activities suggests the use of relative
entropy as an error measure. We investigate the merits of using this
error function over the traditional quadratic function for gradient
descent learning. Comparative numerical simulations for the contiguity
problem show marked reductions in learning times. This improvement is
explained in terms of the characteristic roughness of the landscape
defined by the error function in configuration space.
************************************************************************
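The abstract attributes the speedup to the shape of the error landscape. One way to see why relative entropy can accelerate gradient descent is to compare the two gradients for a single sigmoid unit; the sketch below is illustrative only (it is not the paper's code, and the function names are my own). For a target t and output y = sigmoid(z), the quadratic error's gradient with respect to z carries a factor y(1 - y) that vanishes when the unit saturates, while that factor cancels for the relative-entropy error, so a badly wrong, saturated unit still receives a large gradient.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def quadratic_error(y, t):
    # E = (1/2)(y - t)^2
    return 0.5 * (y - t) ** 2

def relative_entropy_error(y, t):
    # D(t || y) for target t and output y in (0, 1), interpreting the
    # activities as probabilities; eps guards the log at 0 and 1.
    eps = 1e-12
    return (t * math.log((t + eps) / (y + eps))
            + (1 - t) * math.log((1 - t + eps) / (1 - y + eps)))

def grad_wrt_preactivation(y, t, error="quadratic"):
    # For y = sigmoid(z):
    #   quadratic:        dE/dz = (y - t) * y * (1 - y)
    #   relative entropy: dE/dz = (y - t)   (the sigmoid derivative cancels)
    if error == "quadratic":
        return (y - t) * y * (1 - y)
    return y - t

# A saturated unit with the wrong target shows the difference in landscape
# roughness: the quadratic gradient is damped by y*(1 - y) ~ 0, while the
# relative-entropy gradient stays close to 1 in magnitude.
z, t = 6.0, 0.0
y = sigmoid(z)
g_quad = grad_wrt_preactivation(y, t, "quadratic")
g_rel = grad_wrt_preactivation(y, t, "relative entropy")
```

This cancellation is the mechanism one would expect behind the "characteristic roughness" argument: plateaus of the quadratic landscape at saturated units are flattened out under the relative-entropy measure.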