no subject (file transmission)
lautrup@hpthbe1.cern.ch
Tue Sep 5 11:56:30 EDT 1995
FTP-host: connect.nbi.dk
FTP-file: neuroprose/winther.optimal.ps.Z
WWW-host: http://connect.nbi.dk
----------------------------------------------
The following paper is now available:
Optimal Learning in Multilayer Neural Networks [26 pages]
O. Winther, B. Lautrup, and J-B. Zhang
CONNECT, The Niels Bohr Institute, University of Copenhagen, Denmark
Abstract:
The generalization performance of two learning algorithms, the Bayes algorithm
and the ``optimal learning'' algorithm, is studied theoretically on two
classification tasks. In the first example the task is defined by a restricted
two-layer network, a committee machine, and in the second the task is defined
by the so-called prototype problem. In both cases the architecture of the
learning machine is a committee machine. For both tasks the optimal learning
algorithm, which is optimal when the solution is restricted to a specific
architecture, performs worse than the overall optimal Bayes algorithm.
However, both algorithms perform far better than the conventional stochastic
Gibbs algorithm, showing that using prior knowledge about the rule helps to
avoid overfitting.
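(For readers not familiar with the committee-machine architecture mentioned
in the abstract, a minimal illustrative sketch of its output rule in
Python/NumPy is given below; the variable names are illustrative and not
taken from the paper.)

  import numpy as np

  def committee_machine(x, W):
      # Committee machine: each of the K hidden units takes the sign of its
      # weighted input field, and the output is the sign of their sum,
      # i.e. a majority vote (K is usually chosen odd to avoid ties).
      # x: input vector of length N;  W: K x N matrix of hidden weights.
      hidden = np.sign(W @ x)
      return np.sign(hidden.sum())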
Please do not reply directly to this message.
-----------------------------------------------
FTP-instructions:
unix> ftp connect.nbi.dk (or 130.225.212.30)
ftp> Name: anonymous
ftp> Password: your e-mail address
ftp> cd neuroprose
ftp> binary
ftp> get winther.optimal.ps.Z
ftp> quit
unix> uncompress winther.optimal.ps.Z
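The transfer can also be scripted; a minimal Python sketch using the standard
ftplib module (assuming anonymous access to the host above is still enabled)
could look like this:

  from ftplib import FTP

  HOST = "connect.nbi.dk"
  FILE = "winther.optimal.ps.Z"

  ftp = FTP(HOST)
  ftp.login()                      # anonymous login, default password
  ftp.cwd("neuroprose")
  with open(FILE, "wb") as f:
      ftp.retrbinary("RETR " + FILE, f.write)   # binary-mode transfer
  ftp.quit()
  # then decompress:  uncompress winther.optimal.ps.Z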
-----------------------------------------------
Benny Lautrup,
Computational Neural Network Center (CONNECT)
Niels Bohr Institute
Blegdamsvej 17
2100 Copenhagen
Denmark
Telephone: +45-3532-5200
Direct: +45-3532-5358
Fax: +45-3142-1016
e-mail: lautrup@connect.nbi.dk