paper available
Michael Biehl
biehl at connect.nbi.dk
Fri Jul 15 17:56:26 EDT 1994
FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/biehl.online-perceptron.ps.Z
The following paper has been placed in the Neuroprose archive in file
biehl.online-perceptron.ps.Z (8 pages). Hardcopies are not available.
-------------------------------------------------------------------------
ON-LINE LEARNING WITH A PERCEPTRON
Michael Biehl
CONNECT, The Niels Bohr Institute
Blegdamsvej 17, 2100 Copenhagen, Denmark
email: biehl at physik.uni-wuerzburg.de
and
Peter Riegler
Institut fuer theoretische Physik
Julius-Maximilians-Universitaet Wuerzburg
Am Hubland, D-97074 Wuerzburg, Germany
submitted to Europhysics Letters
ABSTRACT
We study on-line learning of a linearly separable rule with a simple
perceptron. Training utilizes a sequence of uncorrelated, randomly drawn
N-dimensional input examples. In the thermodynamic limit the generalization
error after training with P such examples can be calculated exactly. For
the standard perceptron algorithm it decreases like (N/P)^(1/3) for large
P/N, in contrast to the faster (N/P)^(1/2) behavior of so-called Hebbian
learning. Furthermore, we show that a specific parameter-free on-line
scheme, the AdaTron algorithm, yields an asymptotic (N/P) decay of the
generalization error. This coincides (up to a constant factor) with the
lower bound for any training process based on random examples, including
off-line learning. Simulations confirm our results.
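
As an illustration (not part of the paper), here is a minimal Python sketch
of the three on-line rules compared in the abstract, assuming spherical
Gaussian inputs and the commonly stated forms of the updates. The dimension
N, the number of examples P, the 1/N learning-rate scaling, and all names
are illustrative choices, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 200          # input dimension (illustrative choice)
    P = 50 * N       # number of random examples, i.e. alpha = P/N = 50

    # Teacher vector B defines the linearly separable rule sign(B . x).
    B = rng.standard_normal(N)
    B /= np.linalg.norm(B)

    def gen_error(J):
        # For isotropic inputs, eps = arccos(R)/pi, with R the normalized
        # overlap between student J and teacher B.
        R = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
        return np.arccos(np.clip(R, -1.0, 1.0)) / np.pi

    def train(rule):
        J = rng.standard_normal(N) / np.sqrt(N)   # random initial student
        for _ in range(P):
            x = rng.standard_normal(N)            # uncorrelated random example
            sigma = np.sign(B @ x)                # teacher label
            h = J @ x                             # student field
            if rule == "hebb":
                J += sigma * x / N                # update on every example
            elif rule == "perceptron" and sigma * h <= 0:
                J += sigma * x / N                # update only on errors
            elif rule == "adatron" and sigma * h <= 0:
                J += abs(h) * sigma * x / (x @ x) # zeroes the field on x
        return gen_error(J)

    for rule in ("hebb", "perceptron", "adatron"):
        print(f"{rule:10s}  eps = {train(rule):.4f}")

In this sketch the AdaTron variant should reach the smallest generalization
error at large P/N, consistent with the asymptotic rates quoted above.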
-----------------------------------------------------------------------
--- Michael Biehl biehl at physik.uni-wuerzburg.de