On-line learning paper
Mauro Copelli da Silva
copelli at onsager.if.usp.br
Tue Nov 14 17:30:34 EST 1995
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/copelli.equivalence.ps.Z
*** PAPER ANNOUNCEMENT ***
The following paper is available by anonymous ftp from the pub/neuroprose
directory of the archive.cis.ohio-state.edu host (see instructions below).
It is 27 pages long and has been submitted to Physical Review E.
Comments are welcome.
EQUIVALENCE BETWEEN LEARNING IN PERCEPTRONS WITH
NOISY EXAMPLES AND TREE COMMITTEE MACHINES
Mauro Copelli, Osame Kinouchi and Nestor Caticha
Instituto de Fisica, Universidade de Sao Paulo
CP 66318, 05389-970 Sao Paulo, SP, Brazil
e-mail: copelli,osame,nestor at if.usp.br
Abstract
We study learning from a single presentation of each example ({\em incremental}
or {\em on-line} learning) in single-layer perceptrons and tree committee
machines (TCMs). Lower bounds for the perceptron generalization error as
a function of the noise level $\epsilon$ in the teacher output are
calculated. We find that optimal local learning in a TCM with $K$ hidden
units is simply related to optimal learning in a simple perceptron with a
corresponding noise level $\epsilon(K)$. For a large number of examples
and finite $K$, the generalization error decays as $\alpha_{cm}^{-1}$,
where $\alpha_{cm}$ is the number of examples per adjustable weight
in the TCM. We also show that on-line learning is possible even in the
$K\rightarrow\infty$ limit, although the generalization error then decays
as $\alpha_{cm}^{-1/2}$. The simple Hebb rule can also be applied to
the TCM, but then the error decays as $\alpha_{cm}^{-1/2}$ for finite $K$
and as $\alpha_{cm}^{-1/4}$ for $K\rightarrow\infty$. Exponential decay of
the generalization error, both for the perceptron learning from noisy
examples and for the TCM, is obtained by using the learning-by-queries
strategy.
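
For readers unfamiliar with the setting, here is a minimal sketch (not taken
from the paper) of on-line Hebb learning in a simple perceptron whose teacher
output is flipped with probability eps. Each example is presented once and
then discarded, and the generalization error is computed from the standard
teacher-student overlap formula arccos(rho)/pi for Gaussian inputs. All
parameter values (N, eps, alpha_max) are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(0)
N = 500           # input dimension (illustrative)
alpha_max = 20.0  # examples per adjustable weight
eps = 0.1         # probability of flipping the teacher output

B = rng.standard_normal(N)
B /= np.linalg.norm(B)   # teacher weight vector
J = np.zeros(N)          # student weight vector

for mu in range(int(alpha_max * N)):
    x = rng.standard_normal(N)       # fresh example, seen only once
    sigma = np.sign(B @ x)           # teacher label
    if rng.random() < eps:
        sigma = -sigma               # output noise
    J += sigma * x / np.sqrt(N)      # Hebb update

# generalization error e_g = arccos(rho)/pi, rho = J.B/(|J||B|)
rho = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
print(f"generalization error ~ {np.arccos(rho) / np.pi:.3f}")

Running this with growing alpha_max shows the slow alpha^{-1/2} decay of the
Hebb rule referred to in the abstract; the optimal local rules studied in the
paper achieve the faster decays quoted there.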
****************** How to obtain a copy *************************
unix> ftp archive.cis.ohio-state.edu
User: anonymous
Password: (type your e-mail address)
ftp> cd pub/neuroprose
ftp> binary
ftp> get copelli.equivalence.ps.Z
ftp> quit
unix> uncompress copelli.equivalence.ps.Z
unix> lpr copelli.equivalence.ps (or however you print PostScript files)
**PLEASE DO NOT REPLY DIRECTLY TO THIS MESSAGE**