paper available: Learning from Noisy Data...

Michael Biehl biehl at Physik.Uni-Wuerzburg.DE
Thu Apr 27 17:05:57 EDT 1995


FTP-host:       archive.cis.ohio-state.edu
FTP-filename:   /pub/neuroprose/biehl.noisy.ps.Z


The following paper has been placed in the Neuroprose archive
(see above for ftp-host) and is now available as a compressed
PostScript file named

          biehl.noisy.ps.Z     (5 pages of output)

email address:                  biehl at physik.uni-wuerzburg.de

****  Hardcopies  cannot be provided   **** 

------------------------------------------------------------------
 
   "Learning from Noisy Data: An Exactly Solvable Model" 
  
    Michael Biehl, Peter Riegler, and Martin Stechert
    Institut fuer Theoretische Physik
    Julius-Maximilians-Universitaet 
    Am Hubland
  D-97074 Wuerzburg
    Germany  

---------------------------------------------------------------------
                       Abstract:

Exact results are derived for the learning of a linearly
separable rule with a single-layer perceptron. We consider two
sources of noise in the training data: the random inversion of
the example outputs and weight noise in the teacher network,
respectively. In both scenarios we investigate on-line learning
schemes which use only the latest example in a sequence of
uncorrelated random examples to update the student weights. We
study Hebbian learning as well as on-line algorithms which
achieve an optimal decrease of the generalization error. The
latter realize an asymptotic decay of the generalization error
that coincides, apart from prefactors, with the one found for
off-line schemes.
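The first scenario in the abstract (on-line Hebbian learning of a
linearly separable rule whose example outputs are randomly inverted)
can be illustrated with a minimal numerical sketch. This is not the
authors' code; all names and parameter values (N, P, p_flip) are
illustrative assumptions, and the generalization error is computed
from the standard angle between student and teacher weight vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # input dimension (illustrative choice)
P = 2000         # number of on-line examples
p_flip = 0.1     # probability of inverting a teacher output

B = rng.standard_normal(N)
B /= np.linalg.norm(B)          # teacher weights, normalized
J = np.zeros(N)                 # student weights, start at zero

def gen_error(J, B):
    """Generalization error: angle between student and teacher, over pi."""
    norm = np.linalg.norm(J)
    if norm == 0.0:
        return 0.5              # zero student weights: random guessing
    overlap = np.clip(J @ B / norm, -1.0, 1.0)
    return np.arccos(overlap) / np.pi

for _ in range(P):
    x = rng.standard_normal(N)      # uncorrelated random example
    sigma = np.sign(B @ x)          # clean teacher output
    if rng.random() < p_flip:
        sigma = -sigma              # output noise: random inversion
    J += sigma * x / N              # Hebbian on-line update

print(f"generalization error after {P} examples: {gen_error(J, B):.3f}")
```

Even with 10% of the labels flipped, the Hebbian student aligns with
the teacher and the generalization error falls well below the random
guessing value of 1/2, consistent with the learning curves such models
exhibit.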
----------------------------------------------------------------------




More information about the Connectionists mailing list