preprint: Local and Global Convergence of On-line Learning

seung@physics.att.com
Fri Mar 3 15:01:43 EST 1995


FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/barkai.local.ps.Z

The file barkai.local.ps.Z is now available at
ftp://archive.cis.ohio-state.edu/pub/neuroprose/barkai.local.ps.Z

Local and Global Convergence of On-Line Learning
N. Barkai
Racah Inst. of Physics, Hebrew Univ. of Jerusalem

H. S. Seung
AT&T Bell Laboratories

H. Sompolinsky
Racah Inst. of Physics, Hebrew Univ. of Jerusalem
AT&T Bell Laboratories

We study the performance of a generalized perceptron algorithm for
learning realizable dichotomies, with an error-dependent adaptive
learning rate.  The asymptotic scaling form of the solution to the
associated Markov equations is derived, assuming certain smoothness
conditions.  We show that the system converges to the optimal solution
and that the generalization error asymptotically obeys a universal
inverse power law in the number of examples.  The system is capable of
escaping from local minima and adapts rapidly to shifts in the target
function.  The general theory is illustrated for the perceptron and
the committee machine.
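
For readers who want a concrete picture of the setting, below is a
minimal sketch (in Python with NumPy) of on-line perceptron learning
on a realizable task with an error-dependent learning rate.  The
specific schedule used here, setting the rate equal to a running
estimate of the error rate, and the names (err_est, gamma, the
constants) are illustrative assumptions, not the paper's exact
prescription; see the preprint for the actual algorithm and analysis.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100                                # input dimension
    w_teacher = rng.standard_normal(N)     # target (realizable) dichotomy
    w = rng.standard_normal(N)             # student weights

    err_est = 0.5        # running estimate of the error rate
    gamma = 0.01         # averaging rate for the error estimate

    for t in range(20000):
        x = rng.standard_normal(N)
        y = np.sign(w_teacher @ x)         # teacher label
        mistake = float(np.sign(w @ x) != y)
        eta = err_est                      # error-dependent learning rate
        if mistake:
            w += eta * y * x               # perceptron update on errors only
        # update the running error estimate
        err_est += gamma * (mistake - err_est)

    # generalization error of a perceptron student vs. teacher under
    # Gaussian inputs: arccos(overlap) / pi
    overlap = (w @ w_teacher) / (np.linalg.norm(w) * np.linalg.norm(w_teacher))
    print("generalization error ~", np.arccos(overlap) / np.pi)

As the error estimate shrinks, so does the learning rate, which is the
qualitative mechanism behind the inverse-power-law decay of the
generalization error discussed in the abstract.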



