preprint available

hertz@nordita.dk
Mon Jan 28 06:04:05 EST 1991


The following technical report has been placed in the neuroprose archives
at Ohio State University:
		
	Dynamics of Generalization in Linear Perceptrons

		Anders Krogh	    John Hertz
	    Niels Bohr Institut      Nordita

			    Abstract:

We study the evolution of the generalization ability of a simple linear
perceptron with N inputs which learns to imitate a ``teacher perceptron''.
The system is trained on p = \alpha N binary example inputs and the
generalization ability measured by testing for agreement with the teacher on
all 2^N possible binary input patterns.  The dynamics can be solved
analytically and exhibits a phase transition from imperfect to
perfect generalization at \alpha = 1.  Except at this point, the
generalization ability approaches its asymptotic value exponentially,
with critical slowing down near the transition; the relaxation time
is \propto (1-\sqrt{\alpha})^{-2}.  Right at the critical point, the
approach to perfect generalization follows a power law \propto t^{-1/2}.
In the presence of noise, the generalization ability is degraded by an
amount \propto (\sqrt{\alpha}-1)^{-1} just above \alpha = 1.
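The setting described in the abstract is easy to simulate. The sketch below (a rough illustration, not the paper's analysis; all parameter values such as N = 50 and the learning rate are assumptions) trains a linear student by gradient descent on p = \alpha N binary examples generated by a random linear teacher, and tracks the generalization error, which for random +/-1 inputs equals the squared weight difference |w - w*|^2:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50                    # number of inputs (illustrative choice)
alpha = 2.0               # p / N, here above the transition at alpha = 1
p = int(alpha * N)

w_teacher = rng.standard_normal(N)         # random teacher perceptron
X = rng.choice([-1.0, 1.0], size=(p, N))   # p binary example inputs
y = X @ w_teacher                          # teacher outputs (noise-free)

w = np.zeros(N)           # student starts from zero weights
eta = 0.1 / N             # small learning rate for stable gradient descent

def gen_error(w):
    # For inputs with independent +/-1 components, the expected squared
    # disagreement with the teacher is sum_i (w_i - w*_i)^2; divide by N
    # to get a per-weight measure.
    return np.sum((w - w_teacher) ** 2) / N

errors = []
for t in range(2000):
    grad = X.T @ (X @ w - y)   # gradient of the squared training error
    w -= eta * grad
    errors.append(gen_error(w))
```

With alpha > 1 the examples determine the teacher uniquely, so the error decays to zero (exponentially, away from the critical point); rerunning with alpha < 1 leaves a residual plateau, consistent with the transition at alpha = 1.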

This paper will appear in the NIPS-90 proceedings.  To retrieve it by anonymous ftp, do the following:

unix> ftp cheops.cis.ohio-state.edu          # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get krogh.generalization.ps.Z
ftp> quit
unix> uncompress krogh.generalization.ps
unix> lpr -P(your_local_postscript_printer) krogh.generalization.ps


An old-fashioned paper preprint version is also available -- send requests
to
		hertz@nordita.dk
or

John Hertz
Nordita
Blegdamsvej 17
DK-2100 Copenhagen
Denmark
