preprint--rigorous learning curve bounds from statistical mechanics

seung@physics.att.com
Mon May 23 11:47:05 EDT 1994


The following preprint is now available:

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/seung.rigorous.ps.Z 

Authors:  D. Haussler, M. Kearns, H. S. Seung, N. Tishby

Title: Rigorous learning curve bounds from statistical mechanics

Size:  20 pages

Abstract:

In this paper we introduce and investigate a mathematically rigorous
theory of learning curves that is based on ideas from statistical
mechanics.  The advantage of our theory over the well-established
Vapnik-Chervonenkis theory is that our bounds can be considerably
tighter in many cases, and are also more reflective of the true
behavior (functional form) of learning curves.  This behavior can
often exhibit dramatic properties such as phase transitions, as well
as power law asymptotics not explained by the VC theory.  The
disadvantages of our theory are that its application requires
knowledge of the input distribution, and it is limited so far to
finite cardinality function classes.  We illustrate our results with
many concrete examples of learning curve bounds derived from our
theory.
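
For readers who want a point of comparison: the classical cardinality-based
bound for a finite function class F (a textbook result, not the
statistical-mechanics bound derived in the preprint) already gives a 1/m
learning curve for consistent learners. A minimal sketch in LaTeX, assuming
m i.i.d. examples and a learner that returns any hypothesis consistent with
all of them:

\[
  \Pr\bigl[\exists f \in \mathcal{F} :\ \mathrm{err}(f) > \epsilon
           \ \text{and}\ f\ \text{consistent with the sample}\bigr]
  \;\le\; |\mathcal{F}|\,(1-\epsilon)^m \;\le\; |\mathcal{F}|\,e^{-\epsilon m},
\]
so with probability at least $1-\delta$ every consistent hypothesis
$\hat f$ satisfies
\[
  \mathrm{err}(\hat f) \;\le\; \frac{\ln|\mathcal{F}| + \ln(1/\delta)}{m}.
\]

Bounds of this type depend only on the cardinality $|\mathcal{F}|$; the
distribution-dependent bounds announced above can be tighter and can capture
features such as phase transitions that this worst-case form misses.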


