TR Announcement: Effective VC Dimension

Leon Bottou leon at bop.neuristique.fr
Fri Jun 17 03:25:12 EDT 1994


**DO NOT FORWARD TO OTHER GROUPS**


TR Announcement: Effective VC Dimension
----------------------------------------
FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/bottou.effvc.ps.Z

The file bottou.effvc.ps.Z is now available for
copying from the Neuroprose repository:


TITLE:

On the Effective VC Dimension (12 pages)

Leon Bottou, Neuristique, Paris (France)
Vladimir N. Vapnik, AT&T Bell Laboratories, Holmdel (NJ)

ABSTRACT:

The very idea of an "effective Vapnik-Chervonenkis (VC) dimension" relies
on the hypothesis that the relation between the generalization error and the
number of training examples can be expressed by a formula algebraically
similar to the VC bound.  This hypothesis calls for serious discussion,
since the traditional VC bound widely overestimates the generalization error.
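For context (this formula is a standard statement, not taken from the report itself), the traditional VC bound referred to above is usually written as follows: for a hypothesis class of VC dimension h and l training examples, with probability at least 1 - eta,

```latex
% Classical Vapnik-Chervonenkis generalization bound (standard form):
% R(\alpha)        : generalization (expected) error
% R_{\mathrm{emp}} : training (empirical) error
% h                : VC dimension,  l : number of training examples
R(\alpha) \;\le\; R_{\mathrm{emp}}(\alpha)
  \;+\; \sqrt{\frac{h\left(\ln\frac{2l}{h} + 1\right) - \ln\frac{\eta}{4}}{l}}
```

The "effective VC dimension" hypothesis discussed in the abstract amounts to keeping a formula of this algebraic shape while replacing h by an empirically measured, algorithm- and data-dependent capacity.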

In this paper we describe an algorithm- and data-dependent measure of
capacity. We derive a confidence interval on the difference between the
training error and the generalization error. This confidence interval is
much tighter than the traditional VC bound.

A simple change in the formulation of the problem yields this extra
accuracy: our confidence interval bounds the error difference between a
training set and a test set, rather than the error difference between a
training set and some hypothetical ground truth.  This "transductive"
approach makes it possible to derive a data- and algorithm-dependent
confidence interval.
