new paper in neuroprose
Wolfgang Maass
maass at igi.tu-graz.ac.at
Mon Jun 21 09:43:41 EDT 1993
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/maass.super.ps.Z
The file maass.super.ps.Z is now available for copying
from the Neuroprose repository. This is a 7-page paper.
Hardcopies are not available.
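For readers unfamiliar with the procedure, a typical anonymous-ftp
session for retrieving and printing the file looks roughly as follows
(the standard Unix commands uncompress and lpr are assumed; substitute
your local printer command as needed):

    unix> ftp archive.cis.ohio-state.edu
    Name: anonymous
    Password: <your e-mail address>
    ftp> cd pub/neuroprose
    ftp> binary
    ftp> get maass.super.ps.Z
    ftp> quit
    unix> uncompress maass.super.ps.Z
    unix> lpr maass.super.ps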
NEURAL NETS WITH SUPERLINEAR VC-DIMENSION
by
Wolfgang Maass
Institute for Theoretical Computer Science
Technische Universitaet Graz, A-8010 Graz, Austria
email: maass at igi.tu-graz.ac.at
Abstract:
We construct arbitrarily large feedforward neural nets of depth 3
(i.e. with 2 hidden layers) and O(w) edges whose VC-dimension is at
least w log w. The same construction can be carried out for any
depth larger than 3.
This construction proves that the well-known upper bound of
O(w log w) for the VC-dimension of a neural net with w edges, due to
Cover and to Baum and Haussler, is in fact asymptotically optimal
for any depth of 3 or more.
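In asymptotic notation (merely a restatement of the preceding claims,
in notation introduced here for convenience), the matching bounds read

    \[ \mathrm{VCdim}(N_w) \;=\; O(w \log w)
       \quad \text{(Cover; Baum and Haussler)} \]
    \[ \mathrm{VCdim}(N_w) \;=\; \Omega(w \log w)
       \quad \text{(this paper, any depth $\ge 3$)} \]

where N_w denotes a net from the constructed family with O(w) edges.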
The Vapnik-Chervonenkis dimension (VC-dimension) is an important
parameter of any neural net, since it predicts how many training
examples are needed to train the net (in Valiant's model of
probably approximately correct (PAC) learning).
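For reference, the standard sample-complexity bound of PAC learning
theory (a textbook fact due to Blumer, Ehrenfeucht, Haussler, and
Warmuth, not a result of this paper) says that for a hypothesis class
of VC-dimension d,

    \[ m \;=\; O\!\left(\frac{1}{\epsilon}\left(d \,\log\frac{1}{\epsilon}
       \;+\; \log\frac{1}{\delta}\right)\right) \]

random examples suffice to learn with error at most epsilon and
confidence 1 - delta, while Omega(d/epsilon) examples are necessary.
Hence a superlinear VC-dimension translates directly into a
superlinear number of required training examples.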
One may also view our result as mathematical evidence for some type
of "connectionism thesis": that a network of neuron-like elements is
more than just the sum of its elements. Our result shows that in a
large neural net a single weight can contribute more than a constant
amount to the VC-dimension of the net, and that its contribution may
increase with the total size of the net.
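Stated per weight (again merely a reformulation of the result, in the
notation introduced above), the constructed nets satisfy

    \[ \frac{\mathrm{VCdim}(N_w)}{w} \;=\; \Omega(\log w)
       \;\longrightarrow\; \infty \quad \text{as } w \to \infty , \]

i.e. the average contribution of a single edge to the VC-dimension
grows logarithmically with the size of the net.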
The current paper improves the author's corresponding result from
last year (which was for nets of depth 4), and it provides the first
complete write-up of the construction.