TR available - Learning problems for recurrent nets

Eduardo Sontag sontag at control.rutgers.edu
Fri Dec 20 11:31:58 EST 1996


          VAPNIK-CHERVONENKIS DIMENSION OF RECURRENT NEURAL NETWORKS

			Pascal Koiran, LIP-Lyon, France
			Eduardo D. Sontag, Rutgers, USA

 DIMACS Tech Report 96-56. (Summary to appear in Proceedings of the Third
 European Conference on Computational Learning Theory, Jerusalem, March 17-19,
 1997.)

			  ABSTRACT

This paper provides lower and upper bounds for the VC dimension of recurrent
networks. Several types of activation functions are discussed, including
threshold, polynomial, piecewise-polynomial and sigmoidal functions.  The
bounds depend on two independent parameters: the number w of weights in the
network, and the length k of the input sequence.  Ignoring multiplicative
constants, the main results say roughly the following:

  1. For architectures whose activation is any fixed nonlinear polynomial, the
     VC dimension is proportional to wk.

  2. For architectures whose activation is any fixed piecewise polynomial, the
     VC dimension is between wk and w**2 k.

  3. For architectures with threshold activations, the VC dimension is between
     w log(k/w) and min{wk log(wk), w**2 + w log(wk)}.

  4. For the standard sigmoid tanh(x), the VC dimension is between wk and
     w**4 k**2.  
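For a rough sense of scale, the following small Python snippet (illustrative
only, not from the paper) evaluates the bound expressions above for
hypothetical example values w = 50 and k = 200, ignoring multiplicative
constants; the base of the logarithm is immaterial at this level of precision.

    import math

    # Hypothetical example sizes: w weights, input sequences of length k.
    # These numbers are for illustration only and do not come from the paper.
    w, k = 50, 200

    # 1. Fixed nonlinear polynomial activation: VC dimension ~ wk.
    print("polynomial:      ~", w * k)

    # 2. Fixed piecewise-polynomial activation: between ~ wk and ~ w**2 k.
    print("piecewise poly.: ~", w * k, "to", w**2 * k)

    # 3. Threshold activation: between ~ w log(k/w) and
    #    ~ min(wk log(wk), w**2 + w log(wk)).
    lower = w * math.log2(k / w)
    upper = min(w * k * math.log2(w * k), w**2 + w * math.log2(w * k))
    print("threshold:       ~", round(lower), "to", round(upper))

    # 4. Standard sigmoid tanh(x): between ~ wk and ~ w**4 k**2.
    print("tanh:            ~", w * k, "to", w**4 * k**2)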

============================================================================
The paper can be retrieved from the DIMACS archive:

    http://dimacs.rutgers.edu/TechnicalReports/1996.html

as well as from Sontag's HomePage:

    http://www.math.rutgers.edu/~sontag

(follow the link to "online papers").
Many other related papers can also be found at the latter site.

If Web access is inconvenient, it is also possible to use anonymous FTP:

   ftp dimacs.rutgers.edu
   login: anonymous
   cd pub/dimacs/TechnicalReports/TechReports/1996/
   bin
   get 96-56.ps.gz

Once the file is retrieved, use gunzip to uncompress it and then print the
resulting PostScript file.
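For example (the exact print command depends on your local setup; lpr is only
one common choice):

   gunzip 96-56.ps.gz
   lpr 96-56.ps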
============================================================================
Comments welcome.

Happy connecting holidays!

