Three *** UNRELATED *** TRs available

Eduardo Sontag sontag at control.rutgers.edu
Fri Jul 7 10:27:06 EDT 1995


(NOTE: The following three TRs are NOT in any way related to each other,
but the announcements are being bundled into one, as requested by the list moderator.)

******************************************************************************
Subject: TR (keywords: VC dimension lower bounds, feedforward neural nets)
The following preprint is available by FTP:

	Neural networks with quadratic VC dimension
	     Pascal Koiran and Eduardo Sontag

Abstract:

This paper shows that neural networks which use continuous activation functions
have VC dimension at least as large as the square of the number of weights w.
This result settles a long-standing open question, namely whether the
well-known O(w log w) upper bound, valid for hard-threshold nets, also holds for
more general sigmoidal nets.  Implications for the number of samples needed for
valid generalization are discussed.
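For a rough sense of the gap, the following sketch (illustrative only; the growth orders are from the abstract above, the constants and sample values are not) compares the order of the classical w log w threshold-net bound with the quadratic lower bound:

```python
# Illustrative comparison of VC-dimension growth orders as the number
# of weights w increases: the classical O(w log w) order known for
# hard-threshold nets vs. the quadratic lower bound shown for
# continuous activations.  Constants are omitted; only orders shown.
import math

def threshold_bound_order(w):
    # order of the classical upper bound for hard-threshold nets
    return w * math.log2(w)

def quadratic_lower_bound(w):
    # order of the lower bound for continuous-activation nets
    return w * w

for w in (10, 100, 1000):
    print(w, round(threshold_bound_order(w)), quadratic_lower_bound(w))
```

The quadratic term dominates quickly: already at w = 1000 the gap between the two orders is roughly two orders of magnitude.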

Retrieval: FTP anonymous to:
  math.rutgers.edu
  cd pub/sontag
  file: quadratic-vc.ps.z

(compressed postscript file; for further information, see the files README and
CONTENTS in that directory)

(Note: also available as NeuroCOLT Technical Report NC-TR-95-044)

		**** NO HARDCOPY AVAILABLE ****
******************************************************************************
Subject: TR (keywords: VC dimension, learning, dynamical systems, recurrence)
The following preprint is available by FTP:

    Sample complexity for learning recurrent perceptron mappings
        	  Bhaskar Dasgupta and Eduardo Sontag

Abstract:

This paper deals with learning-theoretic questions involving the
identification of linear dynamical systems (in the sense of control theory),
and especially with the binary-classification version of these, "recurrent
perceptron classifiers".  These latter classifiers generalize the classical
perceptron model: they take into account the correlations and dependencies
among input coordinates that arise from linear digital filtering.  The paper
provides tight theoretical bounds on the sample complexity associated with
fitting recurrent perceptrons to experimental data.  The results are based
on VC-dimension theory and PAC learning, as well as on recent computational
complexity work in elimination methods.
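As a rough illustration only (an assumption on my part; the report defines the model precisely), a recurrent perceptron can be pictured as a linear recurrence applied to the input sequence, followed by a sign threshold:

```python
# Hypothetical sketch of a "recurrent perceptron": the input sequence u
# is passed through a linear recurrence with feedback coefficients c,
# and the final state is thresholded to give a binary label.
# (The exact model and parameterization are defined in the report.)

def recurrent_perceptron(u, c):
    # u: input sequence; c: feedback coefficients (c[0] weights y[t-1],
    # c[1] weights y[t-2], and so on); zero initial state assumed.
    y = [0.0] * len(c)
    for x in u:
        past = reversed(y[-len(c):])          # y[t-1], y[t-2], ...
        y.append(x + sum(ci * yi for ci, yi in zip(c, past)))
    return 1 if y[-1] >= 0 else -1
```

With a single feedback tap, `recurrent_perceptron([1, -2, 3], [0.5])` classifies the sequence by the sign of the filtered output; the point of the report's bounds is how many such labeled sequences are needed to fit the coefficients reliably.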

Retrieval: FTP anonymous to:
  math.rutgers.edu
  cd pub/sontag
  file: vcdim-signlinear.ps.z
(compressed postscript file; for further information, see the files README and
CONTENTS in that directory)

Note: The paper is available also as DIMACS Technical Report 95-17, and can be
obtained by ftp to dimacs.rutgers.edu (IP address = 128.6.75.16),
login anonymous, in dir "pub/dimacs/TechnicalReports/TechReports"

              ****  NO HARDCOPY AVAILABLE ****
******************************************************************************
Subject: TR (keywords: feedforward networks, local minima, critical points)
The following preprint is available by FTP:

   Critical points for least-squares problems involving certain analytic
              functions, with applications to sigmoidal nets
		        Eduardo D. Sontag

Abstract:

This paper deals with nonlinear least-squares problems involving the fitting
of parameterized analytic functions to data.
For generic regression data, a general result establishes the countability,
and under stronger assumptions the finiteness, of the set of functions giving
rise to critical points of the quadratic loss function.
In the special case of what are usually called "single-hidden-layer neural
networks," which are built upon the standard sigmoidal activation tanh(x)
(or, equivalently, 1/(1+e^{-x})), a rough upper bound for this cardinality
is provided as well.
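The equivalence of the two activations mentioned above can be checked numerically; this small sketch verifies the standard identity 1/(1+e^{-x}) = (1 + tanh(x/2))/2, which is why results for tanh nets transfer to logistic nets:

```python
# Check that the logistic sigmoid is an affine rescaling of tanh:
#   1/(1 + e^{-x}) == (1 + tanh(x/2)) / 2
# so a network built on one activation can be rewritten in the other.
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

for x in (-3.0, -0.5, 0.0, 1.5, 4.0):
    assert abs(logistic(x) - (1.0 + math.tanh(x / 2.0)) / 2.0) < 1e-12
```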

Retrieval: FTP anonymous to:
  math.rutgers.edu
  cd pub/sontag
  file: crit-sigmoid.ps.z
(compressed postscript file; for further information, see the files README and
CONTENTS in that directory)

		**** NO HARDCOPY AVAILABLE ****
******************************************************************************
			Eduardo D. Sontag
	URL for Web HomePage: http://www.math.rutgers.edu/~sontag/
******************************************************************************