number of hidden neurons

Eduardo Sontag sontag at fermat.rutgers.edu
Tue Nov 14 16:29:52 EST 1989


In the context of the discussions about the number of hidden neurons, it seems
relevant to point out a short note of mine, to appear in Neural Computation,
Number 4, entitled:
	SIGMOIDS DISTINGUISH MORE EFFICIENTLY THAN HEAVISIDES
I prove (this is the abstract):
  Every dichotomy on a $2k$-point set in $\R^N$ can be implemented by a neural
  net with a single hidden layer containing $k$ sigmoidal neurons.  If the
  neurons were of a hard-limiter (Heaviside) type, $2k-1$ would in general be
  needed.
So using sigmoids one needs only about half as many neurons, a fact which might
not be totally obvious at first sight (but which is indeed easy to prove).
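
To make the contrast concrete, here is a minimal numerical sketch (in Python;
the weights are hand-picked for illustration and are not the construction from
the note).  Take 2k=4 collinear points with alternating labels.  A sum of k
Heaviside units is piecewise constant with at most k jumps along the line, so
this dichotomy forces 2k-1=3 such units.  A sum of k=2 sigmoids, whose
derivatives never vanish, can change sign three times, because a steep unit's
derivative can poke through the tail of a shallow unit's:

  import numpy as np

  def sigmoid(u):
      return 1.0 / (1.0 + np.exp(-u))

  # Four collinear points with alternating labels: the worst case in 1-d.
  x = np.array([0.0, 1.0, 2.0, 3.0])
  labels = np.array([+1, -1, +1, -1])

  # Two sigmoid hidden units, both centered at 1.5: one steep, one shallow.
  # Output: f(x) = 5*sigmoid(10*(x - 1.5)) - 10*sigmoid(x - 1.5) + 2
  f = 5.0 * sigmoid(10.0 * (x - 1.5)) - 10.0 * sigmoid(x - 1.5) + 2.0

  print(np.round(f, 3))                 # [ 0.176 -1.742  0.742 -1.176]
  print((np.sign(f) == labels).all())   # True: k=2 sigmoids handle 2k=4 points

Replacing the two sigmoids by Heavisides gives a piecewise-constant output with
at most two jumps, which cannot match the three sign changes this labeling
requires.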

The first paragraph of the intro is:

  The main point of this note is to draw attention to the fact mentioned in the
  title, that sigmoids have different recognition capabilities than
  hard-limiting nonlinearities.  One way to exhibit this difference is through
  a worst-case analysis in the context of binary classification, and this is
  done here.  Results can also be obtained in terms of VC dimension, and work
  is in progress in that regard.

(Added note: it has now been established that, in 2-d, the VC dimension of
k-neuron nets is 4k.  This is not written up yet, though.  I also have an
example of a family of boolean functions that can be computed with a fixed
number of sigmoidal hidden units but which --I conjecture-- requires a growing
number of Heavisides.)

-eduardo
Eduardo D. Sontag

(Phone: (201)932-3072; dept.: (201)932-2390)
sontag at fermat.rutgers.edu
...!rutgers!fermat.rutgers.edu!sontag
sontag at pisces.bitnet

