TR available: Complexity of Training
Siegelmann Hava
hava at bimacs.cs.biu.ac.il
Mon Dec 13 13:46:31 EST 1993
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/siegelmann.training.ps.Z
The file siegelmann.training.ps.Z is now available for copying from the
Neuroprose repository:
=====================================================================
On the Complexity of Training Neural Networks with Continuous
Activation Functions (30 pages)
Bhaskar DasGupta        Hava T. Siegelmann        Eduardo D. Sontag
    Minnesota                Bar-Ilan                  Rutgers
======================================================================
We address the computational complexity of loading a fixed-architecture
neural network with a set of positive and negative examples.
This is the first hardness result for loading networks that do not
consist of binary-threshold neurons but rather use a particular
continuous activation function commonly employed in the neural-network
literature.
We observe that the loading problem is solvable in polynomial time if
the input dimension is constant. Otherwise, however, any possible
learning algorithm based on a particular fixed architecture faces
severe computational barriers.
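
As a rough illustration of what "loading" means (this sketch is ours,
not the paper's construction: the toy two-input architecture, the
saturated-linear activation, and the brute-force grid search are all
illustrative assumptions), the following Python fragment searches for
weights of a fixed architecture that put every positive example at or
above 1/2 and every negative example below 1/2:

    import itertools

    def sat(x):
        # Saturated-linear activation: continuous and piecewise linear
        # (an illustrative choice, not necessarily the paper's).
        return max(0.0, min(1.0, x))

    def net(w, x):
        # Fixed toy architecture: two inputs -> one hidden unit -> output.
        w1, w2, b, v, c = w
        h = sat(w1 * x[0] + w2 * x[1] + b)
        return sat(v * h + c)

    def load(pos, neg, grid):
        # Brute-force search for a weight setting consistent with all
        # examples; exhaustive, hence exponential in the number of weights.
        for w in itertools.product(grid, repeat=5):
            if all(net(w, p) >= 0.5 for p in pos) and \
               all(net(w, n) < 0.5 for n in neg):
                return w
        return None  # no weight setting on this grid loads the data

    # AND-like data set: loadable by the toy architecture.
    pos = [(1, 1)]
    neg = [(0, 0), (0, 1), (1, 0)]
    print(load(pos, neg, grid=[-1.0, -0.5, 0.0, 0.5, 1.0]))

The sketch only makes the problem statement concrete; the hardness
results concern what happens when the input dimension, and hence the
number of weights, is allowed to grow.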
Similar theorems were proved earlier by Megiddo and by Blum and
Rivest, but for the case of binary-threshold networks only.
Our theoretical results lend further justification to the use of
incremental (architecture-changing) techniques for training networks.