Turing Machines and Connectionist networks

Nici Schraudolph schraudo%cs at ucsd.edu
Thu Jan 18 13:08:17 EST 1990


>From: Jim Hendler <hendler at cs.umd.edu>
>We've sketched out a proof that shows that non-recurrent
>back-propagation learning cannot be Turing equivalent (we
>can show a class of Turing computable functions which such
>a machine could not learn [...]

Wait a minute - did anybody ever claim that backprop could LEARN
any Turing-computable function?  It seems clear that this is not
the case: since backprop is a gradient-descent method, all you
have to do is construct a function whose solution in weight space
is surrounded by a local-minimum "moat" in the error surface, so
that descent from any starting point outside the moat gets
trapped before it can ever reach the solution.
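
Purely as an illustration, here is a minimal one-dimensional
sketch of that failure mode: plain gradient descent on an invented
error function whose global minimum sits behind exactly such a
moat.  The function, learning rate, and starting weight are all
made up for the example; nothing here is specific to backprop or
to any particular net.

# Gradient descent trapped by a local-minimum moat.
# E(w) = (w**2 - 1)**2 + 0.3*w has its global minimum near
# w = -1.04 but a shallower local minimum near w = +0.96.

def error(w):
    return (w**2 - 1)**2 + 0.3 * w

def gradient(w):
    return 4 * w * (w**2 - 1) + 0.3

w = 1.5                      # start on the wrong side of the moat
for _ in range(1000):
    w -= 0.01 * gradient(w)  # plain gradient descent

print(w, error(w))           # settles near w = 0.96, E = 0.29
print(-1.04, error(-1.04))   # global minimum, E = -0.31, never reached

Descent from w = 1.5 can only slide into the nearer basin; no step
of pure gradient descent ever crosses the ridge between the two
minima, no matter how long you run it.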

The question was whether a neural net could COMPUTE any
Turing-computable function, given the right set of weights A PRIORI.
The answer to that depends on what class of architectures you
mean by "neural net": in general such nets are obviously Turing
equivalent, since you can construct a Turing Machine from
connectionist components; more restricted classes, such as
one-hidden-layer feedforward nets, are where it gets interesting.
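
For the constructive direction, the key fact is that a single
binary threshold unit can compute NAND, and NAND is universal for
Boolean logic, so the finite-state control of a Turing Machine can
be wired up from such units.  (The unbounded tape still has to
come from somewhere: a fixed finite net by itself is only a finite
automaton.)  A small sketch, with weights picked by hand rather
than learned:

# McCulloch-Pitts style threshold unit: fires iff the weighted
# sum of its inputs plus the bias is positive.
def threshold_unit(weights, bias, inputs):
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s > 0 else 0

# NAND from a single unit; weights and bias chosen by hand.
def nand(a, b):
    return threshold_unit([-1, -1], 1.5, [a, b])

# Any Boolean function is a composition of NANDs; XOR as a demo.
def xor(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, nand(a, b), xor(a, b))

The same hand-wiring gives AND, OR, NOT, and hence any transition
table, which is the sense in which you can build a Turing Machine
from connectionist components.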

--
Nici Schraudolph, C-014                nschraudolph at ucsd.edu
University of California, San Diego    nschraudolph at ucsd.bitnet
La Jolla, CA 92093                     ...!ucsd!nschraudolph

