Are Neural Nets Turing Equivalent?
FRANKLINS%MEMSTVX1.BITNET@VMA.CC.CMU.EDU
Wed Sep 19 17:01:00 EDT 1990
Here are some additional references on the question of
the Turing equivalence of neural networks. Throughout,
"neural network" refers to networks that are discrete
both in time and in activation values.
In their first paper, McCulloch and Pitts showed that
logical gates can easily be simulated by threshold
networks. They also claimed, but did not prove, Turing
equivalence.
W.S. McCulloch and W. Pitts, "A logical calculus
of the ideas immanent in nervous activity", Bull.
Math. Biophys. 5(1943) 115--133.
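The gate simulation is easy to illustrate. Below is a minimal
sketch (my own weights and thresholds, chosen for illustration, not
taken from the paper) of a McCulloch-Pitts style threshold unit
realizing AND, OR, and NOT:

```python
# A threshold unit fires (outputs 1) when the weighted sum of its
# binary inputs reaches the threshold theta, and is silent (0) otherwise.
def threshold_unit(weights, theta, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

# Logic gates as single threshold units; these weight/threshold
# choices are one standard assignment among many that work.
def AND(a, b): return threshold_unit([1, 1], 2, [a, b])
def OR(a, b):  return threshold_unit([1, 1], 1, [a, b])
def NOT(a):    return threshold_unit([-1], 0, [a])
```

Since any Boolean circuit is built from such gates, a feedforward
threshold network can compute any Boolean function.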
Hartley and Szu noted that finite neural networks were
computationally equivalent to finite state machines.
They also asserted Turing equivalence of potentially
infinite (unbounded) neural networks and sketched a
proof that a Turing machine can be simulated by a
neural network.
R. Hartley and H. Szu, "A Comparison of the
Computational Power of Neural Network Models",
in Proc. IEEE First International Conference
on Neural Networks (1987) III 17--22.
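The finite-state side of this equivalence can be seen in a toy
example of my own devising (not from the paper): the two-state
parity automaton, realized by a small recurrent threshold network.
XOR is not linearly separable, so the state update needs two layers
of threshold units rather than one:

```python
def threshold(weights, theta, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def parity_step(s, x):
    # New state = XOR(s, x), built from three threshold units in two
    # layers: OR and NAND feed into AND.
    h_or   = threshold([1, 1], 1, [s, x])
    h_nand = threshold([-1, -1], -1, [s, x])
    return threshold([1, 1], 2, [h_or, h_nand])

# Run the network over an input stream; the recurrent unit's
# activation is exactly the automaton's state.
s = 0
for x in [1, 0, 1, 1]:
    s = parity_step(s, x)
# s is now the parity of the stream (here 1, for three 1-bits)
```

With only finitely many units and finitely many activation values,
the network has finitely many global states, so it can do no more
than a finite state machine; the construction above shows it can do
no less.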
Max Garzon and I gave a detailed description of a
neural network simulation of an arbitrary Turing
machine. The network would stabilize if and only if the
Turing machine halts. Thus the stability problem for
neural networks turns out to be Turing unsolvable. One
could argue, I think, that our unbounded neural network
simulation of a Turing machine even has a finite
description.
Stan Franklin and Max Garzon, "Neural
Computability" in O. M. Omidvar, ed., Progress in
Neural Networks, vol 1, Ablex, Norwood NJ, 1990.
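The reduction behind the unsolvability result can be sketched at
the configuration level. The following toy code is my own
illustration, not the construction in the paper: the network's
global state plays the role of a Turing machine configuration, one
update of the dynamics plays the role of one machine step, and a
halted machine yields a fixed point. A decision procedure for
stability would therefore decide halting:

```python
# One update of the "network": apply the machine's transition
# function delta to a configuration (state, tape, head position).
def step(delta, config):
    state, tape, head = config
    if state == 'halt':
        return config  # halted machines are fixed points of the dynamics
    sym = tape.get(head, '_')          # '_' is the blank symbol
    new_state, write, move = delta[(state, sym)]
    tape = dict(tape)
    tape[head] = write
    return (new_state, tape, head + (1 if move == 'R' else -1))

# A trivial machine that writes a single 1 and halts.
delta = {('q0', '_'): ('halt', '1', 'R')}
c = ('q0', {}, 0)
c = step(delta, c)
# The dynamics have stabilized, because the machine halted:
# step(delta, c) == c
```

A machine that never halts gives dynamics that never reach a fixed
point, so stability is exactly as hard as halting.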
Unbounded neural networks (without finite descriptions)
are strictly more powerful than Turing machines. Such a
beast, if there were one, could solve the halting
problem, for example, by essentially reducing it to a
lookup table. But neural networks are computationally
equivalent to cellular automata for graphs of finite
bandwidth. Max and I proved this using a universal
neural network.
Max Garzon and Stan Franklin, "Computation on
graphs", in O. M. Omidvar, ed., Progress in Neural
Networks, vol 2, Ablex, Norwood NJ, 1990, to
appear.
Max Garzon and Stan Franklin, "Neural computability II",
Proc. 3rd Int. Joint Conf. on Neural Networks,
Washington, D.C., 1989, I, 631--637.
Stan Franklin
Math Sciences
Memphis State
Memphis TN 38152
BITNET:franklins at memstvx1