Parallelism, Real vs. Simulated: A Query

aboulang@WILMA.BBN.COM aboulang at WILMA.BBN.COM
Mon Oct 9 19:16:28 EDT 1989


Stevan Harnad (harnad at confidence.princeton.edu) writes:

  I have a simple question: What capabilities of PDP systems do and
  do not depend on the net's actually being implemented in parallel,
  rather than just being serially simulated? Is it only speed and
  capacity parameters, or something more?


The more general question of the difference between parallel
computation (under any means) and serial computation has interested me
for a number of years.  It is clear that synchronous parallelism and
serial computation are the same modulo the speedup. The harder question
is whether asynchronous parallelism computes the same way modulo the
speedup.
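The synchronous case can be made concrete in a few lines. The sketch below (my own toy net, with illustrative names) shows that a synchronous parallel update step is reproduced exactly by a serial loop, provided the serial loop double-buffers so every unit reads the same frozen snapshot of the old state:

```python
import random

def sign(x):
    return 1 if x >= 0 else -1

def step_parallel(state, weights):
    # "Parallel": every unit i computes its new value from the same
    # frozen snapshot of the state, all at once.
    return [sign(sum(w * s for w, s in zip(weights[i], state)))
            for i in range(len(state))]

def step_serial(state, weights):
    # Serial simulation: visit units one at a time, but write results
    # into a separate buffer so later units still see the old state.
    # This makes the serial loop equivalent to the parallel step.
    new = list(state)
    for i in range(len(state)):
        new[i] = sign(sum(w * s for w, s in zip(weights[i], state)))
    return new

random.seed(1)
n = 8
W = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
s0 = [random.choice([-1, 1]) for _ in range(n)]
assert step_parallel(s0, W) == step_serial(s0, W)
```

The only cost of the serial simulation is time; the trajectory through state space is identical.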

The computability results for asynchronous computation had their start
with David Muller ("Infinite Sequences and Finite Machines", Switching
Circuit Theory and Logical Design: Proc. 4th Ann. Symp. IEEE pp
3-16, 1963), who used as an example two asynchronous fed-back NOT
circuits. The analysis led to the work on omega-languages, languages of
infinite sequences defined over subsets of all the possible states that
such a combined circuit could produce. The computability results for
omega-languages are that they are no more powerful than sequential
Turing machines.
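Muller's two-inverter example is easy to play with. The sketch below is my own construction, not Muller's formalism: two NOT gates, each fed back on itself, with a random scheduler standing in for physical delays. Each run is a prefix of an infinite word over the four joint states, which is the kind of object omega-language theory classifies:

```python
import random

random.seed(0)
a, b = 0, 0            # current outputs of the two inverters
trace = [(a, b)]
for _ in range(20):
    # Asynchrony: at each step the scheduler picks one gate to fire.
    if random.random() < 0.5:
        a = 1 - a      # gate A inverts its own fed-back output
    else:
        b = 1 - b      # gate B fires instead
    trace.append((a, b))

print(trace[:8])
```

Different scheduler choices give different infinite words; the set of all such words is the omega-language the circuit generates.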

Another line of reasoning is to ask whether parallel-asynchronous
dynamical computational systems have a different kind of dynamics than
sequential dynamical systems and whether the computational abilities
of the system can make use of this difference in dynamics.
Asynchrony or time delays can act as a source of noise that helps
settle stochastic networks. See for example the discussion of time
delays in "Separating Figure from Ground with a Boltzmann Machine",
Terrence Sejnowski and Geoffrey Hinton, in Vision, Brain, and
Cooperative Computation, M.A. Arbib & A.R. Hanson eds., MIT Press
1985, and "Effects of Connection Delays in Two State Model Neural
Circuits", Peter Conwell, IEEE First Conference on Neural Networks,
III-95 - III-104. Note that in the neural network research that comes
from the spin-glass people, the term "asynchronous updates" normally
means sequential updates. There has been some work on the convergence of
neural nets with asynchronous-parallel updates. See for example
'"Chaotic Relaxation" in Concurrently Asynchronous Neurodynamics',
Jacob Barhen and Sandeep Gulati, IJCNN 1989, pp I-619 - I-626, and
"Partially Asynchronous, Parallel Algorithms for Network Flow and
Other Problems", P. Tseng, D.P. Bertsekas, and J.N. Tsitsiklis, Center
for Intelligent Control Systems Report CICS-P-91, November 1988.  The
dynamics of such systems is related to the study of time-delay
equations in the dynamical system literature. Work has been done on
the routes to chaos, etc., in such equations with constant delay, but
little or no work has been done with variable delay, which is the case
with asynchronous parallelism. This is an area that I am actively
studying.  Finally, the computational abilities of general analog
dynamical systems have been studied by several people including
Bradley Dickinson, EE Dept., at Princeton, so you may want to talk to
him. His results indicate that the hardness of NP-complete problems
translates into exponential scaling of analog resources.
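The spin-glass sense of "asynchronous" matters for dynamics, and a two-unit example (my own, not from the papers above) shows why. With symmetric mutual inhibition, a truly synchronous update oscillates forever, while sequential ("asynchronous" in the spin-glass sense) updates settle to a fixed point:

```python
W = [[0, -1], [-1, 0]]   # symmetric weights: mutual inhibition

def sign(x):
    return 1 if x >= 0 else -1

def sync_step(s):
    # All units recomputed at once from the same old state.
    return [sign(sum(W[i][j] * s[j] for j in range(2))) for i in range(2)]

def seq_step(s):
    # Units updated one at a time, each seeing the other's new value.
    s = list(s)
    for i in range(2):
        s[i] = sign(sum(W[i][j] * s[j] for j in range(2)))
    return s

s = [1, 1]
print(sync_step(s), sync_step(sync_step(s)))  # period-2 oscillation
print(seq_step(s))                            # settles to a fixed point
```

From [1, 1] the synchronous rule bounces between [-1, -1] and [1, 1] forever, while one sequential sweep reaches [-1, 1] and stays there; this is the standard energy-descent argument for sequential Hopfield updates in miniature.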

I believe that some further insight to your question can be had by
using Kolmogorov-Chaitin algorithmic complexity applied to dynamical
systems that compute. Algorithmic complexity has been applied to
dynamical systems to help distinguish and separate the notions of
determinism and randomness in such systems. See the excellent paper
"How Random is a Coin Toss?", Joseph Ford, Physics Today, April 1983,
40-47. One way
to summarize this work is to say that a pseudo-random number generator
(which is just an iterated nonlinear map) could in principle generate
truly random numbers if seeded with an infinite-algorithmic-complexity
number. Another fact to appreciate is that most of the real line is
made up of numbers that are algorithmically incompressible - that is,
they have maximal algorithmic complexity for their length as decimal
or binary expansions. Such numbers would take an infinite amount
of time and space resources to compute their expansions - this
space-time notion is a variant of the algorithmic complexity measure
sometimes called Q-logical depth. 
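The point about pseudo-random generators being iterated nonlinear maps can be seen with the logistic map x -> 4x(1-x), a standard example (my choice of map and seeds is illustrative). A finite-precision seed has finite algorithmic complexity, and the sensitivity shown below is why only an infinite-complexity seed could keep paying out genuinely new random bits forever:

```python
def diverge(x, y, n):
    """Iterate the logistic map from two seeds, tracking the worst gap."""
    worst = 0.0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        y = 4.0 * y * (1.0 - y)
        worst = max(worst, abs(x - y))
    return worst

# Two seeds agreeing to 12 decimal digits decorrelate completely
# within a few dozen iterations:
print(diverge(0.123456789, 0.123456789 + 1e-12, 60))
```

After roughly 40 iterations the two orbits are unrelated, so all the "randomness" the generator will ever emit was already present, compressed, in the digits of the seed.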

One would think that a computational system that could use such a
powerful number could compute beyond the capabilities (either in speed
or in computability) of computational systems that don't have access
to such numbers. (For example, the digits of Chaitin's OMEGA number
solve the halting problem by their very definition. See for example
"Randomness in Arithmetic", Gregory Chaitin, Scientific American, July
1988, 80-85.)
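The way OMEGA's digits solve halting can be sketched in miniature. Below is a toy of my own construction, not Chaitin's encoding: a finite "language" of programs with known lengths and halting times, with the exact Omega standing in for a sufficiently long truncation. A dovetailer runs everything in parallel; once the halting-probability mass it has observed reaches Omega, any program still running can never halt:

```python
from fractions import Fraction

programs = {                 # name: (length in bits, steps to halt or None)
    "p1": (2, 3),
    "p2": (3, None),         # a looper
    "p3": (3, 7),
    "p4": (4, None),
}

# Omega for this toy universe: sum of 2**-length over halting programs.
omega = sum(Fraction(1, 2**l) for l, t in programs.values() if t is not None)

def halts(name, omega):
    """Decide halting of `name` by dovetailing all programs."""
    mass = Fraction(0)
    halted = set()
    for step in range(1, 1000):          # dovetail: run everyone one step
        for p, (l, t) in programs.items():
            if p not in halted and t is not None and t <= step:
                halted.add(p)
                mass += Fraction(1, 2**l)
        if name in halted:
            return True
        if mass >= omega:                # all halting mass accounted for,
            return False                 # so `name` can never halt
    raise RuntimeError("need more Omega bits")

assert halts("p3", omega) is True
assert halts("p2", omega) is False
```

The whole trick is that Omega compresses the answers to infinitely many halting questions into one incompressible real; the dovetailer merely decompresses it.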

In my mind the question of computability for analog and
parallel-asynchronous architectures is precisely whether these
constructs can implicitly or explicitly use numbers like Chaitin's
OMEGA to do computation faster, to scale better, or to answer normally
uncomputable questions, relative to standard computer models. An
example of explicit representation is representing a number like OMEGA
as an analog value. The fact that values at some level become
quantal makes this unlikely. An example of an implicit representation
is the possible irrational timing relationships between the
self-timings of the elements of an asynchronous computing device. I
have been thinking about this implicit possibility for a while, but the
quantization of space-time at Planck lengths would eliminate this too.
I have not given up on this, though, since quantization only prevents us
from using infinite-algorithmic-complexity numbers. I still think we can do
something with large algorithmic-complexity numbers - in either speed
or scaling.

Some indication of how hard it would be to go beyond normal "digital"
computational devices in their abilities comes from a line of inquiry
in dynamical systems. This started when people asked a natural
question, "Is it reasonable to simulate chaos on the computer?"
Remarkably, the trajectories of a dynamical system simulation on a
computer can "shadow" true orbits for very long lengths of time. See
for example "In the Shadows of Chaos", Ivar Peterson, Science News,
Vol 134, December 3 1988, 360-361, and "Numerical Orbits of Chaotic
Processes Represent True Orbits", Stephen Hammel, James Yorke, and
Celso Grebogi, Bulletin of the AMS, Vol 19, No 2, October 1988,
465-469. A very similar question is being explored by researchers in
quantum chaos. Here the phenomenon is called "phase mixing": a
quantum system (which cannot have chaos, given its mathematical
form) will mix in much the same way its analogous classical system
does, but reversibly - unlike its ergodic counterpart. For very long
periods of time quantum systems can emulate chaotic classical systems.
See for example "Classical Mechanics, Quantum Mechanics, and the Arrow
of Time", T.A. Heppenheimer, Mosaic, Vol 20, No. 2, Summer 1989, 2-11.
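The shadowing question is easy to pose numerically. In this sketch (my own demonstration, with the logistic map standing in for a generic chaotic system) the same seed is iterated in double precision and in simulated single precision; the two numerical orbits part company within a few dozen steps, yet the shadowing results say each one stays close to some true orbit of the map:

```python
import struct

def f32(x):
    """Round to IEEE single precision, mimicking a coarser machine."""
    return struct.unpack("f", struct.pack("f", x))[0]

x64 = 0.3                      # double-precision orbit
x32 = f32(0.3)                 # single-precision orbit, same seed
first_bad = None
for n in range(1, 200):
    x64 = 4.0 * x64 * (1.0 - x64)
    x32 = f32(4.0 * x32 * (1.0 - x32))  # roundoff injected each step
    if first_bad is None and abs(x64 - x32) > 0.1:
        first_bad = n

print("orbits disagree grossly by step", first_bad)
```

Neither orbit is the "true" one, so the surprise in the shadowing theorems is not that roundoff matters but that a genuine orbit nonetheless tracks the computed one for long stretches.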

In closing, questions similar to yours have been asked about computers
based on the rules of Quantum Mechanics. (Quantum Computers and
computing-in-the-small was an area of investigation of Richard Feynman
during his last years. Richard Feynman, John Hopfield, and Carver Mead
taught a course on the physics of computation that was a seed for Carver
Mead's new book, "Analog VLSI and Neural Systems".) David Deutsch has
claimed in his paper "Quantum Theory, the Church-Turing Principle and
the Universal Quantum Computer", Proc. R. Soc. Lond., A400, 1985,
97-117 that quantum computers would be more powerful (in the
computability sense) than the classical Turing machine. An example of
this power is using a quantum computer to compute truly random
numbers, or to prove that the Many-Worlds Interpretation of QM is the
true one. Deutsch proposes a modified version of the Church-Turing
hypothesis to cover the case of quantum computers, or what he calls
"quantum parallelism". Contrasting with this is the work of T. Erber,
who has been looking for evidence of pseudo-random, as opposed to random,
signatures in single-atom resonance experiments.  See for example
"Randomness in Quantum Mechanics - Nature's Ultimate Cryptogram?", T.
Erber & S. Putterman, Nature, Vol 318, 7 November 1985, 41-43.
