Summary of responses: usefulness of chaotic dynamics

Pankaj Mehra p-mehra at uiuc.edu
Mon Feb 4 13:58:19 EST 1991


	**** PLEASE DO NOT FORWARD TO OTHER LISTS/BULLETIN BOARDS ****

A few notes:

1. If replying to my message did not work, try one of the e-mail addresses
   at the end of my original message.

2. The comment: ``Chaos is an antithesis of generalization, a defining
   trait of connectionist models,'' was mine, not Hopfield's.

3. All the responses so far seem to suggest that chaos is useful in an
   evolving system. We can have a more focussed discussion if we can answer:

   3a. Is chaos a precise quantitative way of stating one's ignorance of
	the dynamics of the process being modeled/controlled?

   3b. What methods are available for implementing controlled chaos?

   3c. How can chaos and learning be integrated in neural networks?

   Of course, discussion on cognitive/engineering significance of chaos
   is still welcome.

---------

RESPONSES RECEIVED

Paraphrased portions are enclosed in {}.

**********************************************************************
From: Richard Rohwer <rr at cstr.edinburgh.ac.uk>

I think that it is still an open question what sort of dynamics is
cognitively useful.  I can see the sense in Hopfield's intuition that
convergent dynamics is good for generalization, but this doesn't really
rule out chaotic dynamics, because although trajectories don't converge
to fixed points, they do converge to attractors.  Even if the
attractors are chaotic, they lie "arbitrarily close" (in an epsilon-
delta sense) to specific manifolds in state space which stay put.  So
there is a contracting mapping between the basin of attraction and the
attractor.
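{ An aside from the moderator: a minimal sketch (my own illustration, not
from Rohwer's message) of this point, using the one-dimensional logistic
map in a chaotic regime -- nearby trajectories separate exponentially,
yet every orbit stays confined to a bounded attracting set: }

```python
# Logistic map in a chaotic regime: sensitive to initial conditions,
# yet every orbit remains on a bounded attracting set inside [0, 1].
def logistic(x, r=3.9):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9        # two nearby initial conditions
max_sep, bounded = 0.0, True
for t in range(60):
    x, y = logistic(x), logistic(y)
    max_sep = max(max_sep, abs(x - y))
    bounded = bounded and 0.0 < x < 1.0 and 0.0 < y < 1.0

# The trajectories have separated by a macroscopic amount, but neither
# has ever left the bounded set -- divergence within a set that stays put.
print(max_sep > 1e-2, bounded)
```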

Anyway, I don't accept that generalization is the only phenomenon
in cognition.  I find it at least plausible that thought processes
do come to critical junctures at which small perturbations can
make large differences in how the thoughts evolve.

{ Pointers to relevant papers:

Steve Renals and Richard Rohwer, "A study of network dynamics",
	J. Statistical Physics, vol. 58, pp.825-847, 1990.

Stephen J. Renals, "Speech and Neural Network Dynamics", Ph.D. thesis,
	Edinburgh University, 1990.

Steve Renals, "Chaos in neural networks" in Neural Networks (Almeida and
	Wellekens, eds.), Lecture Notes in Computer Science 412,
	Springer-Verlag, Berlin, pp. 90-99, 1990. }

**********************************************************************
{some portions deleted}

From: Jordan B Pollack <pollack at cis.ohio-state.edu>

There is a lot more to chaos than sensitivity to initial conditions;
there are self-organizing dynamics and computational properties beyond
simple limit-point systems.  Chaos is almost unavoidable in NN, and
has to be suppressed with artifacts like symmetric weights and
synchronous recurrence. If it is so prevalent in nature, it must be
adaptive!  (If your heart converges, you die; if your brain converges,
you die.)
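{ An aside from the moderator: the remark about symmetric weights can be
illustrated with a small sketch (my own construction, assuming a standard
Hopfield-style net, not code from the message).  With symmetric weights,
zero self-connections, and asynchronous threshold updates, the usual
energy function never increases, so the dynamics can only settle into a
fixed point -- chaos is suppressed by construction: }

```python
import random

random.seed(0)
n = 8

# Symmetric random weights with zero diagonal -- the "artifact" that
# forces convergence in Hopfield-style networks.
W = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        W[i][j] = W[j][i] = random.gauss(0.0, 1.0)

def energy(s):
    # Standard Hopfield energy E = -1/2 * sum_ij W_ij s_i s_j
    return -0.5 * sum(W[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(n))

s = [random.choice([-1, 1]) for _ in range(n)]
energies = [energy(s)]
for sweep in range(200):          # asynchronous single-unit updates
    changed = False
    for i in range(n):
        h = sum(W[i][j] * s[j] for j in range(n))
        new = 1 if h >= 0 else -1
        if new != s[i]:
            s[i], changed = new, True
    energies.append(energy(s))
    if not changed:               # no unit moved: a fixed point
        break

monotone = all(b <= a + 1e-9 for a, b in zip(energies, energies[1:]))
print(monotone, not changed)      # energy never rose; state converged
```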

Hopfield is accurate in that neural networks which converge might be
as commercially useful as logic gates, but they won't address the
question of how high-dimensional dynamical systems self-organize into
complex algorithmic structures.

My research slogan might be "chaos in brain -> fractals in mind", and
I believe that dynamics more complex than convergence are necessary
for non-trivial neural cognitive models.  I had one paper in NIPS 1
outlining some research proposals, and have two forthcoming in NIPS 3.

There are groups in Tel-Aviv, Warsaw, and Edinburgh, at least, working
on complex neuro-dynamics.

**********************************************************************
From: andreas%psych at Forsythe.Stanford.EDU (Andreas Weigend)

{Pointers to related publications:

Andreas S. Weigend, Bernardo A. Huberman, and David E. Rumelhart, "Predicting
	the future: a connectionist approach", International Journal of Neural
	Systems, vol. 1, pp. 193-209, 1990.

Andreas S. Weigend, Bernardo A. Huberman, and David E. Rumelhart,
	"Back-propagation, weight-elimination and time series prediction" in
	Proc. 1990 Connectionist Models Summer School, Morgan Kaufmann,
	pp. 105-116, 1990.

Andreas S. Weigend, Bernardo A. Huberman, and David E. Rumelhart,
	1990 Lectures in Complex Systems, ?? (eds. Nadel and Stein),
	Addison-Wesley, 1991.

Andreas S. Weigend, Bernardo A. Huberman, and David E. Rumelhart,
	"Predicting Sunspots and Currency Rates with Connectionist Networks",
	in Proc. NATO Workshop on Nonlinear Modeling and Forecasting
	(Santa Fe, Sept. 1990). }

***********************************************************************
From: David Horn <HORN at vm.tau.ac.il>

I would like to point out the importance of dynamics which are
convergent on a short time scale and divergent on a long time scale.
We have worked on neural networks which display such behavior.
In particular we can model a system which converges to a set of fixed
points on a short time scale (thus performing some "useful"
computation), and is "free" to move between them on a longer time scale.
This kind of freedom stems from the unpredictability of chaotic systems.
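{ An aside from the moderator: a crude sketch of this idea (my own
construction, not Horn's published model).  A two-pattern Hopfield-style
net is augmented with a slow per-unit "fatigue" term: the fast threshold
dynamics pull the state into one stored pattern, then fatigue slowly
builds against the active configuration until that pattern destabilizes
and the state moves on -- convergent on the short time scale, itinerant
on the long one: }

```python
n = 10
p1 = [1, 1, 1, 1, 1, -1, -1, -1, -1, -1]   # two stored patterns
p2 = [1, -1, 1, -1, 1, -1, 1, -1, 1, -1]

# Hebbian weights for the two patterns, zero diagonal.
W = [[0.0 if i == j else (p1[i] * p1[j] + p2[i] * p2[j]) / n
      for j in range(n)] for i in range(n)]

s = p1[:]              # start in pattern 1's basin
f = [0.0] * n          # slow fatigue accumulators, one per unit
visited = []
for t in range(200):
    for i in range(n):                       # fast dynamics: threshold units
        h = sum(W[i][j] * s[j] for j in range(n)) - f[i]
        s[i] = 1 if h >= 0 else -1
    for i in range(n):                       # slow dynamics: fatigue tracks
        f[i] += 0.05 * (s[i] - f[i])         # the unit's recent activity
    overlaps = (sum(a * b for a, b in zip(s, p1)) / n,
                sum(a * b for a, b in zip(s, p2)) / n)
    visited.append(max(range(2), key=lambda k: abs(overlaps[k])))

# The state does not stay frozen in one pattern forever: both stored
# patterns are visited over the long run.
print(0 in visited and 1 in visited)
```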

{deleted; message directly mailed to Connectionists}

***********************************************************************
END OF RESPONSES

