papers available

Dezhe Jin djin at MIT.EDU
Sat Oct 26 22:23:38 EDT 2002


Dear Connectionists,

  The following two papers may be of interest to some of you. They can be
downloaded at http://hebb.mit.edu/~djin/index.html.

  Thanks!

  -Dezhe Jin

1. Fast Convergence of Spike Sequences to Periodic Patterns in Recurrent
Networks, Dezhe Z. Jin, Physical Review Letters, 89, 208102 (2002).

Abstract:

Dynamical attractors are thought to underlie many biological functions
of recurrent neural networks. Here we show that stable periodic spike
sequences with precise timings are the attractors of the spiking dynamics
of recurrent neural networks with global inhibition. Almost all spike
sequences converge within a finite number of transient spikes to these
attractors. The convergence is fast, especially when the global inhibition
is strong. These results support the possibility that precise
spatiotemporal sequences of spikes are useful for information encoding and
processing in biological neural networks.
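A rough feel for the dynamics described in this abstract can be had from a toy
simulation like the sketch below. It is not the paper's model: the leaky
integrate-and-fire update, the random excitatory weights, the spike-triggered
global inhibition term, and every parameter value are illustrative assumptions.

import numpy as np

# Toy sketch only: parameters, weights, and the update order are assumptions,
# not the paper's exact equations.
rng = np.random.default_rng(0)
N, T, dt = 20, 2000, 0.1             # neurons, time steps, step size
tau, v_th, v_reset = 10.0, 1.0, 0.0
W = 0.3 * rng.random((N, N))         # random excitatory recurrent weights
np.fill_diagonal(W, 0.0)
g_inh = 0.5                          # global inhibition delivered per spike
I_ext = 0.11 + 0.02 * rng.random(N)  # constant external drive

v = rng.random(N)                    # random initial membrane potentials
spike_order = []                     # neuron indices in order of firing
for t in range(T):
    fired = v >= v_th
    if fired.any():
        spike_order.extend(np.flatnonzero(fired).tolist())
        v += W @ fired.astype(float)     # instantaneous recurrent excitation
        v -= g_inh * fired.sum()         # global inhibition to all neurons
        v[fired] = v_reset
    v += dt * (-v / tau + I_ext)         # leaky integration of external drive

# The paper proves that almost all trajectories of such networks converge,
# after a finite number of transient spikes, to a stable periodic firing
# sequence; inspecting the tail of spike_order is an informal way to look
# for that behavior in this toy (which need not match the paper's model).
print(spike_order[-40:])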


2. Fast computation with spikes in a recurrent neural network, Dezhe Z.
Jin and H. Sebastian Seung, Physical Review E, 65, 051922 (2002).

Abstract:

Neural networks with recurrent connections are sometimes regarded as too
slow at computation to serve as models of the brain. Here we analytically
study a counterexample, a network consisting of N integrate-and-fire
neurons with self-excitation, all-to-all inhibition, instantaneous
synaptic coupling, and constant external driving inputs. When the
inhibition and/or excitation are large enough, the network performs a
winner-take-all computation for all possible external inputs and initial
states of the network. The computation is done very quickly: As soon as
the winner spikes once, the computation is completed since no other
neurons will spike. For some initial states, the winner is the first
neuron to spike, and the computation is done at the first spike of the
network. In general, there are M potential winners, corresponding to the
top M external inputs. When the external inputs are close in magnitude, M
tends to be larger. If M > 1, the selection of the actual winner is strongly
influenced by the initial states. If a special relation between
the excitation and inhibition is satisfied, the network always selects the
neuron with the maximum external input as the winner.
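For concreteness, here is a small winner-take-all sketch in the spirit of this
abstract. The self-excitation strength alpha, inhibition strength beta, the
drive levels, and the Euler integration step are assumptions chosen to land in
a strong-inhibition regime; they are not the paper's parameters.

import numpy as np

# Toy winner-take-all sketch; all values below are illustrative assumptions.
rng = np.random.default_rng(1)
N, T, dt = 10, 5000, 0.1
tau, v_th, v_reset = 10.0, 1.0, 0.0
alpha = 0.9                          # self-excitation added when a neuron spikes
beta = 1.2                           # instantaneous inhibition sent to all neurons
I_ext = 0.11 + 0.02 * rng.random(N)  # constant external inputs

v = rng.random(N)                    # random initial state
spikers = set()
for t in range(T):
    for i in np.flatnonzero(v >= v_th):
        spikers.add(i)
        v -= beta                    # all-to-all inhibition (may drive v negative)
        v[i] = v_reset + alpha       # reset, then self-excitation
    v += dt * (-v / tau + I_ext)     # leaky integration of constant drive

# In this strong-inhibition regime, once a neuron has spiked its periodic
# spiking keeps every other neuron below threshold, so the set of spikers
# typically collapses to a single index, and that index need not be the
# neuron with the largest external input.
print("neurons that ever spiked:", sorted(spikers))
print("index of the largest external input:", int(np.argmax(I_ext)))

Running the sketch with different seeds or initial states shows the winning
neuron changing even though the external inputs stay fixed, consistent with
the abstract's remark that the initial state influences the selection when
M > 1.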


=============================================================================

Dezhe Z. Jin, Ph.D.
Postdoctoral Fellow
Seung Lab, Dept. of Brain and Cognitive Sciences, M.I.T.

=============================================================================
