Papers on computational analysis of dynamic synapses

Thomas Natschlaeger tnatschl at igi.tu-graz.ac.at
Thu Nov 23 09:00:13 EST 2000


Dear Connectionists,

The following two papers on computational analysis of dynamic synapses will be
presented at the NIPS 2000 conference. Comments are welcome.

Gzipped PostScript and PDF versions can be found at:

        http://www.igi.TUGraz.at/igi/tnatschl/publications.html

Sincerely,

  Thomas Natschlaeger

----------------------------------------------------------------------
FINDING THE KEY TO A SYNAPSE

T. Natschlaeger and W. Maass

URLs: http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-nips00.ps.gz
      http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-poster.ps.gz
      http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-poster.pdf

ABSTRACT:

Experimental data have shown that synapses are heterogeneous: different
synapses respond to the same spike train with different sequences of
postsynaptic response amplitudes. Neither the role of synaptic dynamics
itself nor the role of its heterogeneity for computations in neural
circuits is well understood.

In this article we present two computational methods that make it
feasible to compute, for a given synapse with known synaptic parameters,
the spike train that is optimally fitted to that synapse in a certain
sense. One of these methods is based on dynamic programming (similar to
techniques used in reinforcement learning), the other on sequential
quadratic programming. With the help of these methods one can compute,
for example, the temporal pattern of a spike train (with a given number
of spikes) that produces the largest sum of postsynaptic responses for a
specific synapse. Several other applications are also discussed.

To our surprise we find that most of these optimally fitted spike
trains match common firing patterns of specific types of neurons
discussed in the literature. Furthermore, optimally fitted spike trains
are rather specific to a certain synapse ("the key to this synapse") in
the sense that they elicit a substantially smaller postsynaptic
response on any other of the major types of synapses reported in the
literature. This observation provides a first glimpse at a possible
functional role of the specific combinations of synapse types and
neuron types that were recently reported in (Gupta, Wang, Markram,
Science, 2000).

Our computational analysis provides a platform for a better
understanding of the specific roles of the different parameters that
control synaptic dynamics, because with the help of the computational
techniques we have introduced one can now see directly how the temporal
structure of the optimal spike train for a synapse depends on the
individual synaptic parameters. We believe that this kind of inverse
analysis is essential for understanding the computational role of
neural circuits.

-------------------------------------------------------------------------

PROCESSING OF TIME SERIES BY NEURAL CIRCUITS WITH BIOLOGICALLY
REALISTIC SYNAPTIC DYNAMICS

T. Natschlaeger, W. Maass, E. D. Sontag, and A. M. Zador

URLs: http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-nips00.ps.gz
      http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-poster.ps.gz
      http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-poster.pdf

ABSTRACT:

Experimental data show that biological synapses behave quite
differently from the symbolic synapses in common artificial neural
network models. Biological synapses are dynamic: their "weight" changes
on a short time scale by several hundred percent, depending on the past
input to the synapse. In this article we explore the consequences that
this synaptic dynamics entails for the computational power and adaptive
capabilities of feedforward neural networks.
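As a rough illustration of this effect, the following sketch simulates
one common model of such dynamic synapses (the Markram-Tsodyks model
with release probability U, depression time constant D, and
facilitation time constant F; the parameter values below are
illustrative, not taken from the paper) and shows how the effective
weight of a facilitating synapse depends on the recent input rate:

```python
import math

def dynamic_weights(isi, n_spikes, U, D, F):
    """Effective synaptic weight (u * R) for each spike of a regular
    train with inter-spike interval isi, under the Markram-Tsodyks
    model of facilitation (u) and depression (R)."""
    u, R, weights = U, 1.0, []
    for k in range(n_spikes):
        if k > 0:
            # both updates use the pre-spike values of u and R
            u, R = (U + u * (1 - U) * math.exp(-isi / F),
                    1 + (R * (1 - u) - 1) * math.exp(-isi / D))
        weights.append(u * R)
    return weights

# Illustrative facilitating synapse: low release probability U, slow
# decay of facilitation (large F), fast recovery from depression.
fast = dynamic_weights(0.02, 5, U=0.1, D=0.1, F=1.0)  # 50 Hz input
slow = dynamic_weights(1.00, 5, U=0.1, D=0.1, F=1.0)  #  1 Hz input
```

At 50 Hz the effective weight of the fifth spike is more than double
that of the first, a change of over 100 percent caused purely by the
input history, while at 1 Hz the weight barely moves.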

Our analytical results show that the class of nonlinear filters that
can be approximated by neural networks with dynamic synapses, even with
just a single hidden layer of sigmoidal neurons, is remarkably rich. It
contains every time-invariant filter with fading memory, hence arguably
every filter that is potentially useful for a biological organism. This
result is robust with regard to various changes in the model of
synaptic dynamics. Furthermore, we show that simple gradient descent
suffices to approximate a given quadratic filter by a rather small
neural network with dynamic synapses.

Our computer simulations show that networks with dynamic synapses in
fact perform slightly better than previously proposed artificial neural
networks that were designed for efficient processing of temporal
signals without aiming at biological realism. To illustrate the
potential of our new architecture, we have tested dynamic networks on
tasks such as learning a randomly chosen quadratic filter, as well as
on the system identification task used in (Back and Tsoi, 1993).

We also address the question of which synaptic parameters are essential
for a network with dynamic synapses to be able to learn a particular
target filter. We found that neither plasticity of the synaptic
dynamics alone nor plasticity of the maximal amplitude alone yields
satisfactory results. However, a simple gradient descent learning
algorithm that tunes both types of parameters simultaneously yields
good approximation capabilities.
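A minimal sketch of this last comparison, scaled down to a single
synapse: we fit one dynamic synapse (again in the Markram-Tsodyks
parameterization, with all parameter values illustrative rather than
taken from the paper) to the response pattern of a target synapse, by
numerical gradient descent either on the maximal amplitude w alone or
on w and the dynamics parameter U jointly:

```python
import math

SPIKES = [0.05 * k for k in range(5)]  # regular 20 Hz test train
D, F = 0.3, 0.5                        # fixed time constants (illustrative)

def response(w, U):
    """Amplitude sequence of a dynamic synapse (Markram-Tsodyks model)."""
    u, R, amps = U, 1.0, []
    for k, t in enumerate(SPIKES):
        if k > 0:
            dt = t - SPIKES[k - 1]
            u, R = (U + u * (1 - U) * math.exp(-dt / F),
                    1 + (R * (1 - u) - 1) * math.exp(-dt / D))
        amps.append(w * u * R)
    return amps

def loss(w, U, target):
    """Mean squared error between the synapse's response and the target."""
    return sum((a - b) ** 2 for a, b in zip(response(w, U), target)) / len(target)

def fit(target, tune_U, steps=2000, lr=0.2, eps=1e-5):
    """Numerical gradient descent on w alone, or on w and U jointly."""
    w, U = 1.0, 0.2
    for _ in range(steps):
        gw = (loss(w + eps, U, target) - loss(w - eps, U, target)) / (2 * eps)
        w -= lr * gw
        if tune_U:
            gU = (loss(w, U + eps, target) - loss(w, U - eps, target)) / (2 * eps)
            U = min(max(U - lr * gU, 0.01), 0.99)  # keep U in (0, 1)
    return loss(w, U, target)

target = response(2.0, 0.6)            # the "biological" target synapse
err_w_only = fit(target, tune_U=False)
err_joint = fit(target, tune_U=True)
```

Mirroring the finding of the paper at this toy scale, adjusting the
amplitude alone leaves a large residual error because it cannot change
the temporal shape of the response, whereas tuning amplitude and
dynamics together drives the error close to zero.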



