Connectionists: New paper: STDP enables spiking neurons to detect hidden causes of their inputs.
Bernhard Nessler
nessler at igi.tu-graz.ac.at
Thu Nov 12 09:18:43 EST 2009
Dear colleagues,
I would like to draw your attention to a new paper describing a
surprising, theoretically founded connection between
spike-timing-dependent plasticity (STDP) and the high-level
mathematical concept of unsupervised learning via expectation
maximization (EM). On the basis of this principle, networks of
spiking neurons can achieve unsupervised learning results that were
previously unattainable.
The paper is available online at http://www.igi.tugraz.at/psfiles/191.pdf
B. Nessler, M. Pfeiffer, W. Maass.
STDP enables spiking neurons to detect hidden causes of their inputs.
In Advances in Neural Information Processing Systems 22
(Proc. of NIPS 2009). MIT Press, 2010.
Abstract:
The principles by which spiking neurons contribute to the astounding
computational power of generic cortical microcircuits, and how
spike-timing-dependent plasticity (STDP) of synaptic weights could
generate and maintain this computational function, are unknown. We show
here that STDP, in conjunction with a stochastic soft winner-take-all
(WTA) circuit, induces spiking neurons to generate through their
synaptic weights implicit internal models for subclasses (or “causes”)
of the high-dimensional spike patterns of hundreds of pre-synaptic
neurons. Hence these neurons will fire after learning whenever the
current input best matches their internal model. The resulting
computational function of soft WTA circuits, a common network motif of
cortical microcircuits, could therefore be a drastic dimensionality
reduction of information streams, together with the autonomous creation
of internal models for the probability distributions of their input
patterns. We show that the autonomous generation and maintenance of this
computational function can be explained on the basis of rigorous
mathematical principles. In particular, we show that STDP is able to
approximate a stochastic online Expectation-Maximization (EM) algorithm
for modeling the input data. A corresponding result is shown for Hebbian
learning in artificial neural networks.
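
To make the core idea of the abstract more concrete, below is a minimal
sketch (Python/NumPy) of an STDP-inspired local update in a stochastic
soft WTA layer that learns hidden causes of binary input patterns. The
rate-free, one-pattern-per-step formulation, the parameter names (eta,
n_causes, ...), and the toy data are my own illustrative assumptions,
not the paper's exact spiking implementation; please see the paper for
the precise learning rule and its EM interpretation.

# Illustrative sketch only, not the paper's exact implementation:
# a stochastic soft WTA layer of "cause" neurons learns a mixture
# model of binary input patterns from a purely local update rule.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_causes, eta = 20, 3, 0.05

# Toy data: three hidden "causes", each making a different block of
# input neurons likely to spike within the coding window.
prototypes = rng.random((n_causes, n_inputs)) < 0.15
prototypes[0, :7] = prototypes[1, 7:14] = prototypes[2, 14:] = True

w = rng.normal(-1.0, 0.1, size=(n_causes, n_inputs))  # synaptic log-weights
b = np.zeros(n_causes)                                # intrinsic excitabilities

for step in range(5000):
    cause = rng.integers(n_causes)
    # y[i] = 1 if input neuron i spiked within the current time window.
    y = (rng.random(n_inputs) < np.where(prototypes[cause], 0.9, 0.05)).astype(float)

    # Stochastic soft WTA: exactly one output neuron fires, chosen with
    # probability proportional to exp(membrane potential) (softmax sampling).
    u = b + w @ y
    p = np.exp(u - u.max())
    p /= p.sum()
    k = rng.choice(n_causes, p=p)

    # STDP-inspired local update, applied only to the neuron that fired:
    # potentiate synapses whose presynaptic neuron was recently active
    # (scaled by exp(-w)), depress all of its other synapses.
    w[k] += eta * (y * np.exp(-w[k]) - 1.0)
    # Excitabilities track how often each output neuron wins.
    b += eta * ((np.arange(n_causes) == k) * np.exp(-b) - 1.0)

# At a fixed point of these updates, exp(w[k, i]) approximates
# P(input i active | output k fires), so each output neuron ideally
# specializes on one hidden cause.
print(np.round(np.exp(w), 2))

The point this sketch is meant to mirror from the abstract is locality:
each synapse is updated only from its own presynaptic activity and the
postsynaptic firing event, yet exp(w) comes to approximate the
conditional input statistics of one hidden cause, which is the flavor of
E-step/M-step structure that the paper makes rigorous for STDP.
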
Best regards,
Bernhard Nessler
--
=========================================================
DI Bernhard Nessler
Institute for Theoretical Computer Science
Graz University of Technology
Inffeldgasse 16b, A-8010 Graz, Austria
--------------------------------------------------------
nessler at igi.tugraz.at
http://www.igi.tugraz.at/
Tel.: ++43 316 873-5823
Fax: ++43 316 873-5805
=========================================================