Connectionists: Fwd: [seminar.wwtns] World Wide VVTNS series: Wednesday, April 10 at 11am (EDT), Michael Buice, Allen Institute | Biologically motivated learning dynamics: parallel architectures and nonlinear Hebbian plasticity

David Hansel dhansel0 at gmail.com
Sun Apr 7 07:30:29 EDT 2024


[image: VVTNS.png]
https://www.wwtns.online - on twitter: @TheoreticalWide

You are cordially invited to the lecture given by

Michael Buice

Allen Institute
on the topic of

"Biologically motivated learning dynamics:
parallel architectures and nonlinear Hebbian plasticity"

The lecture will be held on Zoom on *April 10, 2024*, at *11:00 am EDT*.

To receive the zoom link: https://www.wwtns.online/register-page

*Abstract:* Learning in biological systems takes place in contexts and
with dynamics not often accounted for by simple models. I will describe the
learning dynamics of two model systems that incorporate either
architectural or dynamic constraints from biological observations. In the
first case, inspired by the observed mesoscopic structure of the mouse
brain as revealed by the Allen Mouse Brain Connectivity Atlas, as well as
multiple examples of parallel pathways in mammalian brains, I present a
mathematical analysis of learning dynamics in networks that have parallel
computational pathways driven by the same cost function. We use the
approximation of deep linear networks with large hidden layer sizes to show
that, as the depth of the parallel pathways increases, different features
of the training set (defined by the singular values of the input-output
correlation) will typically concentrate in one of the pathways. This result
is derived analytically and demonstrated in numerical simulations of
both linear and nonlinear networks. Thus, rather than sharing stimulus and
task features across multiple pathways, parallel network architectures
learn to produce sharply diversified representations with specialized and
specific pathways, a mechanism that may have important consequences for
codes in both biological and artificial systems. In the second case, I
discuss learning dynamics in a generalization of Hebbian rules and show
that these rules allow a neuron to learn tensor decompositions of
higher-order input correlations. Unlike the Oja rule and PCA, the
resulting learned representation is not unique but instead selects among
the tensor eigenvectors according to the initial conditions.
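As a rough illustration of the first result (my own sketch, not the speaker's code; the dimensions, learning rate, and initialization scale are all arbitrary choices), one can train two parallel depth-2 linear pathways on a shared squared-error cost and inspect how each pathway's end-to-end map contributes to the target's singular modes:

```python
# Hypothetical sketch: two parallel deep linear pathways sharing one cost.
# The target map A has two well-separated singular values (3 and 1); the
# combined network W2a@W1a + W2b@W1b is trained to match A from small
# random initializations (the regime used in deep linear network theory).
import numpy as np

rng = np.random.default_rng(1)
d, h, lr, steps = 2, 16, 0.02, 6000

A = np.diag([3.0, 1.0])               # target input-output map

def init_pathway():
    """Depth-2 linear pathway with small random weights."""
    return 0.01 * rng.normal(size=(h, d)), 0.01 * rng.normal(size=(d, h))

W1a, W2a = init_pathway()
W1b, W2b = init_pathway()

for _ in range(steps):
    E = A - (W2a @ W1a + W2b @ W1b)   # shared residual drives both pathways
    W2a += lr * E @ W1a.T
    W1a += lr * W2a.T @ E
    W2b += lr * E @ W1b.T
    W1b += lr * W2b.T @ E

loss = 0.5 * np.sum((A - (W2a @ W1a + W2b @ W1b)) ** 2)
# Diagonal of each pathway's end-to-end map = its share of each singular mode.
print(loss, np.diag(W2a @ W1a), np.diag(W2b @ W1b))
```

In this shallow two-pathway sketch, how each singular mode is split between pathways depends on the random initialization; the analysis described in the talk concerns what happens as the depth of the parallel pathways increases.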
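For the second part, here is a minimal sketch of the family of rules involved (again my own illustration, not the speaker's code). It uses p = 2, where the rule reduces to the classical Oja rule and learns the top principal component; the generalization discussed in the talk raises the power p to target higher-order input correlations and their tensor eigenvectors:

```python
# Hypothetical sketch of a nonlinear Hebbian rule: with output y = w.x,
# the update is dw = eta * (y**(p-1) * x - y**p * w).  For p = 2 this is
# Oja's rule, and w converges to the top eigenvector of the input
# covariance; for p > 2 the stable fixed points are tensor eigenvectors
# of the p-th order input correlation, selected by the initial condition.
import numpy as np

rng = np.random.default_rng(0)
d, p, eta, n = 5, 2, 0.01, 5000

# Inputs with one dominant direction u, so the target eigenvector is known.
u = np.zeros(d)
u[0] = 1.0
X = rng.normal(size=(n, d)) + 3.0 * rng.normal(size=(n, 1)) * u

w = rng.normal(size=d)
w /= np.linalg.norm(w)
for x in X:
    y = w @ x
    w += eta * (y ** (p - 1) * x - y ** p * w)

print(abs(w @ u))  # alignment with the dominant input direction
```

For p = 2 the learned direction is unique (up to sign); the non-uniqueness mentioned in the abstract arises only in the higher-order case.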

*About VVTNS: Created as the World Wide Theoretical Neuroscience Seminar
(WWTNS) in November 2020 and renamed in homage to Carl van Vreeswijk, in
memoriam (April 20, 2022), its aim is to be a platform for theoreticians to
exchange ideas. Speakers have the opportunity to talk about theoretical
aspects of their work that cannot be discussed in a setting where the
majority of the audience consists of experimentalists. The seminars, held on
Wednesdays at 11 am ET, are 45-50 minutes long, followed by a discussion.
The talks are recorded with the speaker's authorization and are available
to everybody on our YouTube channel.*


-- 
'Life is good ..' (Carl van Vreeswijk, 1962-2022)
---------------------------------------
David Hansel
Directeur de Recherche au CNRS
Co-Group leader
Cerebral Dynamics Plasticity and Learning lab., CNRS
45 rue des Saints Pères, 75270 Paris Cedex 06
Tel (cell): +33 607508403 - Fax: +33 1 49 27 90 62

