[CMU AI Seminar] May 10 at 12pm (Zoom) -- Albert Gu (Stanford) -- Efficiently Modeling Long Sequences with Structured State Spaces -- AI Seminar sponsored by Morgan Stanley

Asher Trockman ashert at cs.cmu.edu
Mon May 9 11:31:02 EDT 2022


Dear all,

We look forward to seeing you *tomorrow, this Tuesday (5/10)* from
*12:00-1:00 PM (U.S. Eastern time)* for the next talk of our *CMU AI seminar*,
sponsored by Morgan Stanley
<https://www.morganstanley.com/about-us/technology/>.

To learn more about the seminar series or see the future schedule, please
visit the seminar website <http://www.cs.cmu.edu/~aiseminar/>.

*Tomorrow* (5/10), *Albert Gu* (Stanford) will be giving a talk titled
*"Efficiently Modeling Long Sequences with Structured State Spaces"* to
share his work proposing the S4 model, which handles long-range
dependencies mathematically and empirically, and can be computed very
efficiently.

*Title*: Efficiently Modeling Long Sequences with Structured State Spaces

*Talk Abstract*: A central goal of sequence modeling is designing a single
principled model that can address sequence data across a range of
modalities and tasks, particularly on long-range dependencies.  Although
conventional models including RNNs, CNNs, and Transformers have specialized
variants for capturing long dependencies, they still struggle to scale to
very long sequences of 10000 or more steps.  This talk introduces the
Structured State Space sequence model (S4), a simple new model based on the
fundamental state space representation
$x'(t) = Ax(t) + Bu(t), y(t) = Cx(t) + Du(t)$.
S4 combines elegant properties of state space models with the
recent HiPPO theory of continuous-time memorization, resulting in a class
of structured models that handles long-range dependencies mathematically
and can be computed very efficiently. S4 achieves strong empirical results
across a diverse range of established benchmarks, particularly for
continuous signal data such as images, audio, and time series.
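
To make the state space formulation above concrete, here is a minimal,
illustrative NumPy sketch (not from the talk or the S4 codebase): it
discretizes the continuous-time system with a bilinear transform and
unrolls it as a linear recurrence. The helper names (discretize,
ssm_recurrence), the random parameters, and the step size dt are made up
for illustration; S4 itself additionally uses a structured
(HiPPO-initialized) state matrix and an equivalent convolutional view to
compute this efficiently, which this toy recurrence does not attempt.

import numpy as np

def discretize(A, B, dt):
    # Bilinear (Tustin) transform: (A, B) -> (A_bar, B_bar) for step size dt.
    n = A.shape[0]
    I = np.eye(n)
    inv = np.linalg.inv(I - (dt / 2.0) * A)
    return inv @ (I + (dt / 2.0) * A), inv @ (dt * B)

def ssm_recurrence(A, B, C, D, u, dt):
    # Unroll x_k = A_bar x_{k-1} + B_bar u_k, y_k = C x_k + D u_k over a 1-D input u.
    A_bar, B_bar = discretize(A, B, dt)
    x = np.zeros(A.shape[0])
    ys = []
    for u_k in u:
        x = A_bar @ x + B_bar.ravel() * u_k
        ys.append((C @ x).item() + D * u_k)
    return np.array(ys)

# Toy example with made-up parameters: a 4-dimensional state, a sine input.
rng = np.random.default_rng(0)
N = 4
A = rng.standard_normal((N, N)) - 2.0 * np.eye(N)  # shifted left for rough stability (illustrative only)
B = rng.standard_normal((N, 1))
C = rng.standard_normal((1, N))
D = 0.0
u = np.sin(np.linspace(0.0, 2.0 * np.pi, 100))
y = ssm_recurrence(A, B, C, D, u, dt=0.1)
print(y.shape)  # (100,)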

*Speaker Bio*: Albert Gu is a final-year Ph.D. candidate in the Department
of Computer Science at Stanford University, advised by Christopher Ré. His
research broadly studies structured representations for advancing the
capabilities of machine learning and deep learning models, focusing on
structured linear algebra, non-Euclidean representations, and the theory of
sequence models. Previously, he completed a B.S. in Mathematics and
Computer Science at Carnegie Mellon University.

*Zoom Link*:
https://cmu.zoom.us/j/99510233317?pwd=ZGx4aExNZ1FNaGY4SHI3Qlh0YjNWUT09

Thanks,
Asher Trockman