[CMU AI Seminar] April 5 at 12pm (Zoom) -- Uri Alon (CMU) -- Neuro-Symbolic Language Modeling with Automaton-augmented Retrieval -- AI Seminar sponsored by Morgan Stanley

Asher Trockman ashert at cs.cmu.edu
Tue Apr 5 12:01:48 EDT 2022


Hi all,

The seminar today by Uri Alon on neuro-symbolic language modeling is
happening right now!

In case you are interested:
https://cmu.zoom.us/j/99510233317?pwd=ZGx4aExNZ1FNaGY4SHI3Qlh0YjNWUT09

Thanks,
Asher

On Fri, Apr 1, 2022 at 4:57 PM Asher Trockman <ashert at cs.cmu.edu> wrote:

> Dear all,
>
> We look forward to seeing you *next Tuesday (4/5)* from *12:00-1:00 PM
> (U.S. Eastern time)* for the next talk of our *CMU AI seminar*, sponsored
> by Morgan Stanley <https://www.morganstanley.com/about-us/technology/>.
>
> To learn more about the seminar series or see the future schedule, please
> visit the seminar website <http://www.cs.cmu.edu/~aiseminar/>.
>
> On 4/5, *Uri Alon *(CMU) will be giving a talk titled *"Neuro-Symbolic
> Language Modeling with Automaton-augmented Retrieval"* to share his work
> on improving retrieval-based language modeling.
>
> *Title*: Neuro-Symbolic Language Modeling with Automaton-augmented
> Retrieval
>
> *Talk Abstract*: Retrieval-based language models (R-LMs) model the
> probability of natural language text by combining a standard language model
> (LM) with examples retrieved from an external datastore at test time. While
> effective, a major bottleneck of using these models in practice is the
> computationally costly datastore search, which can be performed as
> frequently as every time step. In this talk, I will present RetoMaton - a
> retrieval automaton - which approximates the datastore search based on (1)
> clustering of entries into "states", and (2) state transitions from
> previous entries. This effectively results in a weighted finite automaton
> built on top of the datastore, instead of representing the datastore as a
> flat list. The creation of the automaton is unsupervised, and a RetoMaton
> can be constructed from any text collection: either the original training
> corpus or a corpus from another domain. Traversing this automaton at
> inference time, in parallel to the LM inference, reduces the LM's
> perplexity, or alternatively saves up to 83% of the nearest neighbor
> searches over kNN-LM (Khandelwal et al., 2020), without hurting perplexity.
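>
> For readers unfamiliar with the kNN-LM baseline the abstract refers to,
> here is a minimal sketch of its core interpolation step: the base LM's
> next-token distribution is mixed with a distribution derived from retrieved
> neighbors. All names, the toy data, and the mixing weight are illustrative,
> not the authors' implementation:

```python
def interpolate(p_lm, p_knn, lam=0.25):
    """Mix the base LM distribution with the retrieved kNN distribution.

    p_lm and p_knn map tokens to probabilities; lam is the kNN weight
    (a hypothetical value here, chosen per dataset in practice).
    """
    vocab = set(p_lm) | set(p_knn)
    return {w: lam * p_knn.get(w, 0.0) + (1 - lam) * p_lm.get(w, 0.0)
            for w in vocab}

# Toy next-token distributions for a two-word vocabulary.
p_lm = {"cat": 0.6, "dog": 0.4}
p_knn = {"cat": 0.9, "dog": 0.1}

p = interpolate(p_lm, p_knn, lam=0.25)
print(round(p["cat"], 4))  # 0.675
```

> RetoMaton's contribution, per the abstract, is making the retrieval that
> produces p_knn cheaper by organizing the datastore as a weighted finite
> automaton rather than searching a flat list at every step.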
>
> *Speaker Bio*: Uri Alon is a Postdoctoral Researcher at LTI, working with
> Prof. Graham Neubig on NLP and learning from source code. Previously, he
> obtained his PhD at the Technion (Israel), where he worked on modeling
> programming languages and graphs. Currently, he is also interested in the
> synergy of neural models with symbolic components such as retrieval,
> programs, and automata. His personal website is at https://urialon.ml.
> Feel free to reach out with any questions or comments about the talk.
>
> *Zoom Link*:
> https://cmu.zoom.us/j/99510233317?pwd=ZGx4aExNZ1FNaGY4SHI3Qlh0YjNWUT09
>
> Thanks,
> Asher Trockman
>

