[CMU AI Seminar] Special! February 29 at 12pm (GHC 6115 & Zoom) -- Simon Du (U. Washington) -- How Over-Parameterization Slows Down Gradient Descent -- AI Seminar sponsored by SambaNova Systems

Asher Trockman ashert at cs.cmu.edu
Tue Feb 27 20:24:50 EST 2024


*Want to meet with Simon? Please send me an email to get the signup sheet.*

Dear all,

We look forward to seeing you *this Thursday (2/29)* from *12:00-1:00 PM
(U.S. Eastern time)* for the next talk of this semester's *CMU AI Seminar*,
sponsored by SambaNova Systems <https://sambanova.ai/>. The seminar will be
held in GHC 6115 *with pizza provided* and will be streamed on Zoom.

To learn more about the seminar series or to see the future schedule,
please visit the seminar website <http://www.cs.cmu.edu/~aiseminar/>.

On this Thursday (2/29), *Simon Du* (U. Washington) will be giving a talk
titled *"How Over-Parameterization Slows Down Gradient Descent"*.

*Title*: How Over-Parameterization Slows Down Gradient Descent

*Talk Abstract*: We investigate how over-parameterization impacts the
convergence behavior of gradient descent through two examples. In the
context of learning a single ReLU neuron, we prove that the convergence
rate shifts from $\exp(-T)$ in the exact-parameterization scenario to an
exponentially slower $1/T^3$ rate in the over-parameterized setting. In the
canonical matrix sensing problem, specifically for symmetric matrix sensing
with symmetric parameterization, the convergence rate transitions from
$\exp(-T)$ in the exact-parameterization case to $1/T^2$ in the
over-parameterized case. Interestingly, employing an asymmetric
parameterization restores the $\exp(-T)$ rate, though this rate also depends
on the initialization scaling. Lastly, we demonstrate that incorporating an
additional step within a single gradient descent iteration can achieve a
convergence rate independent of the initialization scaling.
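
(As a rough illustration of the matrix-factorization setting above -- not from
the speaker's materials -- here is a toy gradient-descent sketch comparing an
exactly parameterized factor with an over-parameterized one. It uses a fully
observed symmetric target rather than the sensing operators in the talk, and
all names and constants are illustrative choices.)

import numpy as np

# Toy sketch: GD on f(X) = 0.25 * ||X X^T - M*||_F^2 with a rank-r target M*,
# comparing exact parameterization (k = r) against over-parameterization (k > r).
rng = np.random.default_rng(0)
n, r = 20, 2
U = rng.standard_normal((n, r))
M_star = U @ U.T
M_star /= np.linalg.norm(M_star, 2)            # spectral norm 1 for a stable step size

def run_gd(k, steps=4000, lr=0.1, init_scale=1e-3):
    """Gradient descent from a small random init; returns the loss trajectory."""
    X = init_scale * rng.standard_normal((n, k))
    losses = []
    for _ in range(steps):
        R = X @ X.T - M_star                   # residual
        losses.append(0.25 * np.sum(R * R))
        X = X - lr * (R @ X)                   # grad of 0.25 ||X X^T - M*||_F^2 is (X X^T - M*) X
    return np.array(losses)

loss_exact = run_gd(k=r)       # exact parameterization
loss_over = run_gd(k=r + 3)    # over-parameterized
print("exact:", loss_exact[::1000])
print("over: ", loss_over[::1000])

(Plotting the two loss curves on a log scale should roughly show the contrast
the talk quantifies: a fast, geometric decay in the exact case versus a much
slower, polynomial tail in the over-parameterized case.)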

*Speaker Bio:* Simon S. Du is an assistant professor in the Paul G. Allen
School of Computer Science & Engineering at the University of Washington.
His research interests are broadly in machine learning, including deep
learning, representation learning, and reinforcement learning. Prior to
starting as faculty, he was a postdoc at the Institute for Advanced Study
in Princeton. He completed his Ph.D. in Machine Learning at Carnegie Mellon
University. Simon's research has been recognized by a Samsung AI Researcher
of the Year Award, an Intel Rising Star Faculty Award, an NSF CAREER Award,
and a Distinguished Dissertation Award honorable mention from CMU, among others.

*In person: *GHC 6115
*Zoom Link*:
https://cmu.zoom.us/j/99510233317?pwd=ZGx4aExNZ1FNaGY4SHI3Qlh0YjNWUT09

Thanks,
Asher Trockman