[AI Seminar] ai-seminar-announce Digest, Vol 79, Issue 3

Adams Wei Yu weiyu at cs.cmu.edu
Mon Dec 11 07:17:46 EST 2017


A gentle reminder that the talk will take place tomorrow (Tuesday) at noon.
This is the last AI seminar of 2017.

On Sun, Dec 10, 2017 at 9:00 AM, <ai-seminar-announce-request at cs.cmu.edu>
wrote:

> Send ai-seminar-announce mailing list submissions to
>         ai-seminar-announce at cs.cmu.edu
>
> To subscribe or unsubscribe via the World Wide Web, visit
>         https://mailman.srv.cs.cmu.edu/mailman/listinfo/ai-seminar-announce
> or, via email, send a message with subject or body 'help' to
>         ai-seminar-announce-request at cs.cmu.edu
>
> You can reach the person managing the list at
>         ai-seminar-announce-owner at cs.cmu.edu
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of ai-seminar-announce digest..."
>
>
> Today's Topics:
>
>    1.  AI Seminar sponsored by Apple -- Veeranjaneyulu Sadhanala --
>       Dec 12 (Adams Wei Yu)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Sun, 10 Dec 2017 00:06:40 -0800
> From: Adams Wei Yu <weiyu at cs.cmu.edu>
> To: ai-seminar-announce at cs.cmu.edu
> Cc: Veeranjaneyulu Sadhanala <veeranjaneyulus at gmail.com>
> Subject: [AI Seminar] AI Seminar sponsored by Apple -- Veeranjaneyulu
>         Sadhanala -- Dec 12
> Message-ID:
>         <CABzq7epyXB7yEg11M5RFnvt5PQtuhfCdopduPKMptYz620jeJA at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> Dear faculty and students,
>
> We look forward to seeing you next Tuesday, Dec 12, at noon in NSH 3305 for
> the AI Seminar sponsored by Apple. To learn more about the seminar series,
> please visit the AI Seminar webpage <http://www.cs.cmu.edu/~aiseminar/>.
>
> On Tuesday, Veeranjaneyulu Sadhanala <http://www.cs.cmu.edu/~vsadhana/>
> will give the following talk:
>
> Title: Escaping saddle points in neural network training and other
> non-convex optimization problems
>
> Abstract:
>
> In non-convex optimization problems, first-order methods can get stuck at
> saddle points, which are not even local minima. The generalization error at
> saddle points is typically large, so it is important to move away from them.
> We discuss recently developed algorithms for escaping saddle points. In
> particular, we discuss gradient descent perturbed with additive isotropic
> noise and Newton's method with cubic regularization. Under some conditions
> on the structure of the objective function, they converge to
> \epsilon-second-order stationary points (informally, approximate local
> minima) in O(polylog(d) / \epsilon^2) time and O(1 / \epsilon^1.5)
> iterations, respectively.
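The perturbation idea in the abstract can be sketched as follows. This is a minimal, illustrative implementation, not the speaker's actual algorithm: the step size, perturbation radius, gradient threshold, and toy objective are all assumptions, and the published methods use more careful perturbation schedules and termination conditions.

```python
import numpy as np

def perturbed_gd(grad, x0, eta=0.01, r=0.1, g_thresh=1e-6,
                 n_iters=5000, seed=0):
    """Gradient descent that adds one isotropic perturbation when the
    gradient is small (a candidate saddle point), then descends again.
    A minimal sketch of the escape idea, not the full algorithm."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    perturbed = False
    for _ in range(n_iters):
        g = grad(x)
        if np.linalg.norm(g) < g_thresh:
            if perturbed:
                break  # gradient is small again after perturbing: accept x
            # isotropic noise on a sphere of radius r around x
            noise = rng.standard_normal(x.shape)
            x = x + r * noise / np.linalg.norm(noise)
            perturbed = True
        else:
            x = x - eta * g
    return x

# Toy objective f(x, y) = x^4 - 2x^2 + y^2: strict saddle at the origin,
# minima at (+1, 0) and (-1, 0). Plain gradient descent started at the
# origin stays put forever; the perturbed variant escapes to a minimum.
grad_f = lambda v: np.array([4 * v[0]**3 - 4 * v[0], 2 * v[1]])
x_star = perturbed_gd(grad_f, [0.0, 0.0])
```

Started exactly at the saddle, the gradient is zero, so the isotropic perturbation is what makes progress possible at all; after it, ordinary gradient descent converges to one of the two minima.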
>
> ------------------------------
>
> ------------------------------
>
> End of ai-seminar-announce Digest, Vol 79, Issue 3
> **************************************************
>