From ngisolfi at cs.cmu.edu Thu Jan 11 16:02:27 2018
From: ngisolfi at cs.cmu.edu (Nick Gisolfi)
Date: Thu, 11 Jan 2018 16:02:27 -0500
Subject: [hackauton] Change the Dates: April 6-8
Message-ID: 

Hi Everyone,

To put our strongest foot forward, the hackauton is now being planned for April 6-8. This gives us time to organize a big event. Please keep this in mind when making plans for that weekend. We hope everyone will participate.

- Nick

From ngisolfi at cs.cmu.edu Thu Jan 11 17:30:08 2018
From: ngisolfi at cs.cmu.edu (Nick Gisolfi)
Date: Thu, 11 Jan 2018 17:30:08 -0500
Subject: Website Upgrade
Message-ID: 

Hi Everyone,

autonlab.org is scheduled for an upgrade that will include more comprehensive information about past and present projects and people at the lab.

Here are links to two surveys everyone should complete. If you complete them before the upgrade, we will enter all the information for you.

1) People (fill out once, for your personal page on the site)
https://goo.gl/forms/zfg53lAGpza9Ivsy2

2) Research Projects (fill out as many times as you like, for your past/present projects)
https://goo.gl/forms/50ZMizwtZ74ypDSc2

One of the short-term goals of the website upgrade is to attract donations for the hackauton, so please complete these surveys to help us build a strong online presence. If you have any questions, or if you think something is missing from the surveys (especially the research projects form), please contact me directly.

- Nick
#hackauton @ auton.slack.com
From awd at cs.cmu.edu Thu Jan 11 18:06:43 2018
From: awd at cs.cmu.edu (Artur Dubrawski)
Date: Thu, 11 Jan 2018 18:06:43 -0500
Subject: Fwd: Reminder - Thesis Proposal - 1/12/18 - Manzil Zaheer - Representation Learning @ Scale
In-Reply-To: <4da18d04-286c-e76d-6363-67f045c9a567@cs.cmu.edu>
References: <4da18d04-286c-e76d-6363-67f045c9a567@cs.cmu.edu>
Message-ID: 

If you're free, please attend Manzil's presentation.

-------- Forwarded Message --------
Subject: Reminder - Thesis Proposal - 1/12/18 - Manzil Zaheer - Representation Learning @ Scale
Date: Thu, 11 Jan 2018 17:09:00 -0500
From: Diane Stidle 
To: ml-seminar at cs.cmu.edu, Alex Smola , mccallum at cs.umass.edu

Thesis Proposal

Date: January 12, 2018
Time: 3:00 PM
Place: 8102 GHC
Speaker: Manzil Zaheer

Title: Representation Learning @ Scale

Abstract:
Machine learning techniques are reaching or exceeding human-level performance in tasks like image classification, translation, and text-to-speech. The success of these algorithms has been attributed to highly versatile representations learned from data using deep networks or intricately designed Bayesian models. Representation learning has also provided hints in neuroscience, e.g. for understanding how humans might categorize objects. Despite these successes, many open questions remain.

Data come in all shapes and sizes: not just as images or text, but also as point clouds, sets, graphs, compressed data, or even heterogeneous mixtures of these types. In this thesis, we want to develop representation learning algorithms for such unconventional data types by leveraging their structure and establishing new mathematical properties. Representations learned in this fashion were applied on diverse domains and found to be competitive with task-specific state-of-the-art methods.

Once we have the representations, in many applications their interpretability is as crucial as their accuracy.
Deep models often yield better accuracy, but they require a large number of parameters, often notwithstanding the simplicity of the underlying data, rendering them uninterpretable, which is highly undesirable in tasks like user modeling. On the other hand, Bayesian models produce sparse, discrete representations that are easily amenable to human interpretation. In this thesis, we want to explore methods capable of learning mixed representations that retain the best of both worlds. Our experimental evaluations show that the proposed techniques compare favorably with several state-of-the-art baselines.

Finally, one would want such interpretable representations to be inferred from large-scale data; however, there is often a mismatch between our computational resources and the statistical models. In this thesis, we want to bridge this gap with solutions that combine modern computational techniques and data structures on one side with modified statistical inference algorithms on the other. We introduce new ways to parallelize, reduce look-ups, handle variable state-space sizes, and escape saddle points. On latent variable models, such as latent Dirichlet allocation (LDA), we find significant gains in performance.

To summarize, in this thesis we want to explore three major aspects of representation learning --- diversity: being able to handle different types of data; interpretability: being accessible to and understandable by humans; and scalability: being able to process massive datasets in a reasonable time and budget.

Thesis Committee:
    Barnabas Poczos, Co-Chair
    Ruslan Salakhutdinov, Co-Chair
    Alexander J Smola (Amazon)
    Andrew McCallum (UMass Amherst)

Link to proposal document: http://manzil.ml/proposal.pdf

-- 
Diane Stidle
Graduate Programs Manager
Machine Learning Department
Carnegie Mellon University
diane at cs.cmu.edu
412-268-1299