From krw at andrew.cmu.edu Wed Dec 5 10:33:57 2007
From: krw at andrew.cmu.edu (Karen Widmaier)
Date: Wed, 5 Dec 2007 10:33:57 -0500
Subject: [Research] lab meeting on Thursday Dec 6th at noon
In-Reply-To: <474F71E6.407@cs.cmu.edu>
References: <474F71E6.407@cs.cmu.edu>
Message-ID: <00b101c83754$40699160$b7b10280@adm.ri.cmu.edu>

Hello,

Tomorrow's meeting will be held in Room 2507 at noon. Please send a reply to this email if you are able to attend.

Thanks,
Karen

Karen Widmaier
Robotics Institute
NSH 3128
x7551

-----Original Message-----
From: research-bounces at autonlab.org [mailto:research-bounces at autonlab.org] On Behalf Of Artur Dubrawski
Sent: Thursday, November 29, 2007 9:14 PM
To: research at autonlab.org
Subject: [Research] lab meeting on Thursday Dec 6th at noon

Hello,

In some countries, December 6th is celebrated as St. Nicholas' Day. St. Nick is a useful fellow, as he brings each child a little something on that day (it's just a teaser; they get much, much more for Christmas). We are in luck this year, as our little something will be an enlightening presentation by Brent titled "How far is the nearest coffee shop?". Does it sound as provocative to you as it does to me?

Hope to see you all on Thursday!
Artur

PS Food will be provided. The time is noon. The place is probably NSH 1507. Karen (BTW, have you all met her already? If you have not, please stop by NSH 3128 to introduce yourself when you have a chance) will let us know if we'd have to go someplace else.

_______________________________________________
Research mailing list
Research at autonlab.org
https://www.autonlab.org/mailman/listinfo/research

From awd at cs.cmu.edu Wed Dec 5 14:27:07 2007
From: awd at cs.cmu.edu (Artur Dubrawski)
Date: Wed, 05 Dec 2007 14:27:07 -0500
Subject: [Research] [Fwd: Thesis Defense - Brent Bryan - 12/12/07]
Message-ID: <1196882827.333.35.camel@localhost>

The name of this speaker sounds strikingly familiar... Please mark your calendars.
Artur

-------------- next part --------------
An embedded message was scrubbed...
From: Diane Stidle
Subject: Thesis Defense - Brent Bryan - 12/12/07
Date: Wed, 05 Dec 2007 14:22:03 -0500
Size: 6506
URL:

From krw at andrew.cmu.edu Wed Dec 5 15:10:00 2007
From: krw at andrew.cmu.edu (Karen Widmaier)
Date: Wed, 5 Dec 2007 15:10:00 -0500
Subject: [Research] Room change for lab meeting on Thursday Dec 6th at noon
References: <474F71E6.407@cs.cmu.edu>
Message-ID: <00ae01c8377a$d0ba30a0$b7b10280@adm.ri.cmu.edu>

Hello,

Room change: tomorrow's meeting will be held in Room 4201.

Karen

Karen Widmaier
Robotics Institute
NSH 3128
x7551

-----Original Message-----
From: Karen Widmaier [mailto:krw at andrew.cmu.edu]
Sent: Wednesday, December 05, 2007 10:34 AM
To: 'research at autonlab.org'
Subject: RE: [Research] lab meeting on Thursday Dec 6th at noon

Hello,

Tomorrow's meeting will be held in Room 2507 at noon. Please send a reply to this email if you are able to attend.

Thanks,
Karen

Karen Widmaier
Robotics Institute
NSH 3128
x7551

-----Original Message-----
From: research-bounces at autonlab.org [mailto:research-bounces at autonlab.org] On Behalf Of Artur Dubrawski
Sent: Thursday, November 29, 2007 9:14 PM
To: research at autonlab.org
Subject: [Research] lab meeting on Thursday Dec 6th at noon

Hello,

In some countries, December 6th is celebrated as St. Nicholas' Day. St. Nick is a useful fellow, as he brings each child a little something on that day (it's just a teaser; they get much, much more for Christmas). We are in luck this year, as our little something will be an enlightening presentation by Brent titled "How far is the nearest coffee shop?". Does it sound as provocative to you as it does to me?

Hope to see you all on Thursday!
Artur

PS Food will be provided. The time is noon. The place is probably NSH 1507. Karen (BTW, have you all met her already?
If you have not, please stop by NSH 3128 to introduce yourself when you have a chance) will let us know if we'd have to go someplace else.

_______________________________________________
Research mailing list
Research at autonlab.org
https://www.autonlab.org/mailman/listinfo/research

From schneide at cs.cmu.edu Tue Dec 11 18:31:37 2007
From: schneide at cs.cmu.edu (Jeff Schneider)
Date: Tue, 11 Dec 2007 18:31:37 -0500
Subject: [Research] [Fwd: Reminder - Thesis Defense - Brent Bryan - 12/12/07]
Message-ID: <475F1DD9.9010808@cs.cmu.edu>

Please remember to come to Brent's thesis defense tomorrow morning at 9am and see another Auton Lab student get his PhD!

Jeff.

-------- Original Message --------
Thesis Defense, PhD Candidate: Brent Bryan
Date: 12/12/07
Time: 9:00AM
Place: 3305 Newell-Simon Hall
Title: Actively Learning Specific Function Properties with Applications to Statistical Inference

Abstract:
Active learning techniques have previously been shown to be extremely effective for learning a target function over an entire parameter space based on a limited set of observations. However, in many cases, only a specific property of the target function needs to be learned. For instance, when discovering the boundary of a region (such as the locations where the wireless network strength is above some operable level), we are interested in learning only the level-set of the target function. While techniques that learn the entire target function over the parameter space can be used to detect specific properties of the target function (e.g. level-sets), methods that learn only the required properties can be significantly more efficient, especially as the dimensionality of the parameter space increases. These active learning algorithms have a natural application in many statistical inference techniques.
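The level-set idea in the abstract can be sketched in a few lines: rather than learning the whole target function, an active learner scores candidate query points by how close they appear to be to the threshold, so samples concentrate on the boundary. The sketch below is purely illustrative, not Brent's algorithm: the path-loss model `signal_strength`, the threshold, and the score that queries the true function directly are all invented for this example (a real method would rank candidates by a model's posterior, e.g. a straddle-style score on a Gaussian process, rather than by the unknown function itself).

```python
import math
import random

# Hypothetical wireless-strength field (dBm) over a 10 x 10 area, with a
# single access point at (5, 5) and a simple log-distance path-loss model.
def signal_strength(x, y):
    d = math.hypot(x - 5.0, y - 5.0)            # distance to the access point
    return -30.0 - 20.0 * math.log10(d + 1.0)   # strength decays with distance

THRESHOLD = -45.0  # "operable" level whose boundary (level-set) we want to map

# Straddle-style acquisition score: points whose value sits closest to the
# threshold are the most informative about the boundary, so they score highest.
def acquisition(point):
    return -abs(signal_strength(*point) - THRESHOLD)

random.seed(0)
candidates = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(500)]

# The active learner would evaluate these 5 boundary-straddling points next,
# instead of spending samples on the interior of the region.
queries = sorted(candidates, key=acquisition, reverse=True)[:5]
```

Every selected query lands within a fraction of a dB of the threshold, which is why such schemes need far fewer samples than a grid over the whole area, and why the savings grow with the dimensionality of the parameter space.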
For example, given a set of data and a physical model of the data, which is a function of several variables, a scientist is often interested in determining the ranges of the variables which are statistically supported by the data. We show that many frequentist statistical inference techniques can be reduced to a level-set detection problem, or a similar search for a property of the target function, and hence benefit from active learning algorithms which target specific properties. Using these active learning algorithms significantly decreases the number of experiments required to accurately detect the boundaries of the desired 1 - \alpha confidence regions. Moreover, since computing the model of the data given the input parameters may be expensive (either computationally or monetarily), such algorithms can facilitate analyses that were previously infeasible.

We demonstrate the use of several statistical inference techniques combined with active learning algorithms on several cosmological data sets. The data sets vary in the dimensionality of the input parameters from two to eight. We show that naive algorithms, such as random sampling or grid based methods, are computationally infeasible for the higher dimensional data sets. However, our active learning techniques can efficiently detect the desired 1 - \alpha confidence regions. Moreover, the use of frequentist inference techniques allows us to easily perform additional inquiries, such as hypothetical restrictions on the parameters and joint analyses of all the cosmological data sets, with only a small number of additional experiments.

More information can be found at: http://gs3636.sp.cs.cmu.edu/thesis/main.pdf

Thesis Committee:
Jeff Schneider (Chair)
Christopher Genovese
Christopher J Miller (NOAO/CTIO)
Andrew Moore (Google)
Robert C. Nichol (Univ. Portsmouth)
Larry Wasserman

--
*******************************************************************
Diane Stidle
Business & Graduate Programs Manager
Machine Learning Department
School of Computer Science
4612 Wean Hall
Carnegie Mellon University
5000 Forbes Avenue
Pittsburgh, PA 15213-3891
Phone: 412-268-1299
Fax: 412-268-3431
Email: diane at cs.cmu.edu
URL: http://www.ml.cmu.edu

From sajid at cmu.edu Thu Dec 13 21:26:51 2007
From: sajid at cmu.edu (Sajid M. Siddiqi)
Date: Thu, 13 Dec 2007 21:26:51 -0500 (EST)
Subject: [Research] Thesis proposal tomorrow
Message-ID: <4609.128.2.182.68.1197599211.squirrel@128.2.182.68>

Dear Autonites,

My thesis proposal is tomorrow (Friday) at 11:00 in Wean 4623, and you are all cordially invited to attend :) I am told that refreshments will be served.

Sajid

----------------------------------------------------------------------------
Latent Variable and Predictive Models of Dynamical Systems

Sajid Siddiqi
Robotics Institute
Carnegie Mellon University

Place and Time: WEH 4623, 11:00 AM

Abstract:
We propose to investigate new models and algorithms for inference, structure and parameter learning that extend our capabilities of modeling uncontrolled discrete-time dynamical systems. Our work is grounded in existing generative models for dynamical systems that are based on latent variable representations. Hidden Markov Models (HMMs) and Linear Dynamical Systems (LDSs) are popular choices for modeling dynamical systems because of their balance of simplicity and expressive power, and because of the existence of efficient inference and learning algorithms for these models. Recently proposed predictive models of dynamical systems, such as linear Predictive State Representations (PSRs) and Predictive Linear Gaussians (PLGs), have been shown to be as powerful as (and often more compact than) HMMs and LDSs.
Instead of modeling state by a latent variable, however, predictive models represent the state of the dynamical system by a set of statistics defined on future observable events. This dependence on observable quantities and avoidance of latent variables makes it easier to learn consistent parameter settings and avoid local minima in predictive models, though they have other problems, such as a paucity of well-developed learning algorithms. However, one interesting class of algorithms for learning models such as LDSs and PSRs is based on factoring matrices containing statistics about observations using techniques such as the singular value decomposition (SVD). This class of algorithms is especially popular in the control theory literature, in the area of subspace identification.

Most current models are restricted to either discrete, Gaussian, or mixture-of-Gaussian observation distributions. Recently, Wingate and Singh (2007) proposed a predictive model that aims to generalize PLGs to exponential family distributions. This allows us to exploit structure in the observations using exponential family graphical models. It also exposes us to the problems of intractable inference, structure and parameter learning inherent in conventional algorithms for graphical models, necessitating the use of approximate inference techniques. Our goal is to formulate models and algorithms that unify disparate elements of this set of tools. The ultimate aim of this thesis is to devise predictive models that unify HMM-style and LDS-style models in a way that captures the advantages of both and generalizes them to exponential families, and to investigate efficient and stable structure and parameter learning algorithms for these models based on matrix decomposition techniques.
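The SVD-based subspace-identification idea mentioned in the abstract can be sketched concretely: for a noiseless linear dynamical system, a Hankel matrix of stacked observation windows has column space equal to the range of the extended observability matrix, so a single SVD reveals the latent state dimension, and the shift relation between rows recovers the eigenvalues of the dynamics matrix without EM-style local search. The system matrices `A` and `C` and all values below are invented for illustration; this is a minimal sketch of the idea, not any specific algorithm from the proposal.

```python
import numpy as np

# Hypothetical 2-state linear dynamical system with a scalar observation.
A = np.array([[0.9, 0.2], [-0.2, 0.9]])   # stable rotation-plus-decay dynamics
C = np.array([[1.0, 0.0]])                # observation map
x = np.array([1.0, 0.0])

ys = []
for _ in range(200):
    ys.append((C @ x)[0])                 # record y_t = C x_t (noiseless)
    x = A @ x

# Stack length-k windows of the output into a Hankel matrix and factor it.
k = 10
H = np.column_stack([ys[i:i + k] for i in range(len(ys) - k)])
U, s, Vt = np.linalg.svd(H, full_matrices=False)

# Only as many singular values are significant as there are latent states.
state_dim = int(np.sum(s > 1e-8 * s[0]))

# Shift relation: the top singular vectors span the extended observability
# matrix, so a least-squares shift recovers A up to a similarity transform,
# and hence its eigenvalues exactly.
O = U[:, :state_dim]
A_hat = np.linalg.pinv(O[:-1]) @ O[1:]
eigvals = np.linalg.eigvals(A_hat)
```

Here `state_dim` comes out as 2 and `eigvals` matches the eigenvalues of `A` (0.9 +/- 0.2i) up to numerical precision, illustrating why such matrix-decomposition learners yield consistent parameter estimates where iterative likelihood maximization can get stuck in local minima.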
Further Details

Thesis Committee:
* Geoffrey Gordon, Chair
* Andrew Moore
* Jeff Schneider
* Zoubin Ghahramani, University of Cambridge
* David Wingate, University of Michigan

From awd at cs.cmu.edu Fri Dec 14 17:50:01 2007
From: awd at cs.cmu.edu (Artur Dubrawski)
Date: Fri, 14 Dec 2007 17:50:01 -0500
Subject: [Research] recount of NRDM-related papers
Message-ID: <47630899.7000102@cs.cmu.edu>

Hello,

Would you please let me know if you authored or co-authored a paper which refers to or uses the over-the-counter data obtained through the NRDM repository of the RODS lab (which probably means any OTC drug data you could find through the Lab).

Thanks,
Artur