Fwd: Reminder - Thesis Defense - 8/3/15 - Ina Fiterau - Discovering Compact and Informative Structures through Data Partitioning

Artur Dubrawski awd at cs.cmu.edu
Sun Aug 2 20:26:10 EDT 2015


Dear Autonians,

If you are available, please come and cheer Ina on her path to completion.

Her thesis defense will take place this Monday at 10am in the Gates 
Hillman Center, room 6115.

See you there!
Artur


-------- Forwarded Message --------
Subject: 	Reminder - Thesis Defense - 8/3/15 - Ina Fiterau - Discovering 
Compact and Informative Structures through Data Partitioning
Date: 	Sun, 02 Aug 2015 15:48:43 -0400
From: 	Diane Stidle <diane at cs.cmu.edu>
To: 	ML-SEMINAR at cs.cmu.edu, Andreas Krause <krausea at ethz.ch>



Thesis Defense

Date: 8/3/15
Time: 10:00am
Place: 6115 GHC
PhD Candidate: Madalina Fiterau-Brostean

Title: Discovering Compact and Informative Structures through Data 
Partitioning

Abstract:
In this thesis, we have shown that it is possible to identify 
low-dimensional structures in complex high-dimensional data, if such 
structures exist. We have leveraged these underlying structures to 
construct compact interpretable models for various machine learning 
tasks that benefit practical applications.

To start with, I will formalize Informative Projection Recovery, the 
problem of extracting a small set of low-dimensional projections of data 
that jointly support an accurate model for a given learning task. Our 
solution to this problem is a regression-based algorithm that identifies 
informative projections by optimizing over a matrix of point-wise loss 
estimators. It generalizes to multiple types of machine learning 
problems, offering solutions to classification, clustering, regression, 
and active learning tasks. Experiments show that our method can discover 
and leverage low-dimensional structures in data, yielding accurate and 
compact models. Our method is particularly useful in applications in 
which expert assessment of the results is of the essence, such as 
classification tasks in the healthcare domain.
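To make the projection-selection idea concrete, here is a minimal sketch (illustrative only, not the defended algorithm): given a hypothetical matrix of point-wise loss estimates, one per candidate low-dimensional projection, a small set of projections can be chosen greedily so that every point is handled well by at least one selected projection.

```python
# Hypothetical illustration of the Informative Projection Recovery idea:
# losses[j][i] is an estimated loss of projection j's model on point i.
# We greedily pick up to k projections so each point can use its best one.

def select_projections(losses, k):
    """Return indices of up to k projections, chosen greedily to minimize
    the total loss when each point is assigned to its best selected
    projection (names and structure here are assumptions for illustration)."""
    n_points = len(losses[0])
    best = [float("inf")] * n_points      # best loss seen per point so far
    chosen = []
    for _ in range(k):
        # For each candidate, compute how much it would improve the points.
        gains = [
            sum(max(0.0, best[i] - row[i]) for i in range(n_points))
            for row in losses
        ]
        j_star = max(range(len(losses)), key=lambda j: gains[j])
        if gains[j_star] <= 0.0:
            break                          # no remaining projection helps
        chosen.append(j_star)
        best = [min(best[i], losses[j_star][i]) for i in range(n_points)]
    return chosen

# Toy data: projection 0 handles points 0-1 well, projection 1 handles 2-3,
# projection 2 is mediocre everywhere.
losses = [
    [0.1, 0.2, 0.9, 0.8],
    [0.9, 0.8, 0.1, 0.2],
    [0.5, 0.5, 0.5, 0.5],
]
print(select_projections(losses, 2))   # → [0, 1]
```

The greedy selection over a point-wise loss matrix captures only the flavor of the approach; the thesis formulates the recovery problem and its regression-based solution in full.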

In the second part of the talk, I will describe back-propagation 
forests, a new type of ensemble that achieves improved accuracy over 
existing ensemble classifiers such as random forests or alternating 
decision forests. Back-propagation (BP) trees use soft splits, so that 
each sample is probabilistically assigned to all the leaves, and each 
leaf holds a distribution over the labels. The splitting parameters are 
obtained through SGD by optimizing the log loss over the entire tree, 
which is a non-convex objective. The probability distribution over the 
leaves is computed exactly by maximizing a log-concave objective. In 
addition, I will present several proposed approaches for using BP 
forests in the context of compact informative structure discovery. We 
have successfully used BP forests to improve the performance of deep 
belief network architectures, with results surpassing the state of the 
art on vision datasets.
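The soft-split routing can be sketched in a few lines (a hypothetical depth-1 example, not the defended implementation): a sigmoid on a linear split sends probability mass to both children, each leaf holds a label distribution, and the prediction mixes leaf distributions by routing mass.

```python
import math

# Illustrative soft-split tree of depth 1. All names and parameter values
# here are assumptions for the sketch: p(right) = sigmoid(w . x + b),
# and each leaf stores a probability distribution over labels.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def soft_tree_predict(x, w, b, leaf_dists):
    """Mix the two leaves' label distributions by the routing probability."""
    p_right = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    left, right = leaf_dists
    return [(1 - p_right) * l + p_right * r for l, r in zip(left, right)]

# Two labels; the left leaf favors label 0, the right leaf favors label 1.
dist = soft_tree_predict(
    x=[2.0, 0.0], w=[1.0, 0.0], b=0.0,
    leaf_dists=([0.9, 0.1], [0.2, 0.8]),
)
print(dist)  # a valid distribution over the two labels, leaning to label 1
```

Because every leaf receives nonzero mass, the log loss is differentiable in the split parameters, which is what makes SGD over the whole tree possible.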

Thesis Committee:
Artur Dubrawski, Chair
Geoff Gordon
Alex Smola
Andreas Krause (ETH Zurich)

-- 
Diane Stidle
Graduate Programs Manager
Machine Learning Department
Carnegie Mellon University
diane at cs.cmu.edu
412-268-1299


