[Research] Lab meeting: Wednesday June 8th

Artur Dubrawski <awd at cs.cmu.edu>
Thu Jun 3 18:21:58 EDT 2010


Speaker: Yi Zhang
Title and Abstract: See below.
Time: Noon, as usual
Place: Karen W. will provide info
Pizza: Yes

See you all there!
Artur

---
Projection Penalties: Dimension Reduction without Loss
Yi Zhang and Jeff Schneider

Dimension reduction is popular for learning predictive models in 
high-dimensional spaces. It can highlight the relevant part of the 
feature space and avoid the curse of dimensionality. However, it can 
also be harmful because any reduction loses information. In this paper, 
we propose the "projection penalty" framework to make use of 
dimension reduction without losing valuable information.

Reducing the feature space before learning predictive models can be 
viewed as restricting the model search to some parameter subspace. The 
idea of projection penalties is that instead of restricting the search 
to a parameter subspace, we can search in the full space but penalize 
the projection distance to this subspace. Dimension reduction is used to 
guide the search, rather than to restrict it.
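
As a rough illustration (not the authors' implementation), the sketch below fits a least-squares model with a projection penalty: the weight vector w lives in the full feature space, but its distance to the subspace spanned by a linear reduction R is penalized. The PCA-based choice of R, the synthetic data, and the penalty weight lam are all illustrative assumptions.

import numpy as np

def projection_penalty_ls(X, y, R, lam):
    """Least squares with a projection penalty (illustrative sketch).

    Rather than restricting w to the row space of the reduction
    matrix R, fit in the full space but penalize the distance to it:

        minimize ||X w - y||^2 + lam * ||(I - P) w||^2

    where P is the orthogonal projection onto the row space of R.
    """
    d = X.shape[1]
    Q, _ = np.linalg.qr(R.T)   # orthonormal basis of R's row space (d x k)
    P = Q @ Q.T                # projection matrix onto that subspace
    # (I - P) is symmetric and idempotent, so the normal equations become:
    w = np.linalg.solve(X.T @ X + lam * (np.eye(d) - P), X.T @ y)
    return w

# Hypothetical example: use the top-5 principal directions as the reduction R.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
y = X[:, :3] @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
w = projection_penalty_ls(Xc, y - y.mean(), Vt[:5], lam=10.0)

As lam grows, the fit collapses onto the reduced subspace (recovering, e.g., principal component regression), while lam = 0 gives ordinary least squares; intermediate values let the reduction guide the search without restricting it.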

We propose projection penalties for linear dimension reduction, and then 
generalize to kernel-based reduction and other nonlinear methods. We 
test projection penalties with various dimension reduction techniques in 
different prediction tasks, including principal component regression and 
partial least squares in regression tasks, kernel dimension reduction in 
face recognition, and latent topic modeling in text classification. 
Experimental results show that projection penalties are a more effective 
and reliable way to make use of dimension reduction techniques than 
restricting models to the reduced space directly.


