Connectionists: nonlinear dimensionality reduction
Sayan Mukherjee
sayan at stat.duke.edu
Fri Mar 16 11:03:23 EDT 2007
Dear colleagues:
In a series of papers we have been developing the use of
gradient estimates for variable selection, feature selection,
and nonlinear dimensionality reduction, all in the supervised
setting. This work is related to Sliced Inverse Regression (SIR)
and Sliced Average Variance Estimation (SAVE) from the statistics
literature, but is aimed at high-dimensional problems. It is
also related to work on learning manifolds. An extension to
diffusion maps is in progress.
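
To give a rough idea of the approach, here is a small sketch
(this is not our Matlab implementation, which is linked below;
it uses simple local linear fits in place of the RKHS gradient
estimators in the papers, and all names in it are illustrative):
estimate the gradient of the regression function at each sample,
average the gradient outer products, and take the leading
eigenvectors as the supervised dimension-reduction directions.

import numpy as np

def local_linear_gradients(X, y, k=20):
    """Estimate grad f(x_i) at each sample by a local linear fit
    over the k nearest neighbours (a stand-in for the RKHS
    gradient estimators in the papers)."""
    n, p = X.shape
    grads = np.zeros((n, p))
    for i in range(n):
        dist = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dist)[:k]
        Xi, yi = X[idx] - X[i], y[idx]
        # least-squares slope of the local fit = gradient estimate
        A = np.hstack([np.ones((k, 1)), Xi])
        coef, *_ = np.linalg.lstsq(A, yi, rcond=None)
        grads[i] = coef[1:]
    return grads

def gradient_edr_directions(X, y, k=20, d=2):
    """Average the outer products of the estimated gradients and
    return the top-d eigenvectors, which span the estimated
    dimension-reduction subspace."""
    G = local_linear_gradients(X, y, k)
    M = G.T @ G / len(X)
    w, V = np.linalg.eigh(M)          # eigenvalues in ascending order
    return V[:, np.argsort(w)[::-1][:d]]

# toy example: y depends only on the first two coordinates of X
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=500)
B = gradient_edr_directions(X, y, k=25, d=2)   # 10 x 2 projection
print(B.shape)

Projecting X onto the columns of B gives supervised nonlinear
coordinates in the same spirit as SIR, but driven by gradient
information.
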
The papers are:

Learning gradients and feature selection on manifolds
http://ftp.stat.duke.edu/WorkingPapers/06-20.pdf

Estimation of Gradients and Coordinate Covariation in Classification
http://jmlr.csail.mit.edu/papers/v7/mukherjee06b.html

Learning Coordinate Covariances via Gradients
http://jmlr.csail.mit.edu/papers/v7/mukherjee06a.html
Matlab code can be downloaded at
http://www.stat.duke.edu/~sayan/soft.html
cheers,
sayan
--
Dr. Sayan Mukherjee
Assistant Professor
Computational Biology, Statistics, & Computer Science
Duke Scholar in Genomic Medicine
Duke University
phone: 919 668-4747
fax: 919 668-0795
www.stat.duke.edu/~sayan