Connectionists: CFP: NIPS 2011 workshop on Challenges in Learning Hierarchical Models: Transfer Learning and Optimization
Ruslan Salakhutdinov
rsalakhu at cs.toronto.edu
Wed Aug 31 19:48:55 EDT 2011
CALL FOR CONTRIBUTIONS
NIPS 2011 workshop on Challenges in Learning Hierarchical Models: Transfer
Learning and Optimization
Melia Sierra Nevada & Melia Sol y Nieve, Sierra Nevada, Spain
https://sites.google.com/site/nips2011workshop/home
Important Dates:
----------------
Deadline for submissions: October 21, 2011
Notification of acceptance: October 28, 2011
Overview:
----------------
The ability to learn abstract representations that support transfer to
novel but related tasks lies at the core of solving many AI-related tasks,
including visual object recognition, information retrieval, speech
perception, and language understanding. Hierarchical models that support
inference at multiple levels have been developed and are argued to be among
the most promising candidates for achieving this goal. An important property
of these models is that they can extract complex statistical dependencies
from high-dimensional sensory input and efficiently learn latent variables
by re-using and combining intermediate concepts, allowing these models to
generalize well across a wide variety of tasks.
In the past few years, researchers across many different communities, from
applied statistics to engineering, computer science and neuroscience, have
proposed several hierarchical models that are capable of extracting
useful, high-level structured representations. The learned representations
have been shown to give promising results for solving a multitude of novel
learning tasks. A few notable examples of such models include Deep Belief
Networks, Deep Boltzmann Machines, sparse coding-based methods, and
nonparametric and parametric hierarchical Bayesian models.
Despite recent successes, many existing hierarchical models are still far
from being able to represent, identify, and learn the wide variety of
possible patterns and structure in real-world data. Existing models cannot
cope with new tasks for which they have not been specifically trained.
Even when applied to related tasks, trained systems often display unstable
behavior. Furthermore, massive volumes of training data (e.g., data
transferred between tasks) and high-dimensional input spaces raise
challenging questions about how to effectively train deep hierarchical
models. The recent availability of large-scale datasets (such as ImageNet
for visual object recognition or the Wall Street Journal corpus for
large-vocabulary speech recognition), continuing advances in optimization
methods, and the availability of cluster computing have drastically changed
the landscape, calling for a re-assessment of the strengths and weaknesses
of many existing optimization strategies.
The aim of this workshop is to bring together researchers working on such
hierarchical models to discuss two important challenges: the ability to
perform transfer learning and the best strategies to optimize these
systems on large-scale problems. These problems are "large" in terms of
input dimensionality (on the order of millions), number of training
samples (on the order of 100 million or more), and number of categories
(on the order of several tens of thousands).
Submission Instructions:
------------------------
We solicit submissions of unpublished research papers. Papers must be at
most 6 pages long (extended abstracts are also acceptable) and must follow
the formatting instructions of the NIPS 2011 call for papers. Submissions
should include the title, authors' names, institutions, and email addresses,
and should be sent as a PDF or PS file by email to
transflearn.optim.wnips2011 at gmail.com.
Submissions will be reviewed by the organizing committee on the basis of
relevance, significance, technical quality, and clarity. Accepted
submissions will be presented either as a talk or as a poster; the number
of oral presentations will be limited.
We encourage submissions with a particular emphasis on:
* transfer learning
* one-shot learning
* learning hierarchical models
* scalability of hierarchical models at training and test time
* deterministic and stochastic optimization for hierarchical models
* parallel computing
* theoretical foundations of transfer learning
* applications of hierarchical models to large scale datasets
Organizers
----------
Quoc V. Le, Computer Science Department, Stanford University
Marc'Aurelio Ranzato, Google Inc
Ruslan Salakhutdinov, Department of Statistics, University of Toronto
Andrew Ng, Computer Science Department, Stanford University
Josh Tenenbaum, Department of Brain and Cognitive Sciences, MIT
https://sites.google.com/site/nips2011workshop/home