NIPS*95 Workshop: Call for Participation
Rich Caruana
caruana+ at cs.cmu.edu
Fri Sep 15 09:23:26 EDT 1995
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
*-*  POST-NIPS*95 WORKSHOP  *-*
*-*   December 1-2, 1995    *-*
*-*      Vail, Colorado     *-*
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
*-*  CALL FOR PARTICIPATION *-*
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
TITLE: "Learning to Learn: Knowledge Consolidation
        and Transfer in Inductive Systems"
ORGANIZERS: Jon Baxter, Rich Caruana, Tom Mitchell,
            Lori Pratt, Danny Silver, Sebastian Thrun.
INVITED TALKS BY: Leo Breiman (Berkeley, tentative)
                  Tom Mitchell (CMU)
                  Tomaso Poggio (MIT)
                  Noel Sharkey (Sheffield)
                  Jude Shavlik (Wisconsin)
WEB PAGES (for more information):
Our Workshop: http://www.cs.cmu.edu/afs/cs/usr/caruana/pub/transfer.html
NIPS*95 Info: http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/NIPS.html
WORKSHOP DESCRIPTION:
The power of tabula rasa learning is limited. Because of this,
interest is increasing in methods that capitalize on previously
acquired domain knowledge. Examples of these methods include:
o using symbolic domain theories to bias connectionist networks
o using unsupervised learning on a large corpus of unlabelled data
  to learn features useful for subsequent supervised learning on a
  smaller labelled corpus
o using models previously learned for other problems as a bias when
  learning new, but related, problems
o using extra outputs on a connectionist network to bias the hidden
  layer representation towards more predictive features (a small
  code sketch of this idea appears just after this list)
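To make the last idea concrete, here is a minimal sketch in
Python/NumPy (our illustration only, not material from any
submission): a main task and a related extra task share one hidden
layer, so the extra output's error signal also shapes the shared
hidden representation. All data, sizes, and names are synthetic.

  import numpy as np

  rng = np.random.default_rng(0)
  X = rng.uniform(-1, 1, size=(200, 2))
  y_main  = (X[:, 0] * X[:, 1] > 0).astype(float)   # main task
  y_extra = (X[:, 0] + X[:, 1] > 0).astype(float)   # related extra task
  Y = np.column_stack([y_main, y_extra])

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  n_in, n_hid, n_out = 2, 8, 2        # output 0: main, output 1: extra
  W1 = rng.normal(0, 0.5, (n_in, n_hid));  b1 = np.zeros(n_hid)
  W2 = rng.normal(0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

  for epoch in range(2000):           # plain batch backpropagation
      H = sigmoid(X @ W1 + b1)        # shared hidden layer
      O = sigmoid(H @ W2 + b2)
      dO = (O - Y) * O * (1 - O)      # squared-error output deltas
      dH = (dO @ W2.T) * H * (1 - H)  # both tasks' errors reach W1
      W2 -= 0.5 * H.T @ dO / len(X);  b2 -= 0.5 * dO.mean(axis=0)
      W1 -= 0.5 * X.T @ dH / len(X);  b1 -= 0.5 * dH.mean(axis=0)
  # At test time only output 0 is used; the extra output existed only
  # to shape the hidden representation during training.

Whether such an extra output helps or hinders on real problems is, of
course, one of the questions the workshop hopes to address.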
There are many different approaches: hints, knowledge-based artificial
neural nets (KBANN), explanation-based neural nets (EBNN), multitask
learning (MTL), knowledge consolidation, etc. What they all have in
common is the attempt to transfer knowledge from other sources to
benefit the current inductive task.
The goal of this workshop is to provide an opportunity for researchers
and practitioners to discuss problems and progress in knowledge
transfer in learning. We hope to identify research directions, debate
different theories and approaches, discover unifying principles, and
begin answering questions such as:
o when will transfer help -- or hinder?
o what should be transferred?
o how should it be transferred?
o what are the benefits?
o in what domains is transfer most useful?
SUBMISSIONS:
We solicit presentations from anyone working in (or near):
o Sequential/incremental, compositional (learning by parts),
  and parallel learning
o Task knowledge transfer (symbolic-neural, neural-neural)
o Adaptation of learning algorithms based on prior learning
o Learning domain-specific inductive bias
o Combining predictions made for related tasks from one domain
o Combining supervised learning (where the goal is to learn one
  feature from the other features) with unsupervised learning (where
  the goal is to learn every feature from all the other features);
  a short sketch of this combination follows this list
o Combining symbolic and connectionist methods via transfer
o Fundamental problems/issues in learning to learn
o Theoretical models of learning to learn
o Cognitive models of, or evidence for, transfer in learning
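As an illustration of that supervised/unsupervised combination (again
our own sketch on synthetic data, not a prescribed method): an
unsupervised stage finds features on a large unlabelled set, and a
small labelled set is then fit in that learned feature space.

  import numpy as np

  rng = np.random.default_rng(1)
  X_unlab = rng.normal(size=(1000, 10))            # large, no labels
  X_lab   = rng.normal(size=(30, 10))              # small, labelled
  y_lab   = (X_lab[:, :5].sum(axis=1) > 0).astype(float)

  # Unsupervised stage: a linear autoencoder's optimal encoder spans
  # the principal subspace of the data, so a truncated SVD on the
  # centered unlabelled corpus stands in for it here.
  k = 4
  _, _, Vt = np.linalg.svd(X_unlab - X_unlab.mean(axis=0),
                           full_matrices=False)

  def encode(A):
      return A @ Vt[:k].T                          # k learned features

  # Supervised stage: logistic regression on the learned features,
  # trained by batch gradient descent on the small labelled set.
  Z = encode(X_lab)
  w, b = np.zeros(k), 0.0
  for _ in range(500):
      p = 1.0 / (1.0 + np.exp(-(Z @ w + b)))
      g = p - y_lab                                # logistic gradient
      w -= 0.1 * Z.T @ g / len(Z)
      b -= 0.1 * g.mean()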
Please send a short (one page or less) description of what you want to
present to one of the co-chairs below by Oct 15. Email is preferred.
We'll select from the submissions and publish a workshop schedule by
Nov 1. Preference will be given to submissions that are likely to
generate debate and that go beyond summarizing prior published work by
raising important issues or suggesting directions for future work.
Suggestions for moderator- or panel-led discussions (e.g., sequential
vs. parallel transfer) are also encouraged. We plan to run the
workshop as a workshop, not as a mini-conference, so be daring! We
look forward to your submission.
Rich Caruana                       Daniel L. Silver
School of Computer Science         Department of Computer Science
Carnegie Mellon University         Middlesex College
5000 Forbes Avenue                 University of Western Ontario
Pittsburgh, PA 15213, USA          London, Ontario, Canada N6A 3K7
email: caruana at cs.cmu.edu       email: dsilver at csd.uwo.ca
ph: (412) 268-3043                 ph: (519) 473-6168
fax: (412) 268-5576                fax: (519) 661-3515
See you in Colorado!