David Cohn to speak in Colorado Machine Learning series
Lorien Y. Pratt
lpratt at franklinite.Mines.Colorado.EDU
Wed Apr 7 12:57:52 EDT 1993
The Spring 1993
Colorado Machine Learning
Colloquium Series
presents:
Dr. David Cohn
Dept. of Brain & Cognitive Sciences
Massachusetts Inst. of Technology
Cambridge, MA 02139
Uncertainty-Based Queries in Neural Networks
*Thursday*, April 15, 1993
Room 110, Stratton Hall, on the CSM campus
5:30 pm
ABSTRACT
In many interesting learning problems, it is practical for a learner
to pick its own training data.  Intuition and theory both suggest
that by choosing carefully where one's training data comes from, one
can greatly improve one's ability to generalize.  I will consider
the problem of learning a map from points in a domain, such as a
geometric map or a set of state-action pairs, to some ``value,'' such
as a classification or a next-state identifier.
I will assume that training data may be obtained by querying: we may
specify a point x and call an oracle, or perform an experiment, to
determine f(x), the value of the map at that point.  We wish to make
queries that are ``optimally informative'' according to some criterion,
but there are many criteria to choose from, and many are
computationally intractable.
I will discuss my current research on the suitability of a querying
criterion based on uncertainty in system parameters.
Under certain reasonable assumptions, one may efficiently compute
how much the uncertainty in system parameters will be reduced by
knowing f(x) for a specified x, and thus the ``information gain''
of querying it. This approach was introduced by Fedorov in 1972;
its utility for active data selection in neural networks was proposed
by MacKay in 1991.
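As a rough illustration of this criterion (the announcement itself gives
no formulas or code), consider a Bayesian linear model with Gaussian
parameter posterior covariance Sigma and Gaussian observation noise; this
is a simplifying assumption here, not necessarily the formulation used in
the talk.  In that setting the entropy reduction from observing f(x) has
a closed form, sketched below in Python.

    import numpy as np

    def information_gain(x, Sigma, noise_var=1.0):
        # Expected reduction in parameter entropy from observing f(x), under
        # the Gaussian/linear assumptions stated above:
        #   gain(x) = 0.5 * log(1 + x^T Sigma x / noise_var)
        predictive_var = x @ Sigma @ x
        return 0.5 * np.log(1.0 + predictive_var / noise_var)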
I will describe experiments that perform gradient ascent on the
information gain of a query, and discuss the problems involved in
extending this approach to learning problems that allow only restricted
querying, such as navigation, exploration, and control.
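The gradient ascent can be sketched in the same simplified setting as
above; the box constraint (lo, hi) below is a stand-in assumption for the
restricted query domains of problems such as navigation and control, not
a detail taken from the talk.

    def climb_query(Sigma, x0, lo, hi, noise_var=1.0, lr=0.1, steps=100):
        # Gradient ascent on information_gain() with respect to the query x.
        # d/dx [0.5 * log(1 + x^T Sigma x / s)] = Sigma x / (s + x^T Sigma x)
        # The gain grows with ||x||, so the candidate query is clipped to the
        # box [lo, hi] as a stand-in for a restricted query domain.
        x = np.array(x0, dtype=float)
        for _ in range(steps):
            grad = Sigma @ x / (noise_var + x @ Sigma @ x)
            x = np.clip(x + lr * grad, lo, hi)
        return x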
Suggested background readings:

  Les Atlas, David Cohn, and Richard Ladner.  "Training Connectionist
  Networks with Queries and Selective Sampling."  In Advances in Neural
  Information Processing Systems 2, D. Touretzky, ed.  Morgan Kaufmann,
  1990, pp. 566-573.

  Eric B. Baum and Kevin J. Lang.  "Constructing Hidden Units Using
  Examples and Queries."  In Advances in Neural Information Processing
  Systems 3, R. Lippmann, J. Moody, and D. Touretzky, eds.  Morgan
  Kaufmann, 1991, pp. 904-910.

  David J. C. MacKay.  "The Evidence Framework Applied to Classification
  Networks."  Neural Computation, vol. 4, no. 5, 1992, pp. 698-714.
These readings are available on reserve at the Arthur Lakes Library
at CSM. Ask for the reserve package for MACS570, subject: Cohn.
Non-students can check materials out on reserve by providing a
driver's license.
Open to the Public
Refreshments will be served at 5:00 pm, prior to the talk
For more information (including a schedule of all talks in this
series), contact: Dr. L. Y. Pratt, CSM Dept. of Mathematical and
Computer Sciences, lpratt at mines.colorado.edu, (303) 273-3878.
The speaker may be contacted at cohn at psyche.mit.edu.
Sponsored by:
THE CSM DEPARTMENTS OF MATHEMATICAL AND COMPUTER SCIENCES AND GEOPHYSICS,
THE DIVISION OF ENGINEERING, AND
CRIS, the Center for Robotics and Intelligent Systems at the Colorado
School of Mines