Intensive Tutorial: Learning Methods for Prediction, Classification

Marney Smyth marney@ai.mit.edu
Wed Jul 24 19:41:44 EDT 1996


        **************************************************************
        ***                                                        ***
        ***    Learning Methods for Prediction, Classification,    ***
        ***       Novelty Detection and Time Series Analysis       ***
        ***                                                        ***
        ***          Cambridge, MA, September 20-21, 1996          ***
        ***          Los Angeles, CA, December 14-15, 1996         ***
        ***                                                        ***
        ***         Geoffrey Hinton, University of Toronto         ***
        ***      Michael Jordan, Massachusetts Inst. of Tech.      ***
        ***                                                        ***
        **************************************************************


A two-day intensive Tutorial on Advanced Learning Methods will be held 
on September 20 and 21, 1996, at the Royal Sonesta Hotel, Cambridge, MA, 
and on December 14 and 15, 1996, at the Loews Hotel, Santa Monica, CA.
Space is available for up to 50 participants for each course.

The course will provide an in-depth discussion of the large collection 
of new tools that have become available in recent years for developing 
autonomous learning systems and for aiding in the analysis of complex 
multivariate data.  These tools include neural networks, hidden Markov 
models, belief networks, decision trees, memory-based methods, as well 
as increasingly sophisticated combinations of these architectures.  
Applications include prediction, classification, fault detection, 
time series analysis, diagnosis, optimization, system identification 
and control, exploratory data analysis and many other problems in
statistics, machine learning and data mining.

The course will be devoted equally to the conceptual foundations of 
recent developments in machine learning and to the deployment of these 
tools in applied settings.  Case studies will be described to show how 
learning systems can be developed in real-world settings.  Architectures 
and algorithms will be presented in some detail, but with a minimum of 
mathematical formalism and with a focus on intuitive understanding.  
Emphasis will be placed on using machine learning methods as tools 
that can be combined to solve the problem at hand.

WHO SHOULD ATTEND THIS COURSE?

The course is intended for engineers, data analysts, scientists,
managers and others who would like to understand the basic principles
underlying learning systems.  The focus will be on neural network models 
and related graphical models such as mixture models, hidden Markov 
models, Kalman filters and belief networks.  No previous exposure to 
machine learning algorithms is necessary, although a degree in engineering 
or science (or equivalent experience) is desirable.  Those attending 
can expect to gain an understanding of the current state-of-the-art 
in machine learning and be in a position to make informed decisions 
about whether this technology is relevant to specific problems in 
their area of interest.

COURSE OUTLINE

Overview of learning systems; LMS, perceptrons and support vectors; 
generalized linear models; multilayer networks; recurrent networks; 
weight decay, regularization and committees; optimization methods; 
active learning; applications to prediction, classification and control
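To give a flavor of this first session, the LMS (Widrow-Hoff) rule for a
linear predictor fits in a few lines.  This sketch is illustrative only
and not part of the course materials; the data, true weights and learning
rate are all invented for the example:

```python
import numpy as np

# LMS rule: after each example, nudge the weights against the
# gradient of that example's squared prediction error.
# All data and parameter choices here are hypothetical.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # 200 examples, 3 inputs
true_w = np.array([1.5, -2.0, 0.5])           # weights to be recovered
y = X @ true_w + 0.1 * rng.normal(size=200)   # noisy linear targets

w = np.zeros(3)
eta = 0.05                                    # learning rate
for x_i, y_i in zip(X, y):
    err = y_i - w @ x_i                       # prediction error on this example
    w += eta * err * x_i                      # LMS weight update

print(w)                                      # w should end up near true_w
```

The same loop, with the error replaced by a thresholded misclassification
signal, gives the classical perceptron update.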

Graphical models: Markov random fields and Bayesian belief networks;
junction trees and probabilistic message passing; calculating most 
probable configurations; Boltzmann machines; influence diagrams; 
structure learning algorithms; applications to diagnosis, density 
estimation, novelty detection and sensitivity analysis
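As a minimal taste of the diagnosis application, inference in a toy
two-node belief network (Disease -> Symptom) reduces to Bayes' rule.
The probabilities below are made up for illustration:

```python
# Toy belief network: Disease -> Symptom.  Query the posterior
# probability of disease given an observed symptom via Bayes' rule.
# All numbers are hypothetical.

p_d = 0.01                 # prior P(disease)
p_s_given_d = 0.9          # P(symptom | disease)
p_s_given_not_d = 0.05     # P(symptom | no disease)

# marginal probability of the symptom (sum over both disease states)
p_s = p_s_given_d * p_d + p_s_given_not_d * (1 - p_d)

# posterior P(disease | symptom)
p_d_given_s = p_s_given_d * p_d / p_s
print(round(p_d_given_s, 3))   # roughly 0.15: still unlikely, despite the symptom
```

Junction trees and probabilistic message passing generalize exactly this
computation to networks with many variables.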

Clustering; mixture models; mixtures of experts models; the EM 
algorithm; decision trees; hidden Markov models; variations on 
hidden Markov models; applications to prediction, classification 
and time series modeling
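The EM algorithm for a mixture model alternates soft assignment of points
to components with re-estimation of component parameters.  A sketch for a
two-component 1-D Gaussian mixture, on synthetic data invented for the
example:

```python
import numpy as np

# EM for a two-component 1-D Gaussian mixture.
# E-step: compute each component's responsibility for each point.
# M-step: re-estimate means, variances and mixing proportions.
# Data and initial values are illustrative.

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300),
                    rng.normal( 3.0, 1.0, 300)])

mu = np.array([-1.0, 1.0])    # initial means
var = np.array([1.0, 1.0])    # initial variances
pi = np.array([0.5, 0.5])     # initial mixing proportions

for _ in range(50):
    # E-step: posterior responsibility r[i, k] of component k for point i
    dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
              / np.sqrt(2 * np.pi * var)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted maximum-likelihood updates
    n = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / n
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
    pi = n / len(x)

print(np.sort(mu))            # means recovered near -2 and 3
```

Hidden Markov model training uses the same E/M alternation, with the
E-step computed by the forward-backward recursions.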

Subspace methods; mixtures of principal component modules; factor 
analysis and its relation to PCA; Kalman filtering; switching 
mixtures of Kalman filters; tree-structured Kalman filters; 
applications to novelty detection and system identification
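For intuition, the one-dimensional Kalman filter is a two-line
predict/update cycle.  The random-walk model and noise variances below
are invented for the example:

```python
import numpy as np

# 1-D Kalman filter: track a slowly drifting state from noisy
# observations.  Model and noise parameters are illustrative.

rng = np.random.default_rng(2)
T = 100
q, r = 0.01, 1.0                                   # process / observation noise
state = np.cumsum(rng.normal(0, np.sqrt(q), T))    # random-walk true state
obs = state + rng.normal(0, np.sqrt(r), T)         # noisy measurements

m, p = 0.0, 1.0               # posterior mean and variance of the state
est = []
for y in obs:
    p = p + q                 # predict: variance grows by the process noise
    k = p / (p + r)           # Kalman gain: trust in the new measurement
    m = m + k * (y - m)       # update: move the mean toward the measurement
    p = (1 - k) * p           # update: shrink the variance
    est.append(m)

est = np.array(est)
# the filtered estimate should track the state better than the raw data
print(np.mean((est - state) ** 2) < np.mean((obs - state) ** 2))
```

Switching and tree-structured variants replace this single linear-Gaussian
model with a bank of such models plus a discrete selection variable.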

Approximate methods: sampling methods, variational methods; 
graphical models with sigmoid units and noisy-OR units; factorial 
HMMs; the Helmholtz machine; computationally efficient upper 
and lower bounds for graphical models

REGISTRATION

Standard Registration: $700

Student Registration:  $400

Registration fee includes course materials, breakfast, coffee breaks, 
and lunch on Saturday.

Those interested in participating should return the completed
Registration Form and Fee as soon as possible, as the total number of
places is limited by the size of the venue.


ADDITIONAL INFORMATION

A registration form is available from the course's WWW page at 

 http://www.ai.mit.edu/projects/cbcl/web-pis/jordan/course/index.html

 Marney Smyth
 CBCL at MIT
 E25-201
 45 Carleton Street
 Cambridge, MA 02142
 USA
     
 Phone:  617 253-0547
 Fax:    617 253-2964
 E-mail: marney@ai.mit.edu



