Modern Regression and Classification Course in Boston

Marney Smyth marney at ai.mit.edu
Wed Sep 25 21:32:31 EDT 1996


        ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
        +++                                                        +++
        +++          Modern Regression and Classification          +++
        +++       Widely Applicable Statistical Methods for        +++
        +++                 Modeling and Prediction                +++
        +++                                                        +++
        +++          Cambridge, MA, December 9 - 10, 1996          +++
        +++                                                        +++
        +++            Trevor Hastie, Stanford University          +++
        +++          Rob Tibshirani, University of Toronto         +++
        +++                                                        +++
        ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++



This two-day course will give a detailed overview of statistical
models for regression and classification. This field, known as
machine learning in computer science and artificial intelligence and
as pattern recognition in engineering, is a hot area with powerful
applications in science, industry and finance.

The course covers a wide range of models, from linear regression
through various classes of more flexible models, to fully
nonparametric regression models, both for the regression problem and
for classification. Although a firm theoretical motivation will be
presented, the emphasis will be on practical applications and
implementations. The course will include many examples and case
studies, and participants should leave the course well-armed to tackle
real problems with realistic tools. The instructors are at the
forefront of research in this area.

After a brief overview of linear regression tools, methods for
one-dimensional and multi-dimensional smoothing are presented, as well
as techniques that assume a specific structure for the regression
function. These include splines, wavelets, additive models, MARS
(multivariate adaptive regression splines), projection pursuit
regression, neural networks and regression trees.
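
For a concrete flavor of one of these tools, here is a minimal sketch
of one-dimensional spline smoothing. Python and scipy are used purely
for illustration (they are not the course software), and the data are
simulated:

    # Sketch: one-dimensional smoothing with a cubic smoothing spline.
    # Simulated data; the smoothing parameter s is set by hand rather
    # than chosen by cross-validation.
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 10, 200))
    y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

    # Larger s gives a smoother, more biased fit; smaller s a rougher one.
    spline = UnivariateSpline(x, y, k=3, s=len(x) * 0.3**2)
    y_hat = spline(x)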

The same hierarchy of techniques is available for classification
problems. Classical tools such as linear discriminant analysis and
logistic regression can be enriched to account for nonlinearities and
interactions. Generalized additive models and flexible discriminant
analysis, neural networks and radial basis functions, classification
trees and kernel estimates are all such generalizations. Other
specialized techniques for classification, including nearest-neighbor
rules and learning vector quantization, will also be covered.
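
As a small illustration of the classical linear tools, the sketch
below fits linear discriminant analysis and logistic regression to
simulated two-class data. Python and scikit-learn are assumed here
only as an illustration toolchain, not as the course software:

    # Sketch: LDA vs. logistic regression on simulated two-class data.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
                   rng.normal(1.5, 1.0, (100, 2))])
    y = np.repeat([0, 1], 100)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for clf in (LinearDiscriminantAnalysis(), LogisticRegression()):
        clf.fit(X_tr, y_tr)                    # fit a linear decision rule
        print(type(clf).__name__, clf.score(X_te, y_te))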

Apart from describing these techniques and their applications to a
wide range of problems, the course will also cover model selection
techniques, such as cross-validation and the bootstrap, and diagnostic
techniques for model assessment.
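
For example, a cross-validation estimate of prediction error and a
simple bootstrap of a fitted coefficient might look like the sketch
below (again Python/scikit-learn on simulated data, an illustrative
assumption rather than the course materials):

    # Sketch: 5-fold cross-validation and a bootstrap standard error.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score
    from sklearn.utils import resample

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = X @ np.array([1.0, 0.5, 0.0, 0.0, -0.5]) + rng.normal(0.0, 0.5, 200)

    # Cross-validated R^2 for a ridge regression fit.
    scores = cross_val_score(Ridge(alpha=1.0), X, y, cv=5)
    print("CV mean R^2:", scores.mean())

    # Bootstrap: refit on resampled data to gauge coefficient variability.
    boot = [Ridge(alpha=1.0).fit(*resample(X, y)).coef_[0] for _ in range(200)]
    print("bootstrap SE of first coefficient:", np.std(boot))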

Software for these techniques will be illustrated, and a comprehensive
set of course notes will be provided to each attendee.

Additional information is available at the Website:

http://playfair.stanford.edu/~trevor/mrc.html




			    COURSE OUTLINE

			      DAY ONE:

Overview of regression methods: Linear regression models and least
squares. Ridge regression and the lasso. Flexible linear models and
basis function methods. Linear and nonlinear smoothers: kernels,
splines, and wavelets. Bias/variance tradeoff: cross-validation and
the bootstrap. Smoothing parameters and effective number of
parameters. Surface smoothers.
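
As a brief illustration of the shrinkage methods in this segment, the
following sketch contrasts ridge regression and the lasso on simulated
data; Python/scikit-learn and the particular penalty values are
assumptions made only for this example:

    # Sketch: ridge (L2) vs. lasso (L1) shrinkage on simulated data.
    import numpy as np
    from sklearn.linear_model import Ridge, Lasso

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))
    beta = np.array([3.0, -2.0] + [0.0] * 8)   # only two nonzero effects
    y = X @ beta + rng.normal(size=100)

    ridge = Ridge(alpha=10.0).fit(X, y)        # shrinks all coefficients
    lasso = Lasso(alpha=0.5).fit(X, y)         # sets some exactly to zero
    print("ridge:", np.round(ridge.coef_, 2))
    print("lasso:", np.round(lasso.coef_, 2))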

			       ++++++++

Structured Nonparametric Regression: Problems with high-dimensional
smoothing. Structured high-dimensional regression: additive models,
projection pursuit regression, CART, MARS, radial basis functions,
and neural networks. Applications to time series forecasting.
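
Since these structured models are easiest to grasp by example, here is
a minimal sketch of one of them, a CART-style regression tree, fit to
simulated data (Python/scikit-learn is assumed only for illustration):

    # Sketch: a regression tree (CART-style) on simulated data.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(300, 3))
    y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0.0, 0.2, 300)

    # max_depth limits the number of recursive splits (model complexity).
    tree = DecisionTreeRegressor(max_depth=4).fit(X, y)
    print("training R^2:", tree.score(X, y))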

			       DAY TWO:

Classification: Statistical decision theory and classification rules.
Linear procedures: discriminant analysis and logistic regression.
Quadratic discriminant analysis and parametric models. Nearest-neighbor
classification, K-means and LVQ. Adaptive nearest-neighbor methods.
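
A nearest-neighbor rule, one of the simplest procedures in this list,
can be sketched as follows (simulated data; Python/scikit-learn is an
assumption for illustration only):

    # Sketch: k-nearest-neighbor classification on two simulated classes.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
                   rng.normal(2.0, 1.0, (100, 2))])
    y = np.repeat([0, 1], 100)

    # k controls how local (and how variable) the decision rule is.
    knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
    print("training accuracy:", knn.score(X, y))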

			       ++++++++

Nonparametric classification: Classification trees (CART).
Flexible and penalized discriminant analysis. Multiple logistic
regression models and neural networks. Kernel methods.
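
As one example from this segment, a single-hidden-layer neural network
classifier can be sketched as below; the Python/scikit-learn code and
its settings are illustrative assumptions, not the course software:

    # Sketch: a small neural network classifier on simulated data.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
                   rng.normal(2.0, 1.0, (100, 2))])
    y = np.repeat([0, 1], 100)

    # One hidden layer of 5 units, trained by backpropagation.
    net = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000,
                        random_state=0).fit(X, y)
    print("training accuracy:", net.score(X, y))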



			   THE INSTRUCTORS

Professor Trevor Hastie of the Statistics and Biostatistics
Departments at Stanford University was formerly a member of the
Statistics and Data Analysis Research group at AT&T Bell
Laboratories. He co-authored with Tibshirani the monograph Generalized
Additive Models (1990), published by Chapman and Hall, and has written
many research articles in the area of nonparametric regression and
classification. He also co-edited the Wadsworth book Statistical
Models in S (1991) with John Chambers.

Professor Robert Tibshirani of the Statistics and Biostatistics
departments at the University of Toronto is the most recent recipient
of the COPSS award, given jointly by all the leading statistical
societies to the most outstanding statistician under the age of 40.
He also has written many research articles on nonparametric
regression and classification. With Bradley Efron he co-authored the
best-selling text An Introduction to the Bootstrap (1993), and he has
been an active researcher on bootstrap technology for the past 11
years.

Quotes from previous participants:

"... the best presentation by professional statisticians I have ever
had the pleasure of attending"

"... superior to most courses in all respects."




Both Prof. Hastie and Prof. Tibshirani are actively involved in
research in modern regression and classification and are well known
not only in the statistics community but in the machine-learning and
neural network fields as well. They have given many short courses
together on classification and regression procedures to a wide
variety of academic, government and industrial audiences. These
include the American Statistical Association and Interface meetings,
the NATO ASI workshop on Neural Networks and Statistics, AI and
Statistics, and the Canadian Statistical Society meetings.




BOSTON COURSE: December 9-10, 1996 at the 

HYATT REGENCY HOTEL, CAMBRIDGE, MASSACHUSETTS.
 


PRICE: $750 per attendee before November 11, 1996. Full-time
registered students receive a 40% discount (i.e. $450). The
cancellation fee is $100 after October 29, 1996. The registration fee
after November 11, 1996 is $950 (students $530). Attendance is
limited to the first 60 applicants, so sign up soon! These courses
fill up quickly.


HOTEL ACCOMMODATION

The Hyatt Regency Hotel offers special accommodation rates for course
participants ($139 per night). Contact the hotel directly - 

The Hyatt Regency Hotel, 575 Memorial Drive, Cambridge, MA 02139.
Phone: 617 492-1234

Information about alternative hotel accommodation is available at the MRC Web site:

http://playfair.stanford.edu/~trevor/mrc.html



COURSE REGISTRATION


TO REGISTER: Detach and fill in the Registration Form below:

		 Modern Regression and Classification
	Widely applicable methods for modeling and prediction


		    December 9 - December 10, 1996
		     Cambridge, Massachusetts USA



         Please complete this form (type or print)


Name   ___________________________________________________
       Last                 First                   Middle

Firm or Institution  ______________________________________

Mailing Address (for receipt)     _________________________


__________________________________________________________


__________________________________________________________


__________________________________________________________
Country                    Phone                      FAX



__________________________________________________________
email address 



__________________________________________________________
Credit card # (if payment by credit card)      Expiration Date

(Lunch Menu  -  tick as appropriate):

 
___ Vegetarian                           ___ Non-Vegetarian



Fee payment must be made by MONEY ORDER, PERSONAL CHECK, VISA or
MASTERCARD. All amounts must be in US dollars. Make the fee payable
to Prof. Trevor Hastie and mail it, together with the completed
Registration Form, to:

Marney Smyth, 
MIT Press 
E39-311 
55 Hayward Street, 
Cambridge, MA 02142 USA

ALL CREDIT CARD REGISTRATIONS MUST INCLUDE BOTH CARD NUMBER AND
EXPIRATION DATE.


DEADLINE: Registration before December 2, 1996. DO NOT SEND CASH.

Registration fee includes Course Materials, coffee breaks,
and lunch both days.

If you have further questions, email to marney at ai.mit.edu






