tutorial announcement--statistical physics and learning

seung@physics.att.com
Thu Jun 30 17:20:15 EDT 1994


      =========================================================
      What does statistical physics have to say about learning?
      =========================================================

			Sunday, July 10, 1994
			 8:45 am to 12:15 pm
		      Milledoler Hall, room 100
                         Rutgers University
                      New Brunswick, New Jersey  

			Tutorial conducted by
		  Sebastian Seung and Michael Kearns
			AT&T Bell Laboratories
			Murray Hill, NJ 07974

	    Free of charge and open to the general public,
		  thanks to sponsorship from DIMACS.

	 Held in conjunction with the Eleventh International
       Conference on Machine Learning (ML94, July 11-13, 1994)
	  and the Seventh Annual Conference on Computational
	     Learning Theory (COLT94, July 12-15, 1994).

The study of learning has historically been the domain of
psychologists, statisticians, and computer scientists.  Statistical
physicists are the seemingly unlikely latecomers to the subject.  This
tutorial is an overview of the ideas they are now bringing to learning
theory, and of the relationship of these ideas to statistics and
computational learning theory.  We focus on the analysis of learning
curves, defined here as graphs of generalization error versus the
number of examples used in training.  We explain why supervised
learning from examples can lead to learning curves with a variety of
behaviors, some of which are very different from (though consistent
with) the Vapnik-Chervonenkis bounds.  This is illustrated most
dramatically by the presence of phase transitions in certain learning
models.  We discuss theoretical progress towards understanding two
puzzling empirical findings--that neural networks sometimes attain
good generalization with fewer examples than adjustable parameters,
and that generalization performance can be relatively insensitive to
the size of the hidden layer.  We conclude with a discussion of the
relationship between the statistical physics approach and the
Vapnik-Chervonenkis theory.
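
For concreteness, a learning curve of the kind discussed above can be
estimated numerically for a toy model.  The sketch below (in Python
with NumPy; the perceptron student, random linear teacher, and all
parameter values are chosen here purely for illustration) measures
generalization error at several training-set sizes:

    import numpy as np

    rng = np.random.default_rng(0)

    def perceptron_generalization_error(n_examples, dim=20,
                                        n_test=2000, n_trials=20):
        """Estimate the generalization error of a perceptron trained on
        n_examples labeled by a random linear teacher (toy model)."""
        errors = []
        for _ in range(n_trials):
            teacher = rng.standard_normal(dim)
            X = rng.standard_normal((n_examples, dim))
            y = np.sign(X @ teacher)
            # Train the student with the classical perceptron rule.
            w = np.zeros(dim)
            for _ in range(100):            # fixed number of passes
                for xi, yi in zip(X, y):
                    if yi * (xi @ w) <= 0:
                        w += yi * xi
            # Disagreement with the teacher on fresh test inputs.
            X_test = rng.standard_normal((n_test, dim))
            errors.append(np.mean(np.sign(X_test @ w)
                                  != np.sign(X_test @ teacher)))
        return np.mean(errors)

    # Learning curve: generalization error vs. number of examples.
    for m in [5, 10, 20, 40, 80, 160]:
        print(m, perceptron_generalization_error(m))

Plotting the printed error against the number of examples gives the
learning curve for this toy model.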

No prior knowledge of learning theory will be assumed.  This tutorial
is one of a set of DIMACS-sponsored tutorials that are free and open
to the general public.  Directions to Rutgers can be found in the
ML/COLT announcement, which is available via anonymous ftp from
www.cs.rutgers.edu in the directory "/pub/learning94".  Users of WWW
information servers such as Mosaic can find the information at
"http://www.cs.rutgers.edu/pub/learning94/learning94.html".  Other
available information includes a campus map and abstracts of all
workshops and tutorials.  Questions can be directed to
ml94@cs.rutgers.edu, colt94@research.att.com, or to Sebastian Seung at
908-582-7418 or seung@physics.att.com.


