tutorial slides available on graphical models, and on priors

Wray Buntine wray at ultimode.com
Sun Oct 20 09:58:06 EDT 1996


The following slides were prepared for the NATO Workshop on Learning in
Graphical Models, just held in Erice, Italy, Sept. 1996.
These slides are *revised* since the Erice workshop, so those in attendance
may want to update as well.

They are available over WWW but not yet via FTP.
You'll find them at my web site:
        http://WWW.Ultimode.com/~wray/refs.html#tutes
Also, please note my new location and email address, given at the end.

The graphical models and exponential family talk contains an introduction to
lots of learning algorithms using graphical models.  Included is an analysis
with proofs of the much-hyped mean field algorithm in its general case for
the exponential family (as you might have guessed, mean field is simple once
you strip away the physics), and lots more.  This talk also shows how I
believe Gibbs, EM, k-means, and deterministic annealing should be taught (as
variants of one another).
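To make the "variants of one another" point concrete, here is a minimal sketch (my own illustration, not taken from the slides) of a temperature-T responsibility update for a spherical Gaussian mixture: T = 1 gives the usual soft EM E-step, T near 0 hardens the assignments into the k-means update, and gradually lowering T is deterministic annealing. Unit variances and equal mixing weights are assumed for brevity.

```python
import numpy as np

def annealed_em(X, mu, T=1.0, iters=50):
    """Responsibility update at temperature T for a spherical Gaussian mixture.

    T = 1   -> standard EM E-step (soft assignments);
    T -> 0  -> hard nearest-center assignments, i.e. the k-means update;
    lowering T gradually from high to low is deterministic annealing.
    Unit variances and equal mixing weights assumed (illustration only).
    """
    mu = np.asarray(mu, dtype=float)
    for _ in range(iters):
        # E-step: responsibilities proportional to exp(-||x - mu_k||^2 / (2T))
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        logits = -d2 / (2.0 * max(T, 1e-12))
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(logits)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: each center becomes the responsibility-weighted mean
        mu = (r[:, :, None] * X[:, None, :]).sum(axis=0) / r.sum(axis=0)[:, None]
    return mu, r

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),    # cluster near (0, 0)
               rng.normal(5.0, 0.1, (20, 2))])   # cluster near (5, 5)
mu0 = X[[0, 20]]                                 # one starting center per cluster
mu_hard, r_hard = annealed_em(X, mu0, T=1e-6)    # behaves like k-means
mu_soft, r_soft = annealed_em(X, mu0, T=1.0)     # behaves like soft EM
```

The only thing that changes between the algorithms is the temperature in the E-step; the M-step is identical, which is the sense in which they are variants of one another.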

   Computation with the Exponential Family and Graphical Models   
   ============================================================

   This tutorial plays two roles: to illustrate how graphical models can be
   used to present models and algorithms for data analysis, and to present 
   computational methods based on the Exponential Family, a central concept 
   for computational data analysis.

   The Exponential Family is the most important family of probability
   distributions.  It includes the Gaussian, the binomial, the Poisson, and
   others. It has unique computational properties: all fast algorithms for data
   analysis, to my knowledge, have some version of the exponential family at
   their core.  Every student of data analysis, regardless of their discipline
   (computer science, neural nets, pattern recognition, etc.) should therefore
   understand the Exponential Family and the key algorithms based on it.  This
   tutorial presents the Exponential Family and algorithms using
   graphical models:  Bayesian networks and Markov networks (directed and
   undirected graphs). These graphical models represent independence and
   therefore neatly display many of the essential details of the algorithms and
   models based around the exponential family. Algorithms discussed are the
   Expectation-Maximization (EM) algorithm, Gibbs sampling, k-means,
   deterministic annealing, Scoring, Iterative Reweighted Least Squares (IRLS),
   Mean Field, and Iterative Proportional Fitting (IPF). Connections 
   between these different algorithms are given, and the general formulations 
   presented, in most cases, are readily adapted to arbitrary Exponential 
   Family distributions.
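As a concrete taste of the computational convenience claimed above (my own illustration, not part of the tutorial): for any member p(x|eta) = h(x) exp(eta.T(x) - A(eta)), maximum likelihood reduces to matching the gradient of A to the average sufficient statistic. The Bernoulli case makes this one line.

```python
import numpy as np

# Exponential family form: p(x | eta) = h(x) exp(eta * T(x) - A(eta)).
# Maximum likelihood reduces to moment matching: grad A(eta) = mean of T(x).
# Bernoulli: T(x) = x and A(eta) = log(1 + e^eta), so grad A is the sigmoid
# and the MLE in natural parameters is just the logit of the sample mean.

def bernoulli_mle_natural(x):
    m = np.mean(x)                # the sufficient statistic: the sample mean
    return np.log(m / (1.0 - m))  # invert the sigmoid: eta = logit(mean)

x = np.array([1, 0, 1, 1, 0, 1, 1, 1])    # sample mean is 0.75
eta = bernoulli_mle_natural(x)
p = 1.0 / (1.0 + np.exp(-eta))            # map back to the mean parameter
```

The same moment-matching pattern holds for the Gaussian, Poisson, and multinomial, with only T(x) and A(eta) changing.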


The priors tutorial is a *major* revision of my previous version.
Those with the older version should update!

   Prior Probabilities  
   ===================
        
   Prior probabilities are at the center of most of the old controversies
   surrounding Bayesian statistics. While the Bayesian/Classical
   distinctions in statistics are becoming blurred, priors remain a problem,
   largely because of a lack of good tutorial material and the unfortunate
   residue of previous misunderstandings. Methods for developing and assessing
   priors are now routinely used by experienced practitioners. This tutorial
   will review some of the issues, presenting a view that incorporates
   decision theory and multi-agent reasoning. First, some perspectives are
   given: applications, theory, parameters and models, and the role of the
   decision being made.  Then, basic principles are presented: Jaynes'
   Principle of Invariance is a generalization of Laplace's Principle of
   Indifference that allows a specification of ignorance to be converted into
   a prior.  A prior for non-linear regression is developed, and the important
   role of a "measure", over-fitting, and priors on multinomials are
   presented.  Issues such as subjectivity versus objectivity, Occam's razor,
   various paradoxes, maximum entropy methods, and the so-called
   non-informative & reference priors are also presented.

   A bibliography is included.
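As one concrete instance of the multinomial material (my own illustration, not from the tutorial): Laplace's Principle of Indifference corresponds to the uniform Dirichlet(1, ..., 1) prior on a multinomial, whose posterior mean is add-one smoothing of the observed counts.

```python
import numpy as np

# Laplace's Principle of Indifference as a prior on a multinomial: the
# uniform Dirichlet(alpha=1, ..., 1).  Its posterior mean given the counts
# is add-one ("Laplace") smoothing; other alpha values give other smoothers.

def posterior_mean(counts, alpha=1.0):
    counts = np.asarray(counts, dtype=float)
    return (counts + alpha) / (counts.sum() + alpha * len(counts))

counts = [3, 0, 1]                         # observed rolls of a 3-sided die
theta = posterior_mean(counts, alpha=1.0)  # (3+1)/7, (0+1)/7, (1+1)/7
```

Note how the unseen outcome still gets nonzero probability, which is the practical point of putting a prior on the multinomial at all.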


Wray Buntine                                   
============
Consultant to industry and NASA,
and Visiting Scientist at EECS, UC Berkeley, working on probabilistic
methods in computer-aided design of ICs with Dr. Andy Mayer and Prof. 
Richard Newton.

Ultimode Systems, LLC			Phone:  (415) 324 3447
555 Bryant Str. #186			Email:  wray at ultimode.com
Palo Alto, 94301			http://WWW.Ultimode.com/~wray/



