Paper available on variational Bayesian learning

Zoubin Ghahramani Zoubin at gatsby.ucl.ac.uk
Fri Nov 17 14:44:50 EST 2000


Dear Connectionists,

The following paper on variational approximations for Bayesian
learning with an application to linear dynamical systems will be
presented at the NIPS 2000 conference.  Comments are welcome.

	Gzipped postscript and PDF versions can be found at:

	http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips00beal.ps.gz
	http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips00beal.pdf

Zoubin Ghahramani and Matt Beal

----------------------------------------------------------------------

       Propagation algorithms for variational Bayesian learning

		Zoubin Ghahramani and Matthew J. Beal
		Gatsby Computational Neuroscience Unit
		      University College London

Variational approximations are becoming a widespread tool for Bayesian
learning of graphical models. We provide some theoretical results for
the variational updates in a very general family of
conjugate-exponential graphical models.  We show how the belief
propagation and junction tree algorithms can be used in the
inference step of variational Bayesian learning.  Applying these
results to the Bayesian analysis of linear-Gaussian state-space
models, we obtain a learning procedure that exploits the Kalman
smoothing propagation while integrating over all model parameters. We
demonstrate how this can be used to infer the hidden state
dimensionality of the state-space model in a variety of synthetic
problems and one real high-dimensional data set.
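As a concrete illustration of the inference step the abstract refers to, below is a minimal sketch (our own illustration, not the authors' code) of Kalman filtering followed by Rauch-Tung-Striebel smoothing for a linear-Gaussian state-space model. In the variational Bayesian setting, these same forward-backward recursions are run with the model parameters replaced by their variational expectations; all function and variable names here are hypothetical.

```python
# Sketch of inference in a linear-Gaussian state-space model:
#   x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)    (hidden state dynamics)
#   y_t = C x_t     + v_t,  v_t ~ N(0, R)    (observations)
import numpy as np

def kalman_smooth(y, A, C, Q, R, mu0, V0):
    """Return smoothed state means and covariances given observations y."""
    T, k = len(y), A.shape[0]
    mu_f = np.zeros((T, k)); V_f = np.zeros((T, k, k))   # filtered moments
    mu_p = np.zeros((T, k)); V_p = np.zeros((T, k, k))   # one-step predictions
    m, V = mu0, V0
    for t in range(T):
        # Predict: propagate the previous posterior through the dynamics
        # (at t = 0 the prediction is just the prior).
        mp = A @ m if t > 0 else mu0
        Vp = A @ V @ A.T + Q if t > 0 else V0
        # Update: condition on the observation y[t].
        S = C @ Vp @ C.T + R                  # innovation covariance
        K = Vp @ C.T @ np.linalg.inv(S)       # Kalman gain
        m = mp + K @ (y[t] - C @ mp)
        V = Vp - K @ C @ Vp
        mu_p[t], V_p[t], mu_f[t], V_f[t] = mp, Vp, m, V
    # Backward (RTS) pass: incorporate future observations.
    mu_s = mu_f.copy(); V_s = V_f.copy()
    for t in range(T - 2, -1, -1):
        J = V_f[t] @ A.T @ np.linalg.inv(V_p[t + 1])   # smoother gain
        mu_s[t] = mu_f[t] + J @ (mu_s[t + 1] - mu_p[t + 1])
        V_s[t] = V_f[t] + J @ (V_s[t + 1] - V_p[t + 1]) @ J.T
    return mu_s, V_s
```

In the variational procedure the smoothed moments computed this way feed back into the parameter updates, which is why the Kalman recursions can be reused unchanged in the Bayesian treatment.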


A revised version of this paper will appear in Advances in Neural
Information Processing Systems 13, MIT Press.

----------------------------------------------------------------------




