Connectionists: Gaussian Process Classification Paper Available
Mark Girolami
girolami at dcs.gla.ac.uk
Thu Nov 17 09:22:26 EST 2005
The following paper may be of interest to those working on devising
efficient methods of approximate Bayesian inference for classification
over multiple classes with Gaussian Processes.
Title: Variational Bayesian Multinomial Probit Regression with Gaussian
Process Priors.
Authors: Mark Girolami and Simon Rogers, Department of Computing
Science, University of Glasgow
Abstract
It is well known in the statistics literature that augmenting binary and
polychotomous response models with Gaussian latent variables enables
exact Bayesian analysis via Gibbs sampling from the parameter posterior.
By adopting such a data augmentation strategy, dispensing with priors
over regression coefficients in favour of Gaussian Process (GP) priors
over functions, and employing variational approximations to the full
posterior, we obtain efficient computational methods for Gaussian Process
classification in the multi-class setting. The model augmentation with
additional latent variables ensures full a posteriori class coupling
whilst retaining the simple a priori independent GP covariance structure
from which sparse approximations, such as multi-class Informative Vector
Machines (IVM), emerge in a very natural and straightforward manner.
This is the first time that a fully Variational Bayesian treatment for
multi-class GP classification has been developed without having to
resort to additional explicit approximations to the non-Gaussian
likelihood term. Empirical comparisons with exact analysis via MCMC and
Laplace approximations illustrate the utility of the variational
approximation as a computationally economic alternative to full MCMC, and
it is shown to be more accurate than the Laplace approximation.
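The Gaussian auxiliary-variable augmentation described above (in the spirit of Albert and Chib's probit augmentation) can be sketched as follows. This is an illustrative toy, not the paper's GP implementation: the function name, the simple per-point sweep, and the unit noise variance are assumptions for the sketch. The key constraint it enforces is that the auxiliary variable of the observed class exceeds those of all other classes, which is what couples the classes a posteriori.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

def sample_augmented_latents(M, t, rng, sweeps=3):
    """One Gibbs update of the auxiliary variables Y (n x K) for
    multinomial probit, given current latent means M (n x K), e.g.
    GP function values, and observed labels t in {0, ..., K-1}.

    Each Y[i, c] ~ N(M[i, c], 1) subject to the multinomial probit
    constraint Y[i, t[i]] > Y[i, c] for all c != t[i].
    """
    n, K = M.shape
    Y = M + rng.standard_normal((n, K))  # unconstrained initialisation
    for i in range(n):
        for _ in range(sweeps):
            for c in range(K):
                if c == t[i]:
                    # winning latent: truncated below at the max of the rest
                    lo = np.max(np.delete(Y[i], c))
                    Y[i, c] = truncnorm.rvs(lo - M[i, c], np.inf,
                                            loc=M[i, c], random_state=rng)
                else:
                    # losing latent: truncated above at the winning latent
                    hi = Y[i, t[i]]
                    Y[i, c] = truncnorm.rvs(-np.inf, hi - M[i, c],
                                            loc=M[i, c], random_state=rng)
    return Y

# Small demonstration: after a sweep, argmax of each row recovers the label.
n, K = 20, 3
M = rng.standard_normal((n, K))
t = rng.integers(0, K, size=n)
Y = sample_augmented_latents(M, t, rng)
```

In the paper's setting the means M would come from K a priori independent GPs; conditioned on Y, the GP update stays Gaussian, which is what makes both the Gibbs and the variational treatments tractable.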
Available from:
http://www.dcs.gla.ac.uk/people/personal/girolami/pubs_2005/VBGP/index.htm
To appear in Neural Computation, MIT Press.