Thesis on Density Estimation
Dirk Ormoneit
ormoneit at stat.Stanford.EDU
Mon Sep 28 15:50:21 EDT 1998
Hi,
My PhD thesis on probability estimating neural networks
is now available from Shaker Verlag / Aachen (ISBN 3-8265-3723-8):
http://www.shaker.de/Online-Gesamtkatalog/Details.idc?ISBN=3-8265-3723-8
For more information on specific topics touched upon in the abstract below,
see also my Stanford homepage:
http://www-stat.stanford.edu/~ormoneit/
Best,
Dirk
===========================================================================
PROBABILITY ESTIMATING NEURAL NETWORKS
Dirk Ormoneit
Fakult"at f"ur Informatik
Technische Universit"at M"unchen
A central problem of machine learning is the identification of
probability distributions that govern uncertain environments.
Neural networks provide a suitable framework for ``learning''
probability distributions from sample data. In this work
I discuss several neural architectures that can be used to learn
various kinds of probabilistic dependencies.
After briefly reviewing essential concepts from neural learning and
probability theory, I provide an in-depth discussion of neural and other
approaches to conditional density estimation. In particular, I
introduce the ``Recurrent Conditional Density Estimation Network (RCDEN)'',
a neural architecture that is particularly well suited to
identifying the transition densities of time series in the presence of
latent variables. As a practical example, I consider the conditional
densities of German stock market returns and compare the results of the
RCDEN to those of ARCH-type models.
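(For a concrete, if simplified, picture of neural conditional density
estimation: the toy network below parameterises a Gaussian mixture over a
scalar target y as a function of the input x, in the spirit of mixture
density networks. All sizes and names are illustrative assumptions; this
is not the RCDEN, which additionally carries recurrent state to handle
latent variables.)

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes -- assumptions, not taken from the thesis.
D_IN, N_HIDDEN, N_COMP = 1, 10, 3

# Randomly initialised one-hidden-layer network whose outputs
# parameterise a Gaussian mixture over the target y given the input x.
W1 = rng.normal(scale=0.5, size=(N_HIDDEN, D_IN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.5, size=(3 * N_COMP, N_HIDDEN))
b2 = np.zeros(3 * N_COMP)

def conditional_density(x, y):
    """Evaluate the model's p(y | x) for scalar x and y."""
    h = np.tanh(W1 @ np.atleast_1d(x) + b1)      # hidden layer
    logits, mu, log_sig = np.split(W2 @ h + b2, 3)
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()                               # mixing weights (softmax)
    sig = np.exp(log_sig)                        # positive std deviations
    comp = np.exp(-0.5 * ((y - mu) / sig) ** 2) / (np.sqrt(2.0 * np.pi) * sig)
    return float(pi @ comp)                      # mixture density at y

print(conditional_density(0.3, 0.1))

In practice the weights would be fitted by maximising the conditional
log-likelihood of the training pairs, e.g. by gradient ascent.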
A second focus of the work is on the estimation of multivariate densities
by means of Gaussian mixture models. A severe problem for the practical
application of Gaussian mixture estimates is their strong tendency to
``overfit'' the training data. In my work I compare three regularization
procedures that address this problem.
The first method derives EM update rules for maximum penalized
likelihood estimation (a rough sketch of this idea follows the
abstract). In the second approach, the ``full'' Bayesian inference is
approximated by means of a Markov chain Monte Carlo algorithm. Finally,
I regularize the Gaussian mixture estimates by ensemble averaging, most
prominently with a variant of the popular ``bagging'' algorithm. The
three approaches are compared in
extensive experiments that involve the construction of Bayes classifiers
from the density estimates. The work concludes with a discussion of
several practical applications of density-estimating neural networks in
time-series analysis, data transmission, and optimal planning in
multi-agent environments.
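(To make the first regularization idea concrete: the sketch below runs EM
for a one-dimensional Gaussian mixture with a penalized variance update
that shrinks each component variance toward a prior value S0, preventing
the single-point collapse behind the ``overfitting'' problem. The penalty
form, the constants LAM and S0, and the toy data are my own assumptions
and only stand in for the update rules actually derived in the thesis.)

import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D sample from two Gaussians (purely illustrative).
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(1.5, 1.0, 300)])

K, LAM, S0 = 2, 5.0, 1.0     # components, penalty strength, prior variance
pi = np.full(K, 1.0 / K)
mu = rng.choice(x, K)
var = np.full(K, np.var(x))

for _ in range(50):
    # E-step: responsibilities r[n, k] = p(component k | x_n).
    logp = (-0.5 * (x[:, None] - mu) ** 2 / var
            - 0.5 * np.log(2.0 * np.pi * var) + np.log(pi))
    logp -= logp.max(axis=1, keepdims=True)
    r = np.exp(logp)
    r /= r.sum(axis=1, keepdims=True)

    # M-step with a penalized variance update: shrinking toward S0
    # keeps components from collapsing onto single data points.
    nk = r.sum(axis=0)
    pi = nk / nk.sum()
    mu = (r * x[:, None]).sum(axis=0) / nk
    ss = (r * (x[:, None] - mu) ** 2).sum(axis=0)
    var = (ss + LAM * S0) / (nk + LAM)

print(np.round(pi, 3), np.round(mu, 3), np.round(var, 3))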
--------------------------------------------
Dirk Ormoneit
Department of Statistics, Room 206
Stanford University
Stanford, CA 94305-4065
ph.: (650) 725-6148
fax: (650) 725-8977
ormoneit at stat.stanford.edu
http://www-stat.stanford.edu/~ormoneit/