Connectionists: differential log-likelihood

Marc Van Hulle marc.vanhulle at med.kuleuven.ac.be
Tue Aug 2 07:34:02 EDT 2005


Dear Connectionists,

Two papers on a new, unbiased information metric, called the
differential log-likelihood, are available. Standard information
criteria such as Akaike's AIC, the BIC, and the GIC start from the
log-likelihood but do not yield an unbiased metric, so their optimum
must be located as a minimum over candidate models. The differential
log-likelihood is optimal when it reaches zero, and a zero-crossing is
much easier to locate than a minimum.
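The claim that a zero-crossing is easier to locate than a minimum can
be illustrated with toy criteria over candidate model sizes (a hedged
sketch; the functions and criteria below are illustrative only, not
the metric defined in the papers):

```python
import numpy as np

def locate_minimum(ks, crit):
    # A minimum gives no local stopping signal: every candidate must
    # be evaluated, and a shallow minimum is hard to pin down.
    return ks[int(np.argmin(crit))]

def locate_zero_crossing(ks, crit):
    # A zero-crossing is bracketed by a single sign change, which
    # bisection could then refine; return the bracketing pair.
    sign_change = np.where(np.diff(np.sign(crit)) != 0)[0]
    if sign_change.size == 0:
        return None
    i = int(sign_change[0])
    return ks[i], ks[i + 1]

# Toy criteria over candidate model sizes k = 1..10 (illustrative).
ks = np.arange(1, 11)
biased = (ks - 4) ** 2    # biased criterion: minimum at k = 4
unbiased = ks - 4.5       # unbiased criterion: crosses zero at k = 4..5
```

Here `locate_minimum(ks, biased)` gives 4, while
`locate_zero_crossing(ks, unbiased)` brackets the optimum between 4
and 5 from a single sign change.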

Van Hulle, M.M. (2005). Differential log-likelihood for evaluating and
learning Gaussian mixtures. Neural Computation, in press.

Abstract
We introduce a new unbiased metric for assessing the quality of density
estimation based on Gaussian mixtures, called differential
log-likelihood. As an application, we determine the optimal smoothness
and the optimal number of kernels in Gaussian mixtures. Furthermore, we
suggest a learning strategy for Gaussian mixture density estimation,
and compare its performance with log-likelihood maximization for a wide
range of real-world data sets.
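For the log-likelihood maximization baseline that the abstract compares
against, a minimal one-dimensional EM sketch may help fix ideas (this is
generic textbook EM, not the paper's differential-log-likelihood
learning strategy; all names are illustrative):

```python
import numpy as np

def em_gmm_1d(x, k, n_iter=100):
    """Fit a k-kernel 1-D Gaussian mixture by EM, i.e. by
    log-likelihood maximization (a generic sketch, not the
    learning strategy proposed in the paper)."""
    n = x.size
    # Deterministic init: means at data quantiles, shared variance.
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each kernel per sample.
        d2 = (x[:, None] - mu[None, :]) ** 2
        log_p = -0.5 * (np.log(2 * np.pi * var) + d2 / var) + np.log(w)
        log_norm = np.logaddexp.reduce(log_p, axis=1, keepdims=True)
        r = np.exp(log_p - log_norm)
        # M-step: responsibility-weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu[None, :]) ** 2).sum(axis=0) / nk + 1e-9
    # Log-likelihood from the last E-step.
    return w, mu, var, float(log_norm.sum())
```

On well-separated two-component data, the fitted means recover the
component locations; model selection then amounts to scoring such fits
across different kernel counts k.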

http://134.58.34.60/~marc/NC11.pdf

--

Van Hulle, M.M. (2005). Mixture density modeling, Kullback-Leibler
divergence, and differential log-likelihood. Signal Processing (Special
Issue on Information Theoretic Signal Processing), J. Principe & D.
Erdogmus (Eds.), 85(5), 951-963.

http://134.58.34.60/~marc/SP.pdf

--

Marc M. Van Hulle
K.U.Leuven
Laboratorium voor Neurofysiologie
Faculteit Geneeskunde (Medical School)
Campus Gasthuisberg
Bus 801
Herestraat 49
B-3000 Leuven
Belgium
Phone: + 32 16 345961
Fax:   + 32 16 345960
E-mail: marc.vanhulle at med.kuleuven.ac.be
URL: http://simone.neuro.kuleuven.ac.be
