Learning with Probabilistic Representations (Machine Learning Special Issue)

Tom Dietterich tgd at CS.ORST.EDU
Tue Dec 30 20:09:13 EST 1997


Machine Learning has published the following special issue on 

LEARNING WITH PROBABILISTIC REPRESENTATIONS 
Guest Editors: Pat Langley, Gregory M. Provan, and Padhraic Smyth

Learning with Probabilistic Representations
        Pat Langley, Gregory Provan, and Padhraic Smyth

On the Optimality of the Simple Bayesian Classifier under Zero-One Loss
        Pedro Domingos and Michael Pazzani

Bayesian Network Classifiers
        Nir Friedman, Dan Geiger, and Moises Goldszmidt

The Sample Complexity of Learning Fixed-Structure Bayesian Networks
        Sanjoy Dasgupta

Efficient Approximations for the Marginal Likelihood of Bayesian
Networks with Hidden Variables
        David Maxwell Chickering and David Heckerman

Adaptive Probabilistic Networks with Hidden Variables
        John Binder, Daphne Koller, Stuart Russell, and Keiji Kanazawa

Factorial Hidden Markov Models
        Zoubin Ghahramani and Michael I. Jordan

Predicting Protein Secondary Structure using Stochastic Tree Grammars
        Naoki Abe and Hiroshi Mamitsuka

For more information, see http://www.cs.orst.edu/~tgd/mlj

--Tom