Paper on "A new view of the EM algorithm"

Radford Neal radford at cs.toronto.edu
Wed Feb 17 15:15:58 EST 1993


The following paper has been placed in the neuroprose archive as the
file 'neal.em.ps.Z':


             A NEW VIEW OF THE EM ALGORITHM THAT JUSTIFIES 
                    INCREMENTAL AND OTHER VARIANTS
  
                Radford M. Neal and Geoffrey E. Hinton

                    Department of Computer Science 
                         University of Toronto 

  We present a new view of the EM algorithm for maximum likelihood
  estimation in situations with unobserved variables.  In this view,
  both the E and the M steps of the algorithm are seen as maximizing a
  joint function of the model parameters and of the distribution over
  unobserved variables.  From this perspective, it is easy to justify an
  incremental variant of the algorithm in which the distribution for
  only one of the unobserved variables is recalculated in each E step.
  This variant is shown empirically to give faster convergence in a
  mixture estimation problem.  A wide range of other variant algorithms
  are also seen to be possible.
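
For readers who want a concrete picture of the "joint function"
mentioned in the abstract, here is a rough sketch (notation mine, not
quoted from the paper).  With observed data y, unobserved variables z,
and parameters theta, both steps can be regarded as coordinate ascent
on a single function of theta and a distribution q over z,

    F(q, theta) = E_q[ log P(y, z | theta) ] + H(q)

where H(q) is the entropy of q.

    E step:  maximize F over q, holding theta fixed
             (attained by q(z) = P(z | y, theta))
    M step:  maximize F over theta, holding q fixed
             (i.e. maximize E_q[ log P(y, z | theta) ])

    Incremental variant:  for independent cases with
    q(z) = prod_i q_i(z_i), an E step may recompute just one
    factor q_i; F still never decreases, so the procedure
    remains monotonic.

See the paper itself for the precise statements and conditions.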


The PostScript for this paper may be retrieved in the usual fashion:

  unix> ftp archive.cis.ohio-state.edu
  (log in as user 'anonymous', e-mail address as password)
  ftp> binary
  ftp> cd pub/neuroprose
  ftp> get neal.em.ps.Z
  ftp> quit
  unix> uncompress neal.em.ps.Z
  unix> lpr neal.em.ps (or however you print PostScript files)

Many thanks to Jordan Pollack for providing this service!

  Radford Neal


