technical report

Michael Jordan jordan at psyche.mit.edu
Mon Oct 4 15:36:46 EDT 1993


The following paper is now available on the neuroprose
archive as "jordan.convergence.ps.Z".


	   Convergence results for the EM approach to
	         mixtures of experts architectures

		       Michael I. Jordan
			     Lei Xu
	   Department of Brain and Cognitive Sciences
	     Massachusetts Institute of Technology

     The Expectation-Maximization (EM) algorithm is an iterative
  approach to maximum likelihood parameter estimation.  Jordan 
  and Jacobs (1993) recently proposed an EM algorithm for the 
  mixture of experts architecture of Jacobs, Jordan, Nowlan and 
  Hinton (1991) and the hierarchical mixture of experts architecture 
  of Jordan and Jacobs (1992).  They showed empirically that the 
  EM algorithm for these architectures yields significantly faster 
  convergence than gradient ascent.  In the current paper we provide 
  a theoretical analysis of this algorithm.  We show that the algorithm 
  can be regarded as a variable metric algorithm whose search 
  direction has a positive projection onto the gradient of the 
  log likelihood.  We also analyze the convergence of the algorithm 
  and provide an explicit expression for the convergence rate.  
  In addition, we describe an acceleration technique that yields 
  a significant speedup in simulation experiments.
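
  As a concrete illustration of the algorithm under analysis, the
  following is a minimal sketch (not taken from the paper) of one EM
  iteration for a flat mixture of experts, written in Python/NumPy.
  It assumes linear-Gaussian experts with fixed unit variance, a
  softmax gating network, and a single gradient-ascent step on the
  gating parameters in place of the exact inner M-step; these modeling
  choices and all names below are illustrative assumptions.

    # Sketch of one EM iteration for a flat mixture of experts.
    # Assumptions (not from the paper): linear-Gaussian experts with
    # shared unit variance; softmax gate; one gradient step for the
    # gate instead of solving the gating M-step exactly.
    import numpy as np

    def em_step(X, y, W, V, lr=0.1):
        """X: (n, d) inputs; y: (n,) targets;
        W: (m, d) expert weights (expert j predicts X @ W[j]);
        V: (m, d) gating weights (softmax over experts)."""
        # E-step: posterior responsibility of each expert per point.
        logits = X @ V.T
        g = np.exp(logits - logits.max(axis=1, keepdims=True))
        g /= g.sum(axis=1, keepdims=True)        # gating probabilities
        resid = y[:, None] - X @ W.T             # (n, m) expert residuals
        lik = np.exp(-0.5 * resid**2)            # Gaussian likelihood, sigma=1
        h = g * lik
        h /= h.sum(axis=1, keepdims=True) + 1e-12  # responsibilities h[i, j]

        # M-step: each expert solves a weighted least-squares problem.
        W_new = np.empty_like(W)
        for j in range(W.shape[0]):
            Xw = X * h[:, j:j+1]                 # rows weighted by h[:, j]
            W_new[j] = np.linalg.solve(
                Xw.T @ X + 1e-8 * np.eye(X.shape[1]), Xw.T @ y)

        # Gate: one gradient step on the expected complete-data
        # log likelihood (soft targets h, predictions g).
        V_new = V + lr * (h - g).T @ X / X.shape[0]
        return W_new, V_new

    # Example usage: two experts on a toy piecewise-linear problem.
    rng = np.random.default_rng(0)
    X = np.column_stack([rng.uniform(-1, 1, 200), np.ones(200)])
    y = np.where(X[:, 0] > 0, 2 * X[:, 0], -X[:, 0]) \
        + 0.05 * rng.standard_normal(200)
    W, V = 0.1 * rng.standard_normal((2, 2)), np.zeros((2, 2))
    for _ in range(50):
        W, V = em_step(X, y, W, V)

  With a sufficiently small gating step size this is a generalized EM
  (GEM) update, so each iteration does not decrease the likelihood;
  solving the gating M-step exactly recovers the full EM iteration.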
