EM and hierarchies of experts

Michael Jordan <jordan@psyche.mit.edu>
Tue Apr 27 18:19:19 EDT 1993


The following technical report has been placed in the neuroprose
directory, as jordan.hierarchies.ps.Z:



		 Hierarchical mixtures of experts 
		       and the EM algorithm

		        Michael I. Jordan
			       MIT

		 	 Robert A. Jacobs
		     University of Rochester

     We present a tree-structured architecture for supervised learning.
  The statistical model underlying the architecture is a hierarchical 
  mixture model in which both the mixture coefficients and the mixture 
  components are generalized linear models (GLIMs).  Learning is 
  treated as a maximum likelihood problem; in particular, we present an 
  Expectation-Maximization (EM) algorithm for adjusting the parameters 
  of the architecture.  We also develop an on-line learning algorithm 
  in which the parameters are updated incrementally.  Comparative 
  simulation results are presented in the robot dynamics domain.
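
For readers who want a concrete picture of the E- and M-steps, below is a
minimal sketch in Python/NumPy of EM for a one-level (non-hierarchical)
mixture of linear experts with a softmax gate.  It is an illustration under
simplifying assumptions, not the report's algorithm: the hierarchy is
omitted, the gating network is updated with a single gradient step rather
than a full GLIM fit, and all function and variable names are ours.

  import numpy as np

  def softmax(z):
      z = z - z.max(axis=1, keepdims=True)   # shift for numerical stability
      e = np.exp(z)
      return e / e.sum(axis=1, keepdims=True)

  def em_mixture_of_experts(X, y, n_experts=2, n_iter=50, lr=0.5, seed=0):
      # X: (n, d) inputs with a bias column appended; y: (n,) targets.
      rng = np.random.default_rng(seed)
      n, d = X.shape
      W = rng.normal(scale=0.1, size=(n_experts, d))  # expert regression weights
      V = rng.normal(scale=0.1, size=(n_experts, d))  # gating-network weights
      sigma2 = np.ones(n_experts)                     # per-expert noise variances
      for _ in range(n_iter):
          # E-step: posterior probability h[i, j] that expert j produced y[i]
          g = softmax(X @ V.T)                        # (n, k) gate outputs
          mu = X @ W.T                                # (n, k) expert predictions
          lik = (np.exp(-0.5 * (y[:, None] - mu)**2 / sigma2)
                 / np.sqrt(2 * np.pi * sigma2))
          h = g * lik
          h /= h.sum(axis=1, keepdims=True)
          # M-step for the experts: responsibility-weighted least squares
          for j in range(n_experts):
              Xw = X * h[:, j:j+1]
              W[j] = np.linalg.solve(Xw.T @ X + 1e-8 * np.eye(d), Xw.T @ y)
              resid = y - X @ W[j]
              sigma2[j] = (h[:, j] * resid**2).sum() / h[:, j].sum()
          # M-step for the gate: one gradient step toward the targets h
          # (a shortcut; the report fits the gating GLIM more exactly)
          V += lr * (h - g).T @ X / n
      return W, V, sigma2

Append a column of ones to the inputs for the bias terms, e.g.
em_mixture_of_experts(np.hstack([X0, np.ones((len(X0), 1))]), y).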


------------------------------------------------
Here is an example of how to retrieve this file:
 
> ftp archive.cis.ohio-state.edu        (or ftp 128.146.8.52)
Connected to archive.cis.ohio-state.edu.
220 archive.cis.ohio-state.edu FTP server ready.
Name: anonymous
331 Guest login ok, send ident as password.
Password: <type your email address here>
230 Guest login ok, access restrictions apply.
ftp> binary
200 Type set to I.
ftp> cd pub/neuroprose
250 CWD command successful.
ftp> get jordan.hierarchies.ps.Z
200 PORT command successful.
150 Opening BINARY mode data connection for jordan.hierarchies.ps.Z
226 Transfer complete.
ftp> quit
221 Goodbye.
> uncompress jordan.hierarchies.ps.Z
> lpr jordan.hierarchies.ps
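
If your print spooler reads from standard input, the last two steps can be
combined so no uncompressed copy is left on disk (assuming the usual BSD
zcat and lpr):

> zcat jordan.hierarchies.ps.Z | lpr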
  


