new tech report

Geoffrey Hinton hinton at ai.toronto.edu
Tue Jan 10 10:09:11 EST 1989


The following report can be obtained by sending an email request to
carol at ai.toronto.edu.  If this fails, try carol%ai.toronto.edu at relay.cs.net.
Please do not send email to me about it (so don't use "reply" or "answer").


"Deterministic Boltzmann Learning Performs Steepest Descent in Weight-space."
				       
			      Geoffrey E. Hinton
			Department of Computer Science
			    University of Toronto
				       
			 Technical report CRG-TR-89-1

				   ABSTRACT

The Boltzmann machine learning procedure has been successfully applied to
deterministic networks of analog units that use a mean-field approximation to
efficiently simulate a truly stochastic system (Peterson and Anderson, 1987).
This type of ``deterministic Boltzmann machine'' (DBM) learns much faster than
the equivalent ``stochastic Boltzmann machine'' (SBM), but since the learning
procedure for DBMs is based only on an analogy with SBMs, there is no
existing proof that it performs gradient descent in any function, and it has
so far been justified only by simulations.  By using the appropriate
interpretation of the way in which a DBM represents the probability of an
output vector given an input vector, it is shown that the DBM performs
steepest descent in the same function as the original SBM, except at rare
discontinuities.  A very simple way of forcing the weights to become
symmetrical is also described, and this makes the DBM more biologically
plausible than back-propagation.
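
For readers who want a concrete sense of the procedure the abstract sketches,
here is a minimal Python sketch of mean-field (deterministic) Boltzmann
learning in its textbook form: settle to a mean-field fixed point with inputs
and outputs clamped, settle again with only the inputs clamped, and change
each weight in proportion to the difference of the corresponding pairwise
activity products.  The function names, network sizes, settling schedule,
learning rate, and temperature below are all illustrative assumptions, not
taken from the report.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def settle(W, state, clamped, T=1.0, n_steps=50):
        # Mean-field settling: each free unit's activity is repeatedly set
        # to p_i = sigma((1/T) * sum_j w_ij p_j), the deterministic
        # approximation to a stochastic binary unit's firing probability.
        for _ in range(n_steps):
            state = np.where(clamped, state, sigmoid((W @ state) / T))
        return state

    def dbm_step(W, x, y, n_hidden, lr=0.1, T=1.0):
        # One deterministic Boltzmann learning step for a single (input x,
        # target y) pair.  State vector layout: [inputs | hidden | outputs].
        n_in, n_out = len(x), len(y)
        n = n_in + n_hidden + n_out
        in_mask = np.zeros(n, bool);  in_mask[:n_in] = True
        out_mask = np.zeros(n, bool); out_mask[n_in + n_hidden:] = True

        # Positive phase: clamp inputs AND outputs, settle the hidden units.
        s_plus = np.full(n, 0.5)
        s_plus[:n_in] = x
        s_plus[n_in + n_hidden:] = y
        s_plus = settle(W, s_plus, in_mask | out_mask, T)

        # Negative phase: clamp inputs only, settle hidden and output units.
        s_minus = np.full(n, 0.5)
        s_minus[:n_in] = x
        s_minus = settle(W, s_minus, in_mask, T)

        # Contrastive update: dw_ij = lr * (p+_i p+_j  -  p-_i p-_j).
        dW = lr * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
        np.fill_diagonal(dW, 0.0)   # no self-connections
        return W + dW

    # Example usage on a hypothetical 2-input, 4-hidden, 1-output net (XOR):
    rng = np.random.default_rng(0)
    n = 2 + 4 + 1
    W = 0.1 * rng.standard_normal((n, n))
    W = 0.5 * (W + W.T); np.fill_diagonal(W, 0.0)   # symmetric, no self-weights
    for x, y in [((0,0),(0,)), ((0,1),(1,)), ((1,0),(1,)), ((1,1),(0,))]:
        W = dbm_step(W, np.asarray(x, float), np.asarray(y, float), n_hidden=4)

Note that the update dw_ij is symmetric in i and j, so a weight matrix that
starts out symmetric stays symmetric; this is closely related to the
abstract's remark about a simple way of forcing the weights to become
symmetrical.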


