batch-mode parallel implementations

Tom English english at sun1.cs.ttu.edu
Mon Oct 21 17:12:09 EDT 1991


With regard to my earlier posting on problems I encountered in applying
Quickprop, Scott Fahlman has replied:

  Note that it is OK to switch from one training set to another when using
  Quickprop, but that every time you change the training set you *must* zero
  out the prev-slopes and delta vectors.

  If you want to get any benefit from quickprop, you have to run each
  distinct training set for at least a few cycles.

  If you were aware of all that (it's unclear from your message)....

Well, I was not aware of what others were doing in practice.  Scott's
original tech report on Quickprop gave results only for the case of
once-per-epoch weight updates.  I apologize for referring to my
implementation, with once-per-batch weight updates and no zeroing
between batches, as "Fahlman's Quickprop."
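
For context: Quickprop keeps, for each weight, the slope and the step
taken at the previous update, and fits a parabola through the two
slope measurements to jump toward its estimated minimum.  Here is a
simplified sketch of that core step in Python (the names and the
fallback step size are my own choices, and I am leaving out some of
the conditions in Scott's full rule):

  import numpy as np

  MU = 1.75       # Fahlman's "maximum growth factor"
  EPSILON = 0.5   # step size for the plain-gradient fallback (assumed)

  def quickprop_step(w, slope, prev_slope, prev_delta):
      # slope is dE/dw measured on the current training set;
      # prev_slope and prev_delta are the slope and weight step from
      # the last update.  The parabola fit below is only meaningful
      # if both slopes were measured on the same error surface, i.e.
      # the same training set.
      delta = np.zeros_like(w)

      # Where history exists, fit a parabola through the two slope
      # measurements and jump to its minimum (a secant step).
      hist = prev_delta != 0.0
      denom = prev_slope - slope
      safe = hist & (denom != 0.0)
      delta[safe] = (slope[safe] / denom[safe]) * prev_delta[safe]

      # Never let a step grow more than MU times the previous step.
      limit = MU * np.abs(prev_delta)
      delta = np.clip(delta, -limit, limit)

      # With no history (e.g. right after zeroing), fall back on an
      # ordinary gradient-descent step.
      delta[~hist] = -EPSILON * slope[~hist]

      return w + delta, slope.copy(), delta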

What I *did* understand was that Quickprop's attempt to approximate
the error surface with a paraboloid was going to be fouled up if the
"pictures" of the error surface gleaned from different batches were
substantially different.  Training for multiple iterations with
one batch, and then resetting the variables used in estimating the
shape of the error surface before going on to the next batch, would
certainly eliminate the problem I described.
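
In code, the discipline Scott describes might look something like the
sketch below, where quickprop_step is the function above and
gradient(w, batch) is a hypothetical helper returning dE/dw on one
batch:

  import numpy as np

  def train(w, batches, epochs_per_batch, gradient):
      for batch in batches:
          # New training set: zero the history so the next parabola
          # fit cannot mix slopes measured on two different surfaces.
          prev_slope = np.zeros_like(w)
          prev_delta = np.zeros_like(w)
          # Run several cycles on this one set so the fit has a
          # chance to pay off before the set changes again.
          for _ in range(epochs_per_batch):
              slope = gradient(w, batch)
              w, prev_slope, prev_delta = quickprop_step(
                  w, slope, prev_slope, prev_delta)
      return w

Note that epochs_per_batch is exactly the sort of free parameter at
issue in what follows.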

The prospect of choosing the number of iterations per batch does not
thrill me, however.  In general, I hate parameter tweaking.  From my
perspective, the worst thing about parameter tweaking is that we
don't really know how it affects the quality of the final network
obtained.  Also, exploring the effects of different parameter settings
takes too much of *my* time.  I want a procedure that does not require
tweaking and that runs at a reasonable fraction of the speed of a
"well-tuned" stochastic gradient descent procedure for a wide range of
problems.  (I haven't experimented with conjugate gradient descent yet,
but it seems to fit my bill.)

--Tom
  english at sun1.cs.ttu.edu


