combining generalizers' guesses

john kolen kolen-j at cis.ohio-state.edu
Tue Jul 27 11:24:38 EDT 1993


>Barak Pearlmutter <bap at learning.siemens.com>
>For instance, if you run backpropagation on the same data twice, with
>the same architecture and all the other parameters held the same, it
>will still typically come up with different answers.  Eg due to
>differences in the random initial weights.

[Blatant self-promotion follows, but I'm looking for a job so I need all the
 promotion I can get ;-]

A very vivid example of this can be found in

John F. Kolen and Jordan B. Pollack, 1990.  Backpropagation is Sensitive to
Initial Conditions.  _Complex Systems_, 4:3, pp. 269-280.  Available from
neuroprose: kolen.bpsic.*

or

John F. Kolen and Jordan B. Pollack, 1991.  Backpropagation is Sensitive to
Initial Conditions.  NIPS 3, pp. 860-867.



>  Averaging out this effect is a guaranteed win.
>					--Barak.

This statement seems to rely on an underlying assumption of convexity, which
does not necessarily hold when you try to combine different processing
strategies (i.e., networks).  If you massage your output representations so
that linear combinations always give you reasonable results, that's great.
But you don't always have the leeway to make such a representational
commitment.  A small sketch of what I mean follows below.
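
To make the convexity worry concrete, here is a short Python sketch.  It is
my own illustration (not anything from Barak's post or from our papers): the
networks, encodings, and numbers are made up, just to show that averaging is
only safe when the set of valid outputs is closed under convex combination.

    # Illustration only: averaging guesses is a guaranteed win only when
    # every convex combination of valid outputs is itself a valid output.

    import math

    def decode_angle(vec):
        """Decode a (cos, sin) output pair back to an angle in degrees."""
        return math.degrees(math.atan2(vec[1], vec[0])) % 360.0

    # Two hypothetical networks that encode a heading as a point on the
    # unit circle -- a non-convex set of valid outputs.
    net_a = (0.0,  1.0)   # a valid guess: 90 degrees
    net_b = (0.0, -1.0)   # another valid guess: 270 degrees

    # Their component-wise average is the origin, which is not on the
    # circle; the decoded heading (0 degrees here) is an artifact, not a
    # compromise between the two guesses.
    avg = tuple((a + b) / 2.0 for a, b in zip(net_a, net_b))
    print(decode_angle(net_a), decode_angle(net_b), decode_angle(avg))

    # Contrast: class-probability vectors form a convex set, so their
    # average is always another valid probability vector, and averaging
    # over networks is safe there.
    p_a, p_b = [0.9, 0.1], [0.2, 0.8]
    p_avg = [(x + y) / 2.0 for x, y in zip(p_a, p_b)]
    print(p_avg, sum(p_avg))   # still sums to 1.0

With the probability representation you get Barak's guaranteed win; with the
circular one, the averaged output isn't even a legal answer.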

John

