reply on back-propagation fails to separate paper
Richard Lippmann
rpl at ll-sst.arpa
Fri Oct 28 09:42:23 EDT 1988
Geoff,
We came to the same conclusion a while ago, when
some people were worried about the performance of back propagation,
but never published it. Back propagation with limits seems to converge
correctly in those contrived deterministic cases where minimizing
the total squared error does not maximize the percentage of patterns
classified correctly. The limits change the algorithm from
an LMS mean-square-minimizing approach to a perceptron-like
error-corrective approach. Typically, however, the difference in the
percentage of patterns classified correctly between the local and global
solutions in those cases tends to be small.
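The distinction above can be sketched in a few lines. This is only an illustrative sketch, not the actual rule from the letter: the function names, the +/-1 target convention, and the margin parameter are all assumptions. The idea is that plain LMS uses the raw error for every pattern, while a "limited" rule zeroes the error once an output is already on the correct side of its target, so only misclassified patterns drive weight updates.

```python
def lms_error(target, output):
    """Plain LMS error term: every pattern contributes to the weight
    update, even when the output already lies beyond its target."""
    return target - output

def limited_error(target, output, margin=0.0):
    """Hypothetical error-with-limits: clamp the error to zero once the
    output has crossed its target (targets assumed to be +1 or -1),
    giving a perceptron-like error-corrective update."""
    if target > 0 and output >= target - margin:
        return 0.0
    if target < 0 and output <= target + margin:
        return 0.0
    return target - output

# A correctly classified pattern: LMS still pushes the weights toward
# the exact target value, while the limited rule stops updating.
print(lms_error(1.0, 1.5))      # -0.5: LMS keeps adjusting
print(limited_error(1.0, 1.5))  # 0.0: limits stop the update
```

With the limited rule, a pattern whose output overshoots its target no longer pulls the weights back, which is why total squared error is no longer what is being minimized.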
In practice, we found that convergence with limits took
rather a long time for the one contrived case we tested.
I have never seen this published, and it would be good to see
your result published with a convergence proof. I have also seen
little published on the effect of limits on classifier performance
or on final weight values.
Rich