SSAB questions

INS_ATGE%JHUVMS.BITNET at VMA.CC.CMU.EDU
Wed Jan 9 13:40:00 EST 1991


I am interested in using the Super Self-Adapting Back Propagation
(SuperSAB) algorithm recently published in _Neural_Networks_
(T. Tollenaere).  The algorithm as published seems a bit
ambiguous to me.  In step four, it says "undo the previous weight update
(which caused the change in the gradient sign).  This can be done by using
[delta weight(i,j,n+1)] = -[delta weight(i,j,n)], instead of calculating
the weight-update..."

Does this mean undo the previous update of _all_ network weights,
or just the updates of the particular weights whose gradients changed sign?
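To make the ambiguity concrete, here is a rough Python sketch of the
reading I currently favor, where only the offending weights are rolled
back.  The constants and names here (ETA_UP, ETA_DOWN, the momentum
constant ALPHA) are my own illustration, not values from the paper:

    import numpy as np

    # One SuperSAB update under the "per-weight" reading of step 4.
    # Every weight keeps its own learning rate eta[i,j].  ETA_UP,
    # ETA_DOWN, and ALPHA are illustrative, not the paper's values.
    ETA_UP, ETA_DOWN, ALPHA = 1.05, 0.5, 0.9

    def supersab_step(w, eta, grad, prev_grad, prev_dw):
        # All arguments are arrays of the same shape as the weights.
        flipped = grad * prev_grad < 0.0   # per-weight sign-change test
        # Grow the rate where the gradient sign is stable, shrink it
        # where it flipped.
        eta = np.where(flipped, eta * ETA_DOWN, eta * ETA_UP)
        # Step 4: undo the previous update, but (on this reading) only
        # for the weights whose gradient sign changed:
        #   delta_w(n+1) = -delta_w(n)
        dw = np.where(flipped, -prev_dw, -eta * grad + ALPHA * prev_dw)
        return w + dw, eta, dw

Under the other reading, the whole weight matrix would be rolled back
whenever any single gradient component flips, which seems odd given
that SuperSAB maintains a separate learning rate for every weight.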

Anyway, I am going to try to use SuperSAB to speed up a time-delay neural
net (TDNN) of the sort used in Lang, Waibel, and Hinton (_Neural_Networks_
3, p. 23) for analysis of multiple-band infrared temporal intensity data.

While I am on the subject, has anyone done a comparison between
Quickprop and SuperSAB, or used SuperSAB with Cascade-Correlation or
time-delay neural nets?

-Thomas Edwards
