SSAB questions

Luis Borges de Almeida inesc!lba at relay.EU.net
Thu Jan 10 15:58:50 EST 1991


At the last NIPS conference, Richard Rohwer presented a comparison of a
number of acceleration techniques on various problems. Among these
techniques is one which is quite similar to SuperSAB. This technique was
developed by a colleague and myself, independently of Tollenaere's work
(see the references below; reprints can be sent to anyone interested). I
don't recall seeing tests on Quickprop, but Richard had tests on Le Cun's
diagonal second-order method, which I believe to be similar to, and
perhaps a bit faster than, Quickprop.

Richard's data showed better results for Le Cun's method than for ours
on many problems, but we found out, while talking to Richard, that he
had missed a (probably important) step of the algorithm. I think he may
have gone to the trouble of redoing the tests; you might want to
contact him directly.

In short, the main difference between our algorithm and Tollenaere's is
that we only undo a weight update if it has caused an increase in the
objective function (the quadratic error accumulated over all outputs
and all training patterns).
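To make the idea concrete, here is a minimal sketch of a per-weight
adaptive-step gradient descent with that extra "undo" step. The function
names, the concrete grow/shrink factors, and the initial step size are
illustrative assumptions on my part, not the parameters published in the
papers below; only the rollback-on-error-increase step is the point being
illustrated.

```python
import numpy as np

def adaptive_gd_with_undo(grad, loss, w0, steps=100,
                          eta0=0.1, up=1.2, down=0.5):
    """Per-weight adaptive step sizes, in the spirit of the method
    described above (constants are illustrative, not the published ones).

    - Each weight keeps its own step size eta[i].
    - eta[i] grows while the gradient component keeps its sign,
      and shrinks when the sign flips.
    - The key extra step: if an update increased the objective,
      it is undone and the step sizes are shrunk.
    """
    w = np.asarray(w0, dtype=float).copy()
    eta = np.full_like(w, eta0)
    g_prev = np.zeros_like(w)
    best = loss(w)
    for _ in range(steps):
        g = grad(w)
        eta[g * g_prev > 0] *= up    # sign stable: accelerate
        eta[g * g_prev < 0] *= down  # sign flipped: decelerate
        w_new = w - eta * g
        l_new = loss(w_new)
        if l_new > best:
            # undo the update: keep the old weights, shrink the steps
            eta *= down
        else:
            w, best, g_prev = w_new, l_new, g
    return w, best

# Toy usage on a quadratic bowl: loss = sum(w**2), gradient = 2*w.
w_star, l_star = adaptive_gd_with_udo = adaptive_gd_with_undo(
    lambda w: 2.0 * w, lambda w: float(np.sum(w ** 2)), [3.0, -4.0])
```

On this toy quadratic the step sizes grow while progress is steady, and
any overshoot that raises the error is simply rolled back, so the
objective never increases across accepted updates.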

Luis B. Almeida

INESC                             Phone: +351-1-544607
Apartado 10105                    Fax:   +351-1-525843
P-1017 Lisboa Codex
Portugal

lba at inesc.inesc.pt
lba at inesc.uucp                    (if you have access to uucp)

---------------------

REFERENCES

F. M. Silva and L. B. Almeida, "Acceleration Techniques for the
Backpropagation Algorithm", in L. B. Almeida and C. J. Wellekens
(eds), Neural Networks, Proc. 1990 EURASIP Workshop, Sesimbra,
Portugal, Feb. 1990, New York: Springer-Verlag (Lecture Notes in Computer
Science series).

F. M. Silva and L. B. Almeida, "Speeding up Backpropagation", in R.
Eckmiller (ed), Advanced Neural Computers, Amsterdam: Elsevier Science
Publishers, 1990.
