The New Training Algorithm for Feedforward Networks
Dr. S. Kak
kak at max.ee.lsu.edu
Tue Mar 16 15:00:04 EST 1993
Drs. Almeida and Fahlman have commented that it may not be fair to
state that backpropagation takes several thousand steps for the XOR
problem (whereas my new algorithm takes only 8 steps). That figure was
not meant to refer to the best BP algorithm for the problem; it was
taken from page 332 of PDP, Vol. 1, and was intended only to
illustrate the differences between the two algorithms.
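
As a concrete reference point, below is a minimal sketch of plain
gradient-descent backpropagation on XOR, roughly in the spirit of the
PDP, Vol. 1 setup cited above. The 2-2-1 architecture, learning rate,
and error criterion are illustrative assumptions only; this is
standard BP, not the new algorithm.

# Minimal sketch: plain backpropagation on XOR with a 2-2-1 network.
# Architecture, learning rate, and error criterion are assumptions
# chosen for illustration; not the new algorithm discussed above.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(scale=1.0, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=1.0, size=(2, 1)); b2 = np.zeros(1)
lr = 0.5                                  # assumed learning rate

for step in range(1, 20001):
    h = sigmoid(X @ W1 + b1)              # forward pass
    out = sigmoid(h @ W2 + b2)
    err = out - y
    mse = float(np.mean(err ** 2))
    if mse < 0.01:                        # assumed error criterion
        print(f"reached criterion after {step} sweeps, mse = {mse:.4f}")
        break
    d_out = err * out * (1 - out)         # backward pass (chain rule)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
else:
    print(f"criterion not reached in 20000 sweeps, mse = {mse:.4f}")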
As others have posted here, the new algorithm appears to give a
speedup of 100 to 1000 for networks with 50 to 100 neurons. Certainly
further tests are called for. Introducing a learning rate into the new
algorithm, and learning with respect to an error criterion, improve
its performance. These modifications will be described in a
forthcoming report.
-Subhash Kak