The New Training Algorithm for Feedforward Networks

Dr. S. Kak kak at max.ee.lsu.edu
Fri Mar 5 12:54:05 EST 1993


Hard copies of the paper mentioned below are now available [until
the supply is exhausted]. -Subhash Kak
-------------------------------------------------------------
Pramana - Journal of Physics, vol. 40, January 1993, pp. 35-42
--------------------------------------------------------------

On Training Feedforward Neural Networks

Subhash C. Kak
Department of Electrical & Computer Engineering
Louisiana State University
Baton Rouge, LA 70803-5901, USA

Abstract:  A new algorithm that maps n-dimensional binary vectors
into m-dimensional binary vectors using 3-layered feedforward neural
networks is described.  The algorithm is based on a representation
of the mapping in terms of the corners of the n-dimensional signal
cube.  The weights to the hidden layer are found by a corner
classification algorithm and the weights to the output layer are
all equal to 1.  Two corner classification algorithms are described.
The first is based on the perceptron algorithm and performs
generalization.  The computing power of this algorithm may be gauged
from the fact that the exclusive-OR problem, which requires several
thousand iterative steps with the backpropagation algorithm, was
solved in 8 steps.  For problems with 30 to 100 neurons we have found
speedups ranging from 100-fold to more than a thousandfold.  The
second corner classification algorithm presented in this paper
requires no computation to find the weights; in its basic form,
however, it does not perform generalization.  The new algorithm
described in this paper is guaranteed to find the solution to any
mapping problem.  Its effectiveness is due to the structured nature
of the final design, and it can also be applied to analog data.  The
new algorithm is computationally far more efficient than the
backpropagation algorithm.  It is also biologically plausible, since
the computations required to train the network are extremely simple.
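
To make the corner-classification idea concrete, here is a small
sketch (not from the paper) that builds a 3-layered network for the
exclusive-OR mapping: one hidden neuron is dedicated to each input
corner whose desired output is 1, and all output-layer weights are
set to 1, as stated in the abstract.  The particular hidden-layer
weights and thresholds used here (+1 for an input bit of 1, -1 for a
bit of 0, threshold equal to the number of 1-bits in the corner) are
an illustrative assumption about how a single corner can be isolated,
not necessarily the weight assignment of either algorithm in the
paper.

# Illustrative corner-classification style network (an assumption,
# not the paper's exact construction).  Each hidden neuron is
# assigned to one input corner whose desired output is 1 and fires
# only for that corner; the output neuron sums the hidden outputs
# with weights equal to 1.

def make_corner_network(positive_corners):
    """Build hidden-layer (weights, threshold) pairs for the 1-corners."""
    hidden = []
    for corner in positive_corners:
        # +1 weight for a 1 bit, -1 for a 0 bit: the weighted sum
        # reaches the number of 1 bits only at this exact corner.
        weights = [1 if bit == 1 else -1 for bit in corner]
        threshold = sum(corner)      # neuron fires when sum >= threshold
        hidden.append((weights, threshold))
    return hidden

def evaluate(hidden, x):
    """Feed a binary input vector through both thresholded layers."""
    hidden_out = [
        1 if sum(w * xi for w, xi in zip(weights, x)) >= threshold else 0
        for weights, threshold in hidden
    ]
    # Output layer: all weights equal to 1; fires if any hidden
    # neuron fires.
    return 1 if sum(hidden_out) >= 1 else 0

# Exclusive-OR: the corners (0,1) and (1,0) map to 1.
xor_net = make_corner_network([(0, 1), (1, 0)])
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", evaluate(xor_net, x))   # prints 0, 1, 1, 0

In this sketch each hidden neuron acts as a detector for a single
corner, which also illustrates why such a basic construction does not
generalize: an input corner that was not assigned a hidden neuron
simply produces 0.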



