Preprint: Alopex: A correlation-based learning algorithm
K.P.Unnikrishnan
unni at neuro.cs.gmr.com
Sat Mar 27 01:45:54 EST 1993
The following tech report is now available. For a hard copy, please send
your surface mail address to venu at neuro.cs.gmr.com.
Unnikrishnan
------------------------------------------------------------
Alopex: A Correlation-Based Learning Algorithm for Feed-Forward and
Recurrent Neural Networks
K. P. Unnikrishnan
General Motors Research Laboratories
and
K. P. Venugopal
Florida Atlantic University
We present a learning algorithm for neural networks, called Alopex. Instead of
the error gradient, Alopex uses local correlations between changes in individual
weights and changes in the global error measure. The algorithm does not make
any assumptions about the transfer functions of individual neurons, and does not
explicitly depend on the functional form of the error measure. Hence, it can
be used in networks with arbitrary transfer functions and for minimizing a
large class of error measures. The learning algorithm is the same for
feed-forward and recurrent networks. All the weights in a network are updated
simultaneously, using only local computations. This allows complete
parallelization of the algorithm. The algorithm is stochastic and uses a
`temperature' parameter in a manner similar to that used in simulated annealing.
A heuristic `annealing schedule' is presented which is effective in finding
global minima of error surfaces. In this paper, we report extensive
simulation studies illustrating these advantages and show that learning times
are comparable to those for standard gradient descent methods. Feed-forward
networks trained with Alopex are used to solve the MONK's problems and symmetry
problems. Recurrent networks trained with the same algorithm are used for
solving temporal XOR problems. Scaling properties of the algorithm are
demonstrated using encoder problems of different sizes, and the advantages of
appropriate error measures are illustrated using a variety of problems.
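For readers who want a concrete picture of the correlation-based update described
in the abstract, below is a minimal sketch in Python. It assumes the commonly cited
form of the Alopex rule: each weight takes a fixed step of size delta whose direction
is chosen stochastically, with the probability of reversing the previous move given
by a sigmoid of the correlation between the previous weight change and the previous
error change, scaled by the temperature T. The step size, initial temperature,
annealing interval, and the quadratic error in the usage example are illustrative
assumptions, not values taken from the report.

import numpy as np

def alopex_minimize(error_fn, w, delta=0.01, T0=1.0, anneal_every=10,
                    n_iters=2000, rng=None):
    """Minimize error_fn(w) with an Alopex-style correlation-based rule.

    Sketch of the commonly cited form of the algorithm; hyperparameter
    values here are illustrative, not those of the report.
    """
    rng = np.random.default_rng() if rng is None else rng
    T = T0
    # Take one random step so that previous weight/error changes are defined.
    prev_w = w.copy()
    prev_E = error_fn(w)
    w = w + delta * rng.choice([-1.0, 1.0], size=w.shape)
    E = error_fn(w)
    corr_sum, corr_count = 0.0, 0

    for it in range(n_iters):
        dw = w - prev_w            # previous change of each weight
        dE = E - prev_E            # previous change of the global error
        C = dw * dE                # local correlation for each weight
        # Probability of stepping "downhill" (reversing a move that raised
        # the error): a sigmoid of the correlation divided by temperature.
        p = 1.0 / (1.0 + np.exp(-C / max(T, 1e-12)))
        step = np.where(rng.random(w.shape) < p, -delta, delta)

        prev_w, prev_E = w.copy(), E
        w = w + step
        E = error_fn(w)

        # Heuristic annealing (an assumed schedule): every few iterations,
        # set T from the mean |correlation| observed over that window.
        corr_sum += np.abs(C).mean()
        corr_count += 1
        if (it + 1) % anneal_every == 0:
            T = corr_sum / corr_count
            corr_sum, corr_count = 0.0, 0
    return w, E

# Usage: fit a tiny linear model to noisy data, treating the squared error
# as a black box -- no gradients are ever computed.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.01 * rng.normal(size=100)
err = lambda w: float(np.mean((X @ w - y) ** 2))
w_fit, final_err = alopex_minimize(err, np.zeros(3), rng=rng)
print(w_fit, final_err)

Because the error function is queried only as a black box, the same loop applies
unchanged whether the weights belong to a feed-forward or a recurrent network.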