PCA reference

amari@sat.t.u-tokyo.ac.jp
Sat Dec 4 13:43:53 EST 1993


    I have copied Dr. Sanger's very useful bibliography on PCA.  I would like to add one "prehistoric" reference.  In the paper

     S. Amari, "Neural Theory of Association and Concept Formation," Biological
        Cybernetics, vol. 26, pp. 175-185, 1977,

I discussed general aspects of neural learning rules of the form

     (d/dt)w = -cw + c'rx,

where w is the synaptic weight vector, c and c' are constants, x is the input
vector, and r is the "learning signal" depending on w, x, and an extra signal.
I proved the existence of a potential or Lyapunov function for various types
of neural learning.  The case r = w.x was also remarked on.  On p. 179, one can
find the following statement.
"If the connection weight is subject to the subsidiary condition w.w = const,
so that w(t) is normalized after each step of learning, we can prove that w(t)
converges to the minimum of L(w) under the subsidiary condition.  It is the
direction of the eigenvector of the matrix <xx'> corresponding to the maximum
eigenvalue."  Here L(w) is the Lyapunov function which is a special case of
more general one I proposed, and <xx'> is the covariance matrix (second order
moment matrix) of the input signals.
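
    For a modern reader, the following small numerical sketch (mine, not from
the 1977 paper; the constants, data, and step size are illustrative
assumptions) discretizes the rule (d/dt)w = -cw + c'rx with r = w.x and
renormalizes w after each step, showing w aligning with the eigenvector of
<xx'> that has the largest eigenvalue.

    # Illustrative sketch: Euler discretization of dw/dt = -c*w + c'*r*x
    # with learning signal r = w.x, renormalizing w after each step.
    import numpy as np

    rng = np.random.default_rng(0)

    # Inputs with an anisotropic covariance, so <xx'> has a clear dominant
    # eigenvector (the mixing matrix A is an arbitrary choice).
    A = np.array([[3.0, 1.0],
                  [1.0, 1.0]])
    X = rng.standard_normal((5000, 2)) @ A.T      # rows are input vectors x

    c, c_prime, dt = 1.0, 1.0, 0.01               # illustrative constants
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)                        # subsidiary condition w.w = const

    for x in X:
        r = w @ x                                 # learning signal r = w.x
        w += dt * (-c * w + c_prime * r * x)      # one Euler step of the rule
        w /= np.linalg.norm(w)                    # renormalize after each step

    # Compare with the top eigenvector of the second-order moment matrix <xx'>.
    C = X.T @ X / len(X)
    eigvals, eigvecs = np.linalg.eigh(C)
    v_max = eigvecs[:, np.argmax(eigvals)]

    print("learned w:            ", w)
    print("top eigenvector of C: ", v_max)
    print("|cos angle|:          ", abs(w @ v_max))   # near 1 when aligned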

    This was only a few lines of description, and the main theme of the
paper was not neural PCA, so it is only prehistory.  But I thought someone
might be interested in a prehistoric anecdote.
