PCA algorithms, continued.

Terence D. Sanger <tds@ai.mit.edu>
Tue Dec 7 18:40:47 EST 1993


In response to my previous message, many people have sent me new references
to PCA algorithms, and these have been included in the BibTeX database
pca.bib.  (Also note Wang's more extensive pclist.tex file announced
recently on this net.)  

Erkki Oja has been kind enough to forward copies of some of his
recent papers on the "Weighted Subspace Algorithm" and "Nonlinear PCA".
Looking at these carefully, I think both algorithms are closely related to
Brockett's algorithm, and probably work for the same reason.  I have
created another short derivation "oja.tex" which is available along with
the updated pca.bib by anonymous ftp from ftp.ai.mit.edu in the directory
pub/sanger-papers.   
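
For reference, Oja's original single-unit rule is

    \Delta w = \gamma \, y (x - y w),    where  y = w^T x,

whose weight vector converges to the principal eigenvector of the input
covariance.  Roughly, and in my notation rather than Oja's (oja.tex has
the precise statements), the weighted subspace algorithm trains k such
units with distinct positive coefficients \theta_1 < \theta_2 < ... <
\theta_k:

    \Delta w_i = \gamma \, y_i (x - \theta_i \sum_j y_j w_j).

The unequal \theta_i break the rotational symmetry of the plain subspace
rule, so the w_i converge to individual eigenvectors rather than to an
arbitrary basis for the principal subspace.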

One could invoke a kind of transitivity argument: since Oja's algorithms
are related to Brockett's, Brockett's is related to GHA, and GHA performs
deflation, Oja's algorithms must also perform deflation.  This would imply
that Oja's algorithms also satisfy the hypothesis:

"All algorithms for PCA which are based on a Hebbian learning rule must
use sequential deflation to extract components beyond the first." 

But I must admit that the connection is becoming somewhat tenuous.  The
hypothesis is probably better read as a loose description of the motivation
behind the computational mechanism than as a literal description of any one
algorithm.  Even so, I still feel it is important to recognize the close
relationship among the many algorithms that use Hebbian learning to find
exact eigenvectors.
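
To make the deflation idea concrete, here is a minimal numerical sketch of
the GHA update (numpy notation; the learning rate and data are illustrative
only):

import numpy as np

def gha_step(W, x, lr=0.01):
    # W is k x n (one weight row per unit); x is a zero-mean input in R^n.
    y = W @ x                                  # unit outputs y_i = w_i . x
    # The lower-triangular term is the sequential deflation: unit i learns
    # as if the components along units 1..i-1 were already subtracted from
    # the input, so it is driven toward the i-th eigenvector.
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Example: rows of W approach the leading eigenvectors, in order.
rng = np.random.default_rng(0)
X = rng.standard_normal((10000, 5)) @ rng.standard_normal((5, 5))
W = 0.1 * rng.standard_normal((3, 5))
for x in X:
    W = gha_step(W, x, lr=0.005)

Dropping the np.tril() gives Oja's symmetric subspace rule, which finds
only the principal subspace, not the individual eigenvectors.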

As always, comments/suggestions/counterexamples/references are welcomed!

				Terry Sanger


Instructions for retrieving the LaTeX documents:

ftp ftp.ai.mit.edu
login: anonymous
password: your-net-address
cd pub/sanger-papers
get pca.bib
get oja.tex
quit
latex oja
lpr oja.dvi


