Principal Components algorithms

Luis B. Almeida lba at ilusion.inesc.pt
Fri Nov 19 05:29:39 EST 1993


Dear Terence, dear Connectionists,

Terence writes:

> I propose the following hypothesis:
>   "All algorithms for PCA which are based on a Hebbian learning rule must
> use sequential deflation to extract components beyond the first." 

I agree with that hypothesis as far as most PCA algorithms are
concerned, but I am not sure it holds for all of them. One possible
exception is the "weighted subspace" algorithm of Oja et al. (see
ref. below). The simplest way I have found to interpret this algorithm
is as a weighted combination of Williams' error-correction learning
(equivalently, the plain subspace algorithm) and Oja's original
Hebbian rule. If one takes into account the relative weights of the
two, which differ from one unit to another, it is rather easy to see
why the algorithm should extract the individual principal
components. I can give more detail on this interpretation, if people
find it useful.
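As a sketch of this reading, here is a minimal NumPy simulation of a weighted-subspace-style update. The particular data, learning rate, and theta values are my own illustrative assumptions, not taken from Oja et al.'s paper or from the interpretation above; the comment in the loop shows the per-unit mixture of an error-correction term and a plain Hebbian term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data whose principal axes are the coordinate axes
# (illustrative setup, not from the original post).
n, d, m = 2000, 5, 2
scales = np.array([3.0, 2.0, 1.0, 0.5, 0.3])
X = rng.standard_normal((n, d)) * scales

# Weighted subspace update for unit i with output y_i = w_i . x:
#   dw_i = eta * y_i * (x - theta_i * sum_j y_j w_j)
# Rewriting the parenthesized term as
#   theta_i * (x - sum_j y_j w_j) + (1 - theta_i) * x
# exhibits it as a per-unit mixture of the error-correction (subspace)
# rule and a plain Hebbian term, with different relative weights theta_i
# per unit -- roughly the combination described above.
theta = np.array([1.0, 3.0])   # distinct weights break the subspace symmetry
eta = 0.002
W = 0.1 * rng.standard_normal((d, m))

for epoch in range(60):
    for x in X:
        y = W.T @ x                  # unit outputs
        r = W @ y                    # reconstruction, shared across units
        W += eta * (np.outer(x, y) - np.outer(r, theta * y))

# Each column should now point along one principal axis; the unit with
# the smaller theta captures the direction of larger variance.
w1 = W[:, 0] / np.linalg.norm(W[:, 0])
w2 = W[:, 1] / np.linalg.norm(W[:, 1])
```

With distinct theta values the rotational symmetry of the plain subspace rule is broken, so the weight vectors converge to individual eigenvectors rather than to an arbitrary basis of the principal subspace.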
