No subject

dhw@santafe.edu
Fri Mar 12 17:23:38 EST 1993


Drs. Kak and Almeida discuss training issues concerning the
XOR problem.

One should be careful not to focus too heavily on the XOR problem.
Two points that I believe have been made previously on Connectionists
bear repeating:


Consider the n-dimensional version of XOR, namely n-bit parity.

1) All "local" algorithms (e.g., weighted nearest neighbor) perform
badly on parity. More precisely, as the number of training examples
goes up, their off-training set generalization *degrades*, asymptoting
at 100% errors. This is also true for backprop run on neural nets, at
least for n = 6.

2) There are algorithms that perform *perfectly* on the parity
problem (zero errors, both on and off the training set), and those
algorithms are not designed in any way with parity in mind.

In other words, despite all the problems it causes for local
algorithms, in some senses parity is not "difficult".
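
To make point 1 concrete, here is a minimal Python sketch (mine, not
part of the original post): 1-nearest-neighbor classification under
Hamming distance on n-bit parity, with n = 6. The function names and
the random sampling scheme are my own choices for the illustration;
the point is simply that off-training-set error climbs toward 100% as
the training set grows, because the nearest stored example to an
unseen input tends to lie at Hamming distance 1 and therefore has the
opposite parity.

import itertools, random

def parity(bits):
    # Label: parity of the bit string.
    return sum(bits) % 2

def hamming(a, b):
    # Hamming distance between two bit tuples.
    return sum(x != y for x, y in zip(a, b))

def off_training_set_error(n, train_size, trials=20, seed=0):
    # Average off-training-set error of 1-nearest-neighbor
    # (Hamming metric) on n-bit parity, over random training sets.
    rng = random.Random(seed)
    inputs = list(itertools.product((0, 1), repeat=n))
    errors = []
    for _ in range(trials):
        train = rng.sample(inputs, train_size)
        train_set = set(train)
        test = [x for x in inputs if x not in train_set]
        wrong = 0
        for x in test:
            nearest = min(train, key=lambda t: hamming(t, x))
            if parity(nearest) != parity(x):
                wrong += 1
        errors.append(wrong / len(test))
    return sum(errors) / trials

if __name__ == "__main__":
    n = 6
    for m in (8, 16, 32, 48, 56):
        err = off_training_set_error(n, m)
        print("train size %2d: off-training-set error %.2f" % (m, err))

Running this, the error on unseen inputs rises with the training-set
size rather than falling, which is the degradation described above.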



David Wolpert


