needed: complexity analyses of NN & evolutionary learning systems

alexis%yummy@gateway.mitre.org
Wed Jul 20 08:43:55 EDT 1988


I'm not entirely sure I understand what you mean by:
>  ... generalized delta rule (aka backprop) constrains one to linearly
>  independent association tasks.
but I don't think it's correct.

If you mean linearly separable problems (a la Minsky & Papert), or
that the input vectors have to be orthogonal, that is *definitely*
not true (see R. Lippmann, "An Introduction to Computing with Neural
Nets," IEEE ASSP Magazine, April 1987; D. Burr, "Experiments on Neural
Net Recognition of Spoken and Written Text," IEEE Trans. ASSP, vol. 36,
no. 7, July 1988; or A. Wieland & R. Leighton, "Geometric Analysis of
Neural Network Capabilities," ICNN 1988).
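
The plain XOR problem is the standard counterexample: its classes are
not linearly separable and its input vectors are not orthogonal, yet a
small net trained with the generalized delta rule learns it. Here is a
minimal sketch in present-day Python/NumPy (my own construction, not
code from any of the papers above; the hidden-layer size, learning
rate, and epoch count are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR: four 2-d inputs, one binary target each. The two classes
    # cannot be split by any single line through the plane.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    t = np.array([[0.], [1.], [1.], [0.]])

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer of 4 sigmoid units, one sigmoid output unit.
    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

    lr = 0.5
    for _ in range(20000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)     # hidden activations, shape (4, 4)
        y = sigmoid(h @ W2 + b2)     # outputs, shape (4, 1)

        # Backward pass: the generalized delta rule for sigmoid units.
        delta2 = (y - t) * y * (1 - y)            # output-layer deltas
        delta1 = (delta2 @ W2.T) * h * (1 - h)    # hidden-layer deltas

        W2 -= lr * (h.T @ delta2); b2 -= lr * delta2.sum(axis=0)
        W1 -= lr * (X.T @ delta1); b1 -= lr * delta1.sum(axis=0)

    print(np.round(y.ravel(), 3))    # approaches [0, 1, 1, 0]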

By way of empirical demonstration: to test some of our learning
algorithms, we've been using a multi-layer net with 2 inputs
(representing an x and y coordinate) and 1 output (representing the
class) to separate two clusters that spiral around each other ~3 times.
If anything is *NOT* linearly separable, a spiral is it.
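
For anyone who wants to try a test set of this kind, here is a sketch
of one way to generate it (the exact parameterization, point counts,
and noise level below are my guesses, not necessarily the setup
described above):

    import numpy as np

    def two_spirals(n_per_class=200, turns=3.0, noise=0.05, seed=0):
        """Return (points, labels): two interleaved spirals in the plane."""
        rng = np.random.default_rng(seed)
        theta = np.linspace(0.0, turns * 2 * np.pi, n_per_class)
        r = np.linspace(0.05, 1.0, n_per_class)   # radius grows with angle
        x0 = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)
        x1 = -x0                                   # second spiral: rotate by pi
        X = np.concatenate([x0, x1])
        X += rng.normal(scale=noise, size=X.shape)
        y = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)])
        return X, y

    X, y = two_spirals()
    # Feed the (x, y) points to a 2-input, 1-output multi-layer net,
    # with y as the target class, as in the setup described above.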

