Does backprop need the derivative?
Ljubomir Buturovic
ljubomir at darwin.bu.edu
Sat Feb 6 11:17:56 EST 1993
Mr. Heini Withagen says:
> I am working on an analog chip implementing a feedforward
> network, and I am planning to incorporate backpropagation
> learning on the chip. If it were the case that the
> backpropagation algorithm doesn't need the derivative, it
> would simplify the design enormously.
We have trained a multilayer perceptron without derivatives,
using the simplex algorithm for multidimensional optimization
(not to be confused with the simplex algorithm for linear
programming). Our experiments show that it can be done;
however, the number of weights is seriously limited, since
the memory complexity of the simplex method is O(N^2), where
N is the total number of variable weights in the network
(the method maintains a simplex of N + 1 points in
N-dimensional weight space). See the reference below for
further details (it is available as a LaTeX file from
ljubomir at darwin.bu.edu).
Lj. Buturovic and Lj. Citkusev, "Back Propagation and
Forward Propagation," in Proc. Int. Joint Conf. on Neural
Networks, Baltimore, MD, 1992, pp. IV-486 -- IV-491.
Ljubomir Buturovic
Boston University
BioMolecular Engineering Research Center
36 Cummington Street, 3rd Floor
Boston, MA 02215
office: 617-353-7123
home: 617-738-6487