more RBF refs
sussmann@hamilton.rutgers.edu
Fri Sep 13 13:02:11 EDT 1991
> Date: Wed, 11 Sep 91 07:39:09 PDT
> From: Sean Lehman <lehman@pinkpanther.llnl.gov>
>
> I think you are confusing gradient descent, a mathematical method for
> finding a local minimum, with backpropagation, a learning algorithm
> for artificial neural networks.
I don't quite understand the distinction. Backpropagation is of
course "a learning algorithm for artificial neural networks," but it
consists of using gradient descent to look for a local minimum of a
function. (And, yeah, to compute the gradient one uses the chain
rule.) You seem to be saying that, because backpropagation is not
gradient descent in general but gradient descent in a special case,
it is therefore not gradient descent. Or am I missing something?
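To make the point concrete, here is a minimal sketch in Python,
assuming a one-hidden-layer tanh network and a squared-error loss
(the network, target, and learning rate are illustrative choices,
not anything from the exchange above): "backpropagation" here is
nothing but a gradient step on the loss, with the chain rule doing
the work of computing that gradient.

    import math
    import random

    random.seed(0)

    # Tiny network: 2 inputs -> 2 hidden tanh units -> 1 linear output.
    # (No biases, to keep the sketch short.)
    W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    W2 = [random.uniform(-1, 1) for _ in range(2)]

    def forward(x):
        h = [math.tanh(sum(W1[j][i] * x[i] for i in range(2)))
             for j in range(2)]
        y = sum(W2[j] * h[j] for j in range(2))
        return h, y

    def backprop_step(x, t, lr=0.1):
        h, y = forward(x)
        dy = y - t                    # chain rule: dE/dy for E = 0.5*(y-t)^2
        dW2 = [dy * h[j] for j in range(2)]   # dE/dW2[j] = dE/dy * h[j]
        dW1 = [[dy * W2[j] * (1 - h[j] ** 2) * x[i]  # back through tanh
                for i in range(2)] for j in range(2)]
        for j in range(2):            # gradient-descent step on this example
            W2[j] -= lr * dW2[j]
            for i in range(2):
                W1[j][i] -= lr * dW1[j][i]
        return 0.5 * dy ** 2

    # A target the network can represent exactly, so the error can go to zero.
    data = [((x1, x2), math.tanh(x1 + x2) - math.tanh(x1 - x2))
            for x1 in (-1.0, -0.5, 0.5, 1.0)
            for x2 in (-1.0, -0.5, 0.5, 1.0)]
    for epoch in range(2001):
        loss = sum(backprop_step(x, t) for x, t in data)
        if epoch % 500 == 0:
            print(epoch, loss)        # the loss should fall steadily

The chain-rule bookkeeping in backprop_step, applied layer by layer,
is all that "backpropagation" adds to the generic descent recipe.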
---Hector Sussmann