radial basis functions

M. Niranjan niranjan%digsys.engineering.cambridge.ac.uk at NSS.Cs.Ucl.AC.UK
Mon Sep 26 00:03:02 EDT 1988


Re: Hideki KAWAHARA's recent postings on Radial basis functions

Radial basis functions used as pattern classifiers are a kind of kernel
discriminant analysis ("Kernel Discriminant Analysis" by Hand, Research
Studies Press, 1982).

In KDA, a class-conditional probability density function is estimated as
a weighted sum of kernel functions centred on the training examples
(followed by Bayes-type classification); in RBF, the discriminant
function itself is computed as a weighted sum of kernel functions.
In this sense, I think RBF is superior to KDA.
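To make the distinction concrete, here is a minimal sketch (toy data and
all names are hypothetical, not from the original post): KDA estimates a
density per class with kernel sums and then compares them, while RBF fits
the discriminant directly by solving for kernel weights.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy two-class data (purely illustrative)
X0 = rng.normal(-1.0, 0.7, size=(30, 2))
X1 = rng.normal(+1.0, 0.7, size=(30, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(30), np.ones(30)]

def gaussian_kernel(a, b, h=0.5):
    # kernel value between every row of a and every row of b
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h * h))

# KDA: estimate each class-conditional density as a kernel sum,
# then classify by Bayes' rule (equal priors assumed here)
def kda_predict(x):
    p0 = gaussian_kernel(x, X0).mean(axis=1)   # ~ p(x | class 0)
    p1 = gaussian_kernel(x, X1).mean(axis=1)   # ~ p(x | class 1)
    return (p1 > p0).astype(int)

# RBF: fit the discriminant itself as a weighted kernel sum,
# with kernels centred on the training examples
K = gaussian_kernel(X, X)
w = np.linalg.lstsq(K, 2 * y - 1, rcond=None)[0]  # targets of +/-1

def rbf_predict(x):
    return (gaussian_kernel(x, X) @ w > 0).astype(int)
```

Both routes end in a weighted sum of kernels; the difference is whether
the weights come from density estimation or from fitting the boundary.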

An RBF network forms class boundaries from segments of hyper-spheres
(rather than the hyper-planes of a BP-type network).

Something very similar to RBF is the method of potential functions.
It works something like placing weighted electric charges on every
training example; the equi-potential lines then act as class boundaries.
I think the green book by Duda and Hart mentions this, but the original
reference is,
    Aizerman, M.A., Braverman, E.M. \& Rozonoer, L.I. (1964):
    ``On the method of potential functions'';
    Avtomatika i Telemekhanika, {\bf Vol. 26, No. 11}, 2086-2088.
    (This is in Russian, but a cover-to-cover English translation is
     held in most electrical engineering libraries)
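The charge analogy above can be sketched in a few lines (a toy
illustration with made-up data and a made-up smoothed potential, not the
original method's exact form): each class contributes a potential summed
over its examples, and a point is assigned to whichever class's
potential dominates, so the boundary lies on the equi-potential line.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(-1.0, 0.5, size=(20, 2))   # class A examples: unit charges
B = rng.normal(+1.0, 0.5, size=(20, 2))   # class B examples: unit charges

def potential(x, pts):
    # smoothed 1/r-style potential from unit charges placed at pts
    # (bounded at r = 0 so training points themselves are well-defined)
    d2 = ((x[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    return (1.0 / (1.0 + d2)).sum(axis=1)

def classify(x):
    # the decision boundary is the equi-potential line where
    # the two charge sums balance
    return np.where(potential(x, A) > potential(x, B), "A", "B")
```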

Also, if you build a network with one hidden layer of 'spherical graded
units' (Hanson and Burr, "Knowledge representation in connectionist
networks") and a simple perceptron as the output unit (plus some
simplifying assumptions), you can derive the RBF method!!
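A minimal sketch of that construction (the XOR data, the Gaussian radial
response, and the least-squares fit of the output weights are my own
simplifying choices, not taken from Hanson and Burr): each hidden unit
responds to distance from its centre, and a linear output unit on top
yields an RBF network.

```python
import numpy as np

# XOR: not linearly separable for a bare perceptron on the raw inputs
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0, 1, 1, 0])

# hidden layer: one 'spherical graded unit' centred on each input pattern,
# whose response falls off with radial distance from its centre
centres = X.copy()

def hidden(x, h=0.5):
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h * h))

# output: a simple linear unit; fitted here by least squares for brevity
# rather than the perceptron learning rule
H = hidden(X)
w = np.linalg.lstsq(H, 2 * y - 1, rcond=None)[0]  # targets of +/-1

def predict(x):
    return (hidden(x) @ w > 0).astype(int)
```

The hidden layer's radial responses make the problem linearly separable
for the output unit, which is exactly the structure of an RBF classifier.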



niranjan
