Paper available: Adaptive fuzzy min-max estimation
Payman Arabshahi
payman at fermi.jpl.nasa.gov
Mon Sep 29 14:43:30 EDT 1997
The following paper is now available online via:
http://dsp.jpl.nasa.gov/~payman (under "Publications")
or via anonymous ftp:
ftp://dsp.jpl.nasa.gov/pub/payman/tcas9701.ps.gz
(564842 bytes gzip compressed or 2306003 bytes uncompressed)
---
Payman Arabshahi
Jet Propulsion Laboratory Tel: (818) 393-6054
4800 Oak Grove Drive Fax: (818) 393-1717
MS 238-343 Email: payman at jpl.nasa.gov
Pasadena, CA 91109
--------------------------------------------------------------------------
TITLE: Pointer adaptation and pruning of min-max fuzzy inference and
estimation.
AUTHORS: Arabshahi, P., Marks, R. J., Oh, S., Caudell, T. P., and Choi, J. J.
SOURCE: IEEE Transactions on Circuits and Systems II: Analog and
        Digital Signal Processing, vol. 44, no. 9, Sept. 1997,
        pp. 696-709.
ABSTRACT: A new technique for adaptation of fuzzy membership functions in
          a fuzzy inference system is proposed. The pointer technique
          relies upon the isolation of the specific membership functions
          that contributed to the final decision, followed by the
          updating of these functions' parameters using steepest descent.
          The error measure used is thus backpropagated from output to
          input, through the min and max operators used during the
          inference stage. This is possible because the min and max
          operations are continuous, piecewise differentiable functions
          and can therefore be placed in a chain of partial derivatives
          for steepest descent backpropagation adaptation. Interestingly,
          the partials of min and max act as "pointers", with the result
          that only the function that gave rise to the min or max is
          adapted; the others are not. To illustrate, let
          alpha = max[beta(1), beta(2), ..., beta(N)]. Then the partial
          derivative of alpha with respect to beta(n) equals 1 when
          beta(n) is the maximum and is zero otherwise. We apply this
          property to the fine tuning of membership functions of fuzzy
          min-max decision processes and illustrate with an estimation
          example. The adaptation process can also reveal the need for
          reducing the number of membership functions. Under the
          assumption that the inference surface is in some sense smooth,
          adaptation can reveal overdetermination of the fuzzy system in
          two ways. First, if two membership functions come sufficiently
          close to each other, they can be fused into a single membership
          function. Second, if a membership function becomes too narrow,
          it can be deleted. In both cases, the number of fuzzy IF-THEN
          rules is reduced. In certain cases, the overall performance of
          the fuzzy system can be improved by this adaptive pruning.
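
For readers who want to experiment with the pointer property described in
the abstract, the short Python/NumPy sketch below is an illustration only
(it is not code from the paper; the function names, the center/width
parameterization of the membership functions, and the thresholds are
assumptions chosen for the example). It shows how the gradient of max
selects only the maximizing input, and how fusion/deletion checks of the
kind described above might look.

import numpy as np

def max_pointer_gradient(beta):
    """Gradient of alpha = max(beta(1), ..., beta(N)) w.r.t. each beta(n).

    The partial derivative is 1 only for the entry attaining the maximum
    and 0 elsewhere, so it acts as a "pointer": only the membership value
    that produced the max is adapted. The gradient of min is analogous.
    """
    grad = np.zeros_like(beta, dtype=float)
    grad[np.argmax(beta)] = 1.0
    return grad

def should_fuse(mf_a, mf_b, center_tol=0.05):
    """Illustrative check: fuse two membership functions whose centers
    have drifted sufficiently close together during adaptation."""
    return abs(mf_a["center"] - mf_b["center"]) < center_tol

def should_delete(mf, min_width=1e-3):
    """Illustrative check: delete a membership function that has become
    too narrow to contribute to the inference surface."""
    return mf["width"] < min_width

if __name__ == "__main__":
    beta = np.array([0.3, 0.9, 0.5])
    print(beta.max())                   # 0.9
    print(max_pointer_gradient(beta))   # [0. 1. 0.] -- only beta(2) adapts

    print(should_fuse({"center": 0.40, "width": 0.10},
                      {"center": 0.42, "width": 0.10}))    # True
    print(should_delete({"center": 0.70, "width": 1e-4}))  # True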
--------------------------------------------------------------------------