Summary (long): pattern recognition comparisons

Laveen N. KANAL kanal at cs.UMD.EDU
Thu Jul 26 22:51:18 EDT 1990


Hello folks,

I have been into pattern recognition methodologies for a long time and can tell you
that comparison of techniques is a tricky business involving questions of
dimensionality, sample size, and error estimation procedures. For some information
on these matters see, e.g., L. Kanal, "Patterns in Pattern Recognition: 1968-1974,"
IEEE Trans. on Information Theory, vol. IT-20, no. 6, Nov. 1974; the articles in
Krishnaiah and Kanal (eds.), Handbook of Statistics 2: Classification, Pattern
Recognition and Reduction of Dimensionality, North-Holland, 1982; and various
papers by Fukunaga in IEEE Trans. on PAMI.

The papers by Halbert White in AI Expert, Dec. 1989, and in Neural Computation,
vol. 1, indicate that, as in the perceptron days, many of the NN algorithms can be
shown to be intimately related to stochastic approximation methods a la
Robbins-Monro, Dvoretzky, etc. (see Sklansky & Wassel, Pattern Classifiers and
Trainable Machines, Springer, 1981).
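
To make that connection concrete, here is a minimal toy sketch of my own (not taken
from White's papers or from Sklansky & Wassel; the data, gain schedule, and variable
names are all made up for illustration) showing that an LMS / delta-rule weight
update is just a Robbins-Monro style stochastic approximation with a decreasing
gain sequence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (purely illustrative): noisy linear targets.
X = rng.normal(size=(2000, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=2000)

w = np.zeros(3)
for n, (x_n, y_n) in enumerate(zip(X, y), start=1):
    a_n = 1.0 / n              # Robbins-Monro gains: sum a_n diverges, sum a_n**2 converges
    err = y_n - x_n @ w        # prediction error on the current sample
    w = w + a_n * err * x_n    # LMS / delta-rule step as a stochastic approximation update

print(w)                       # drifts toward true_w as samples accumulate
```

With a small constant gain the same loop is the familiar on-line LMS filter; the
1/n schedule is what the classical stochastic approximation convergence results
assume.
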
But the results showing "Multilayer Feedforward Networks are Universal
Approximators," by Hornik et al. in Neural Networks, and subsequent publications
along that line, suggest that multilayer networks offer access to more interesting
possibilities than previous "one-shot" statistical classifiers. The "one-shot" is
in contrast to hierarchical statistical classifiers or statistical decision trees
(see Dattatreya & Kanal, "Decision Trees in Pattern Recognition," pp. 189-239, in
Progress in Pattern Recognition 2, Kanal & Rosenfeld (eds.), North-Holland, 1985).

Using an interactive pattern analysis and classification system to design
statistical decision trees on some industrial inspection data from Global
Holonetics (now unfortunately defunct), I found that a 5-minute session with the
interactive system ISPAHAN/IPACS led to a two-level classifier, implementing Fisher
discriminants at each level, that got an error rate generally better than the one
Global Holonetics had obtained using a NN board and error back-propagation running
several thousand iterations on the training set.
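
For readers who want to see the flavor of such a design, here is a rough sketch of
my own (illustrative only: this is not the ISPAHAN/IPACS system, and the data are
made up rather than the Global Holonetics inspection data) of a two-level
classifier with a Fisher linear discriminant at each node:

```python
import numpy as np

def fisher_node(X, y):
    """Fit a two-class Fisher linear discriminant; return (direction, threshold)."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix.
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, m1 - m0)                        # Fisher direction Sw^{-1}(m1 - m0)
    thresh = 0.5 * ((X0 @ w).mean() + (X1 @ w).mean())      # midpoint of projected class means
    return w, thresh

# Made-up three-class data standing in for the inspection features.
rng = np.random.default_rng(0)
means = np.array([[0.0, 0.0], [3.0, 0.0], [3.0, 3.0]])
X = np.vstack([rng.normal(m, 0.7, size=(100, 2)) for m in means])
y = np.repeat([0, 1, 2], 100)

# Level 1: class 0 versus classes {1, 2}.
w1, t1 = fisher_node(X, (y != 0).astype(int))
# Level 2: class 1 versus class 2, fitted on the samples the root sends down.
mask = y != 0
w2, t2 = fisher_node(X[mask], (y[mask] == 2).astype(int))

def predict(x):
    if x @ w1 <= t1:                   # root node: "looks like class 0"
        return 0
    return 2 if x @ w2 > t2 else 1     # second level resolves class 1 vs class 2

preds = np.array([predict(x) for x in X])
print("training error rate:", float((preds != y).mean()))
```

The point of the hierarchical structure is that each node solves a small two-group
problem with a closed-form discriminant, instead of a single "one-shot" classifier
being trained on all classes at once.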

The point is that the same technology that is making neural net simulators so
user friendly is also available now to make IPACS user friendly. However, as in the
early days (see L. Kanal, "Interactive Pattern Analysis and Classification
Systems--A Survey and Commentary," Proc. IEEE, vol. 60, pp. 1200-1215, Oct. 1972),
the NN algorithms are a lot easier for many engineers and computer scientists to
get into than the highly mathematical statistical pattern recognition procedures.
But pattern recognition continues to be a "bag of problems and a bag of tools",
and it behooves us to understand the various tools available to us rather than
expecting any one methodology to do it all or to do it consistently better than
all the others. So we have statistical, linguistic, structural (grammars, AND/OR
graphs, etc.), ANN, genetic algorithm, and other methods available, and often a
hybrid approach will be what satisfies a problem domain. As has been said, "he who
only has a hammer thinks the whole world is a nail."

I think the recent developments in ANNs are quite exciting and there are many new
challenging problems to understand and resolve. But there is no free lunch, and we
should not expect ANNs to free us from having to think hard about the true nature
of the pattern recognition problems we wish to solve.

By the way, our quickie comparison of ISPAHAN/IPACS with BP was presented in a
paper: Kanal, Gelsema, et al., "Comparing Hierarchical Statistical Classifiers with
Error Back-Propagation Neural Nets," Vision '89, Society of Manufacturing
Engineers, Detroit, 1989.

Please excuse the long note. I would have made it shorter if I had more time.

L.K.

