New book: Pattern Classification

stork stork at rsv.ricoh.com
Tue Nov 7 01:36:56 EST 2000


                         Announcing a new book:

                    Pattern Classification (2nd ed.)
               by R. O. Duda, P. E. Hart and D. G. Stork
                     654 pages, two-color printing
                      (John Wiley and Sons, 2001)
                          ISBN:  0-471-05669-3

This is a significant revision and expansion of the first half of
Pattern Classification & Scene Analysis, R. O. Duda and P. E. Hart's
influential 1973 book.  The current book can serve as a textbook for a
one- or two-semester graduate course in pattern recognition, machine
learning, data mining and related fields offered in Electrical
Engineering, Computer Science, Statistics, Operations Research,
Cognitive Science, or Mathematics departments.  Established researchers
in any domain that uses pattern recognition can rely on the book as a
reference on the foundations of their field.

Table of Contents

1)   Introduction
2)   Bayesian Decision Theory
3)   Maximum Likelihood and Bayesian Estimation
4)   Nonparametric Techniques
5)   Linear Discriminant Functions
6)   Multilayer Neural Networks
7)   Stochastic Methods
8)   Nonmetric Methods
9)   Algorithm-Independent Machine Learning
10) Unsupervised Learning and Clustering
Mathematical Appendix

Goals

   * Authoritative:  The presentations are based on the best research
     and rigorous fundamental theory underlying proven techniques.
   * Complete:  Every major topic in statistical, neural network and
     syntactic pattern recognition is presented, including all the
     topics that should be in the "toolbox" of designers of practical
     pattern recognition systems.
   * Up-to-date:  The book includes the most recent proven techniques
     and developments in the theory of pattern recognition.
   * Clear:  Every effort has been made to ensure that the text is
     clearly written and will not be misinterpreted.  The manuscript was
     tested in over 100 courses worldwide, and numerous suggestions from
     students, teachers and established researchers have been
     incorporated.  Every attempt has been made to give the deepest
     explanation, providing insight and understanding rather than a
     laundry list of techniques.
   * Logically organized: Each chapter builds upon concepts and
     techniques from previous chapters, speeding the learning of the
     material.
   * Problem motivated, not technique motivated:  Some books focus on a
     particular technique or method, for instance neural nets.  The
     drawback of such books is that they highlight the particular
     technique, often at the expense of other techniques.  Readers are
     left wondering how the particular highlighted technique compares
     with others, and especially how to decide which technique is
     appropriate for which particular problem.  Pattern Classification
     instead assumes that practitioners come first with a problem or
     class of problems and seek a solution using whichever technique is
     most appropriate.  There are many pattern recognition problems for
     which neural networks (for instance) are ill-suited, and readers of
     texts that focus on neural networks alone may be misled into
     believing neural networks are applicable to their problem.  As the
     old saying goes, "to a hammer, every problem looks like a nail."
     Pattern Classification, by contrast, seeks to be a balanced and
     complete toolbox -- plus instructions on how to choose the right
     tool for the right job.
   * Long-lived:  Every effort has been made to ensure the book will be
     useful for a long time, much as the first edition remained useful
     for over a quarter of a century.  For instance, even if a technique
     has vocal proponents, if that technique has not found genuine use
     in a challenging problem domain, it is not discussed in depth in
     the book.  Further, the notation and terminology are consistent and
     standardized as generally accepted in the field.

New topics

   * Neural Networks, including Hessians and second-order training and
     pruning techniques, popular heuristics for training and
     initializing parameters, and recurrent networks.
   * Stochastic methods, including simulated annealing, genetic
     algorithms, Boltzmann learning, and Gibbs sampling.
   * Nonmetric methods, including tree classifiers such as CART, ID3 and
     their descendants, string matching, grammatical methods and rule
     learning.
   * Theory of learning, including the No Free Lunch theorems, Minimum
     Description Length (MDL) principle, Occam's principle,
     bias-variance in regression and classification, jackknife and
     bootstrap estimation, Bayesian model comparison and MLII,
     multi-classifier systems and resampling techniques such as
     boosting, bagging and cross validation.
   * Support Vector Machines, including the relationship between
     "primal" and "dual" representations.
   * Competitive learning and related methods, including Adaptive
     Resonance Theory (ART) networks and their relation to
     leader-follower clustering.
   * Self-organizing feature maps, including maps affected by the
     sampling density.
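As a taste of one listed topic: leader-follower clustering (which the
book relates to ART networks) admits a very short sketch.  The code
below is an illustrative one-pass implementation of the general scheme,
not the book's own pseudocode; the threshold name `theta` and learning
rate `eta` are my labels.

```python
import math

def leader_follower(points, theta, eta=0.1):
    """One-pass leader-follower clustering.

    Each point joins the nearest existing cluster center if it lies
    within distance theta; otherwise it founds a new cluster.  The
    winning center drifts toward the matched point at rate eta.
    """
    centers = []
    for x in points:
        if centers:
            # Find the nearest existing center and its distance.
            j, d = min(
                ((i, math.dist(x, c)) for i, c in enumerate(centers)),
                key=lambda t: t[1],
            )
            if d <= theta:
                # Follower: nudge the winning center toward x.
                centers[j] = tuple(c + eta * (xi - c)
                                   for c, xi in zip(centers[j], x))
                continue
        # Leader: x starts a new cluster.
        centers.append(tuple(x))
    return centers

# Two well-separated blobs yield two centers.
data = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 4.9)]
print(len(leader_follower(data, theta=1.0)))  # 2
```

Unlike k-means, the number of clusters is not fixed in advance; it is
controlled indirectly by the vigilance-like threshold theta.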

New/improved features and resources

   * Solution Manual:  A solution manual is available for faculty
     adopting the text.
   * New and redrawn figures:  Every figure is carefully drawn (and all
     figures from the 1st edition have been updated and redrawn) using
     modern 3D graphics and plotting programs, all in order to
     illuminate ideas in a richer and more memorable way.  Some (e.g.,
      3D Voronoi tessellations and novel renderings of stochastic search)
      appear in no other pattern recognition books and provide new insight
     into mathematical issues.  A complete set of figures is available
     for non-commercial purposes from
     http://www.wiley.com/products/subject/engineering/electrical/software_supplem_elec_eng.html
     and ftp://ftp.wiley.com/public/sci_tech_med/pattern.
   * Two-color printing in figures and text: The use of red and black
     throughout allows more information to be conveyed in the figures,
     where color can for instance indicate different categories, or
     different classes of solution, or stages in the development of
     solutions.
   * Pseudocode: Key algorithms are illustrated in language-independent
     pseudocode.  Thus students can implement the algorithms in their
     favorite computer language.
   * Worked Examples: Several techniques are illustrated with worked
     examples, using data sets simple enough that students can readily
     follow the technical details.  Such worked examples are
     particularly helpful to students tackling homework problems.
   * Extensive Bibliographies: Each chapter contains an extensive and
     up-to-date bibliography with detailed citation information,
     including the full names (first name and surname) of every author.
   * Chapter Summaries: Each chapter ends with a summary highlighting
     key points and terms.  Such summaries reinforce the presentation in
     the text and facilitate rapid review of the material.
   * Homework problems: There are 380 homework problems, each keyed to
     its corresponding section in the text.
   * Computer Exercises: There are 102 language-independent computer
     exercises, each keyed to a corresponding section and in many cases
     also to explicit pseudocode in the text.
   * Starred sections: Some sections are starred to indicate that they
     may be skipped on a first reading, or in a one-semester course.
   * Key words listed in margins: Key words and topics are listed in the
     margins where they first appear, to highlight new terms and to
     speed subsequent search and retrieval of relevant information.
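To illustrate what "language-independent" means in practice, here is a
minimal k-nearest-neighbor classifier (a staple of the Nonparametric
Techniques chapter) rendered in Python.  This is my own sketch of the
standard k-NN rule, not a transcription of the book's pseudocode.

```python
import math
from collections import Counter

def knn_classify(train, x, k=3):
    """Classify x by majority vote among its k nearest training points.

    train is a list of (feature_vector, label) pairs; distances are
    Euclidean.
    """
    neighbors = sorted(train, key=lambda p: math.dist(p[0], x))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

training = [((0.0, 0.0), "a"), ((0.2, 0.1), "a"), ((5.0, 5.0), "b"),
            ((5.2, 4.8), "b"), ((4.9, 5.1), "b")]
print(knn_classify(training, (4.8, 5.0)))  # "b"
```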

More information about the Connectionists mailing list