Review Paper on Boosting available

Gunnar Raetsch raetsch at axiom.anu.edu.au
Sat Dec 21 18:26:19 EST 2002


Dear Connectionists,

We are pleased to announce that our new review paper, entitled

      "An Introduction to Boosting and Leveraging"
      by Ron Meir & Gunnar Rätsch

is now available at

      http://www.boosting.org/papers/MeiRae03.ps.gz (and .pdf)
      (Copyright by Springer Verlag Heidelberg)

It will appear as a chapter of the Springer LNCS series book
``Advanced Lectures on Machine Learning'' at the beginning of next year.

Please find a summary and table of contents below.

Season's Greetings,

Ron & Gunnar


Abstract
========

   We provide an introduction to theoretical and practical aspects of
   Boosting and Ensemble learning, offering a useful reference for
   researchers in the field of Boosting as well as for those seeking to
   enter this fascinating area of research.  We begin with a short
   background on the necessary learning-theoretic foundations of weak
   learners and their linear combinations.  We then point out the
   useful connection between Boosting and the theory of optimization,
   which facilitates the understanding of Boosting and later enables us
   to move on to new Boosting algorithms applicable to a broad spectrum
   of problems.  To increase the relevance of the paper to
   practitioners, we have added remarks, pseudo-code, ``tricks of the
   trade'', and algorithmic considerations where appropriate.  Finally,
   we illustrate the usefulness of Boosting algorithms by giving a
   brief overview of some existing applications.  The main ideas are
   illustrated on the problem of binary classification, although
   several extensions are discussed.
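
To make the core idea concrete for readers new to the area: Boosting
combines many weak learners into a weighted linear combination,
reweighting the training examples at each round so that subsequent weak
learners focus on the examples misclassified so far.  Below is a minimal
illustrative sketch of an AdaBoost-style loop with decision stumps for
binary classification (the setting the paper focuses on).  This is not
code from the paper; function names and the toy data are our own, and a
practical implementation would use a more careful weak learner.

```python
import numpy as np

def adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost sketch with decision stumps; labels y in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)              # example weights, uniform at start
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Weak learner: exhaustively pick the stump (feature, threshold,
        # sign) with the smallest weighted training error.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (+1, -1):
                    pred = s * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, s)
        err = max(best_err, 1e-12)       # avoid log(0) on a perfect stump
        if err >= 0.5:                   # no weak learner with an edge: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        j, thr, s = best
        pred = s * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # upweight misclassified examples
        w /= w.sum()                     # renormalize to a distribution
        stumps.append(best)
        alphas.append(alpha)

    def predict(Xq):
        # Final hypothesis: sign of the weighted vote of all stumps.
        f = np.zeros(len(Xq))
        for (j, thr, s), a in zip(stumps, alphas):
            f += a * s * np.where(Xq[:, j] >= thr, 1, -1)
        return np.sign(f)
    return predict

# Toy usage: 1-D data separable by a single threshold.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
clf = adaboost(X, y, n_rounds=5)
print(clf(X))  # → [-1. -1.  1.  1.]
```

The exponential reweighting step is what connects Boosting to the
optimization view discussed in the paper: each round performs a greedy
stagewise update of the ensemble.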

Table of contents:
==================

1      A Brief History of Boosting

2      An Introduction to Boosting and Ensemble Methods
2.1    Learning from Data and the PAC Property
2.2    Ensemble Learning, Boosting and Leveraging

3      Learning Theoretical Foundations of Boosting
3.1    The Existence of Weak Learners
3.2    Convergence of the Training Error to Zero
3.3    Generalization Error Bounds
3.4    Margin based Generalization Bounds
3.5    Consistency

4      Boosting and Large Margins
4.1    Weak learning, Edges and Margins
4.2    Geometric Interpretation of p-Norm Margins
4.3    AdaBoost and Large Margins
4.4    Relation to Barrier Optimization

5      Leveraging as Stagewise Greedy Optimization
5.1    Preliminaries
5.2    A Generic Algorithm
5.3    The Dual Formulation
5.4    Convergence Results

6      Robustness, Regularization and Soft-margins
6.1    Reducing the Influence of Examples
6.2    Optimization of the Margins
6.3    Regularization Terms and Sparseness

7      Extensions
7.1    Single Class
7.2    Multi-Class
7.3    Regression
7.4    Localized Boosting
7.5    Other extensions

8      Evaluation and Applications
8.1    On the choice of weak learners for Boosting
8.2    Evaluation on Benchmark Data Sets
8.3    Applications

9      Conclusions


+-----------------------------------------------------------------+
 Gunnar Rätsch                      http://mlg.anu.edu.au/~raetsch
 Australian National University   mailto:Gunnar.Raetsch at anu.edu.au
 Research School for Information            Tel: (+61) 2 6125-8647
 Sciences and Engineering                   Fax: (+61) 2 6125-8651
 Canberra, ACT 0200, Australia              Mob: (+61) 401 10-2235