combining classifiers
Drucker Harris
hd at harris.monmouth.edu
Wed Jul 5 10:35:19 EDT 1995
For those interested in combining classifiers, I give
references to the boosting literature below. Boosting
gives an explicit method for building multiple classifiers
sequentially, with each new classifier trained to concentrate
on the examples the earlier ones handle poorly, rather than
building the classifiers first and then determining how to
combine them.
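To make the structure concrete, here is a minimal sketch in
Python. The weak learner and the reweighting rule below are
hypothetical placeholders of my own, not taken from any of the
papers; the principled update rules appear in the references.

def fit_weak(X, y, w):
    # Placeholder "weak learner": always predicts the weighted-majority
    # class. Any classifier slightly better than chance would do.
    pos = sum(wi for wi, yi in zip(w, y) if yi == 1)
    return (lambda x: 1) if pos >= 0.5 else (lambda x: -1)

def reweight(w, h, X, y):
    # Placeholder update: double the weight of misclassified examples,
    # then renormalize. The principled rule is in the references below.
    w = [wi * (2.0 if h(xi) != yi else 1.0) for wi, xi, yi in zip(w, X, y)]
    s = sum(w)
    return [wi / s for wi in w]

def boost(X, y, T):
    # The boosting structure: classifiers are built one after another,
    # each trained on data reweighted by its predecessors' mistakes.
    w = [1.0 / len(y)] * len(y)
    classifiers = []
    for _ in range(T):
        h = fit_weak(X, y, w)
        classifiers.append(h)
        w = reweight(w, h, X, y)
    return classifiers  # combined afterwards by (weighted) voting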
The seminal work on boosting shows that it is possible
to combine many classifiers, each with an error rate only slightly
less than 1/2 (termed weak learners), into a combined classifier
with very good performance (termed a strong learner):
R. Schapire, "The strength of weak learnability", Machine Learning,
5(2), pp. 197-227, 1990.
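As a back-of-the-envelope illustration of why such a combination
can work, suppose T classifiers each have error rate 0.45 and,
hypothetically, make independent errors (real classifiers do not;
boosting is precisely a method for making their errors
complementary). A majority vote then errs only when more than half
err at once:

from math import comb

def majority_error(p, T):
    # Probability that a majority of T independent classifiers,
    # each with error rate p, are wrong simultaneously.
    return sum(comb(T, k) * p**k * (1 - p)**(T - k)
               for k in range(T // 2 + 1, T + 1))

print(majority_error(0.45, 11))    # roughly 0.37
print(majority_error(0.45, 101))   # roughly 0.16
print(majority_error(0.45, 1001))  # roughly 0.0007

The vote error falls toward zero as T grows, even though each
individual voter is barely better than chance.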
Applications to OCR may be seen in:
H. Drucker, C. Cortes, L. D. Jackel, Y. LeCun, and V. Vapnik,
"Boosting and Other Ensemble Methods", Neural Computation,
6, pp. 1289-1301, 1994.
A comparison of boosting to many other classifier methods on an
OCR task is given in:
L. D. Jackel, et al., "Comparison of Classifier Methods: A Case Study
in Handwritten Digit Recognition", Proceedings of the International
Conference on Pattern Recognition, Jerusalem, 1994.
In practical applications, it is helpful to have a very large
pool of training data, since the original boosting procedure draws
a fresh training set for each classifier. However, there is a new
version of boosting (AdaBoost) which does not require this large
data set, because it reuses a single training set and reweights it:
Y. Freund and R. E. Schapire, "A decision-theoretic generalization
of on-line learning and an application to boosting", Proceedings of
the Second European Conference on Computational Learning Theory,
March 1995.
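For concreteness, here is a sketch of the reweighting idea behind
that paper's algorithm (AdaBoost), for two classes labeled +1/-1.
The weak-learner interface is an assumption of mine; the weight
update (multiply the weights of correctly classified examples by
beta = eps/(1 - eps), then renormalize) and the log(1/beta) vote
weights follow the paper.

from math import log

def adaboost(X, y, weak_learner, T):
    # `weak_learner(X, y, w)` is assumed to return a classifier h
    # with an h(x) prediction interface; labels y[i] are +1 or -1.
    n = len(y)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(T):
        h = weak_learner(X, y, w)
        # Weighted error of this round's classifier.
        eps = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
        if eps == 0.0 or eps >= 0.5:   # degenerate round: stop
            break
        beta = eps / (1.0 - eps)
        # Down-weight the examples h got right, so the next round
        # concentrates on the remaining mistakes; then renormalize.
        w = [wi * (beta if h(xi) == yi else 1.0)
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
        ensemble.append((log(1.0 / beta), h))   # vote weight
    def predict(x):
        return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
    return predict

Because the same training set is reused every round, with only its
weights changing, no large pool of fresh examples is needed.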
Harris Drucker