Combining estimators - which ones to combine
Intrator Nathan
nin at math.tau.ac.il
Wed Jun 28 05:30:15 EDT 1995
The focus of most of the papers cited so far on combining experts has been
how to combine, but the question of what to combine is just as important.
The fundamental observation is that combining, or in the simplest case
averaging, estimators is effective only if the estimators are made
somehow independent.
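As a quick illustration (my own toy simulation, not from any of the cited papers): if each estimator's error is independent with variance sigma^2, averaging n of them cuts the mean squared error by roughly a factor of n. A sketch with made-up numbers:

```python
import random

random.seed(0)

def noisy_estimate(true_value, noise):
    # one estimator's prediction: truth plus its own independent noise
    return true_value + random.gauss(0.0, noise)

true_value, noise = 1.0, 0.5
n_trials, n_estimators = 2000, 25

single_err = 0.0
ensemble_err = 0.0
for _ in range(n_trials):
    preds = [noisy_estimate(true_value, noise) for _ in range(n_estimators)]
    # error of one estimator vs. error of the simple average
    single_err += (preds[0] - true_value) ** 2
    avg = sum(preds) / n_estimators
    ensemble_err += (avg - true_value) ** 2

print(single_err / n_trials)    # ~ noise**2
print(ensemble_err / n_trials)  # ~ noise**2 / n_estimators
```

If the errors were correlated instead of independent, the second number would not shrink this way, which is exactly why what you combine matters.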
One can induce independence via bootstrap methods (Breiman's stacking
and, more recently, bagging) or via the smooth bootstrap, which amounts
to injecting noise during training.
Once the estimators are independent enough, simple averaging gives very
good performance.
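A minimal sketch of the smooth-bootstrap idea on a toy linear-regression problem (the data, noise levels, and helper names here are my own illustration, not taken from the cited papers): each estimator is fit on a bootstrap resample whose inputs are jittered with Gaussian noise, and the ensemble simply averages their predictions.

```python
import random

random.seed(1)

# toy data: y = 2x plus observation noise
data = [(x / 10.0, 2.0 * (x / 10.0) + random.gauss(0.0, 0.2))
        for x in range(20)]

def fit_line(sample):
    # ordinary least squares for y = a*x + b
    n = len(sample)
    mx = sum(x for x, _ in sample) / n
    my = sum(y for _, y in sample) / n
    sxx = sum((x - mx) ** 2 for x, _ in sample)
    sxy = sum((x - mx) * (y - my) for x, y in sample)
    a = sxy / sxx
    return a, my - a * mx

def noisy_bootstrap(sample, sigma):
    # resample with replacement, then jitter inputs: the "smooth bootstrap"
    resampled = [random.choice(sample) for _ in range(len(sample))]
    return [(x + random.gauss(0.0, sigma), y) for x, y in resampled]

# train an ensemble of estimators on independent noisy resamples
models = [fit_line(noisy_bootstrap(data, 0.05)) for _ in range(30)]

def ensemble_predict(x):
    # simple averaging of the individual estimators
    return sum(a * x + b for a, b in models) / len(models)

print(ensemble_predict(1.0))  # close to the true value 2.0
```

The resampling plus input jitter is what decorrelates the estimators; with them made independent in this way, the plain average is the whole combination rule.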
- Nathan Intrator
Refs:
@misc{Breiman93,
author="L. Breiman",
title="Stacked regression",
year=1993, note="Technical report, Univ. of Cal, Berkeley",
}
@misc{Breiman94,
author="L. Breiman",
title="Bagging predictors",
year=1994, note="Technical report, Univ. of Cal, Berkeley",
}
@misc{RavivIntrator95,
author="Y. Raviv and N. Intrator", year=1995, note="Preprint",
title="Bootstrapping with Noise: An Effective Regularization Technique",
abstract="Bootstrap samples with noise are shown to be an effective
smoothness and capacity control for training feed-forward networks
as well as more traditional statistical models such as general
additive models. The effect of smoothness and ensemble averaging
are shown to be complementary and not equivalent to noise injection.
The two-spiral, a highly non-linear noise-free problem, is used to
demonstrate these findings.",
url="ftp://cns.math.tau.ac.il/papers/spiral.ps.Z",
}
The last one can also be accessed via my research page:
http://www.math.tau.ac.il/~nin/research.html