combining estimators w/ non-constant weighting

Dr. Xu Lei lxu at cs.cuhk.hk
Thu Jun 29 04:56:31 EDT 1995


David Wolpert wrote:
>>>>
However I took Gil's question (perhaps incorrectly) to concern the
combination of *arbitrary* types of estimators, which in particular
includes estimators (like nearest neighbor) that need not be
parametric and therefore can not readily be "co-opted". (Certainly the
work she listed, like Sharif's, concerns the combination of such
arbitrary estimators.) This simply is not the concern of most of the
work on AME.
>>>>

The following two papers on AME concern this type of work. In fact, this
type of work can be treated as a special case of the Mixture of Experts
(ME) framework: some experts have been pretrained and are held fixed, and
only the gating network (and any remaining experts) needs to be trained
during ME learning.
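To make the idea concrete, here is a minimal sketch of combining fixed,
pretrained experts with a trainable gating network that produces
input-dependent (non-constant) mixing weights. It uses simple gradient
ascent on the mixture log-likelihood rather than the EM algorithm of the
papers below, and the toy data and expert definitions are purely
hypothetical.

    # Sketch: fixed experts, trainable softmax gating net (not the cited EM algorithm).
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy binary classification data.
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

    # Two fixed, "pretrained" experts standing in for arbitrary estimators;
    # each returns P(y=1 | x) and is never updated.
    def expert_1(x):
        return 1.0 / (1.0 + np.exp(-3.0 * x[:, 0]))   # uses only feature 0

    def expert_2(x):
        return 1.0 / (1.0 + np.exp(-3.0 * x[:, 1]))   # uses only feature 1

    experts = [expert_1, expert_2]

    # Gating network: softmax over a linear function of x, giving weights g_j(x).
    W = np.zeros((len(experts), X.shape[1]))
    b = np.zeros(len(experts))

    def gate(x):
        logits = x @ W.T + b
        logits -= logits.max(axis=1, keepdims=True)
        g = np.exp(logits)
        return g / g.sum(axis=1, keepdims=True)

    # Only the gating parameters are trained; the experts stay fixed.
    P = np.stack([e(X) for e in experts], axis=1)        # expert outputs, shape (N, J)
    lik = y[:, None] * P + (1 - y[:, None]) * (1 - P)    # per-expert likelihood of each label
    for _ in range(500):
        g = gate(X)
        mix = (g * lik).sum(axis=1, keepdims=True)       # mixture likelihood per example
        h = g * lik / mix                                # posterior responsibility of each expert
        grad_logits = h - g                              # gradient of log-likelihood w.r.t. gate logits
        W += 0.1 * grad_logits.T @ X / len(X)
        b += 0.1 * grad_logits.mean(axis=0)

    # Combined prediction: input-dependent weighted average of the fixed experts.
    y_hat = (gate(X) * P).sum(axis=1)
    print("training accuracy:", ((y_hat > 0.5) == y).mean())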

Lei Xu and M. I. Jordan (1993), "EM Learning on a Generalized Finite
Mixture Model for Combining Multiple Classifiers", Proceedings of the
World Congress on Neural Networks, Portland, OR, Vol. IV, 1993.

Lei Xu, M. I. Jordan and G. E. Hinton (1995), "An Alternative Model for
Mixtures of Experts", to appear in Advances in Neural Information
Processing Systems 7, eds. Cowan, J. D., Tesauro, G., and Alspector, J.,
MIT Press, Cambridge, MA, 1995.
