Paper available Re: Gradient Descent and Boosting

Nigel Duffy nigeduff at cse.ucsc.edu
Fri Mar 12 13:38:57 EST 1999


David Helmbold and I have the following paper relating boosting to gradient 
descent. We use this relationship to derive a new leveraging algorithm and 
to prove performance bounds for it. 



			A Geometric Approach to 
			Leveraging Weak Learners

			    Nigel Duffy and
			    David Helmbold

			University of California
			      Santa Cruz

				ABSTRACT

AdaBoost is a popular and effective leveraging procedure for
improving the hypotheses generated by weak learning algorithms.
AdaBoost and many other leveraging algorithms can be viewed as 
performing a constrained gradient descent over a potential function.
At each iteration the distribution over the sample given to the
weak learner is the direction of steepest descent. We introduce a new 
leveraging algorithm based on a natural potential function. For this
potential function, the direction of steepest descent can have 
negative components.  Therefore we provide two transformations for 
obtaining suitable distributions from these directions of steepest 
descent. The resulting algorithms have performance bounds incomparable to 
AdaBoost's, and their empirical performance is similar.
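
To make the gradient-descent view concrete, here is a minimal Python sketch
(not from the paper) of AdaBoost recast as constrained gradient descent on
the exponential potential Phi(F) = sum_i exp(-y_i F(x_i)). The distribution
handed to the weak learner is the normalized steepest-descent direction; for
this potential its components are always nonnegative, which is exactly what
can fail for the paper's new potential and why the transformations above are
needed. The weak_learn(X, y, dist) interface and all names are assumptions
for illustration.

    import numpy as np

    def leverage(X, y, weak_learn, rounds=50):
        # Sketch: AdaBoost as gradient descent on the potential
        #     Phi(F) = sum_i exp(-y_i * F(x_i)),  y_i in {-1, +1}.
        n = len(y)
        F = np.zeros(n)                 # margins of the master hypothesis
        ensemble = []
        for _ in range(rounds):
            # Steepest descent: -dPhi/dF_i = y_i * exp(-y_i * F_i).
            # The magnitudes exp(-y_i * F_i) are nonnegative, so when
            # normalized they form a distribution over the sample.
            w = np.exp(-y * F)
            dist = w / w.sum()
            h = weak_learn(X, y, dist)  # assumed: returns h, h(X) in {-1,+1}
            pred = h(X)
            eps = dist[pred != y].sum() # weighted error of weak hypothesis
            # Line search along pred minimizing Phi yields AdaBoost's step.
            alpha = 0.5 * np.log((1.0 - eps) / max(eps, 1e-12))
            F += alpha * pred
            ensemble.append((alpha, h))
        return ensemble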

To appear in EuroCOLT '99, to be published by Springer-Verlag.

Available from:

"http://www.cse.ucsc.edu/research/ml/papers/GeometricLeveraging.ps"





