Connectionists: Papers and code about SVM training

Olivier Chapelle olivier.chapelle at tuebingen.mpg.de
Fri Sep 1 09:20:00 EDT 2006


Dear colleagues,

I would like to announce the availability of papers and code related to SVM training.

Training an SVM in the primal
-----------------------------

Abstract from http://www.kyb.mpg.de/publication.html?publ=4142:

Most of the literature on Support Vector Machines (SVMs) concentrates on the dual 
optimization problem. In this paper, we would like to point out that the primal problem can 
also be solved efficiently, both for linear and non-linear SVMs, and that there is no reason 
to ignore this possibility. On the contrary, from the primal point of view, new families of 
algorithms for large-scale SVM training can be investigated.
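
To make the primal approach concrete, here is a minimal Python/NumPy sketch (my own 
illustration, not the distributed Matlab code): a linear SVM with a squared hinge loss 
trained by Newton's method directly in the primal. The function name, the regularizer lam, 
and the omission of a bias term are all simplifying assumptions.

import numpy as np

def primal_svm_newton(X, y, lam=1.0, n_iter=50, tol=1e-8):
    # Minimize  lam/2 ||w||^2 + sum_i max(0, 1 - y_i <w, x_i>)^2  for y_i in {-1, +1}.
    # With the squared hinge loss the objective is piecewise quadratic, so a few
    # Newton steps usually suffice once the set of margin violators stabilizes.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        sv = margins < 1                     # current margin violators
        grad = lam * w - 2 * X[sv].T @ (y[sv] * (1 - margins[sv]))
        hess = lam * np.eye(d) + 2 * X[sv].T @ X[sv]
        step = np.linalg.solve(hess, grad)
        w -= step
        if np.linalg.norm(step) < tol:       # Newton step has converged
            break
    return w

For a non-linear SVM, the same scheme applies after expanding w = sum_i beta_i phi(x_i) 
and taking Newton steps over beta, with kernel matrices in place of X.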

Additional details and Matlab code can be found at 
http://www.kyb.tuebingen.mpg.de/bs/people/chapelle/primal/

In particular, you will find a link to another paper explaining how to train an SVM with 
very few basis functions:
S. S. Keerthi, O. Chapelle, and D. DeCoste, Building Support Vector Machines with Reduced 
Classifier Complexity, JMLR 7, 2006.
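
To convey the flavor of that approach (the paper's actual basis-selection criterion is more 
refined), here is a rough sketch under my own assumptions: the expansion is restricted to a 
few greedily chosen basis points, with the coefficients refit in the primal as above. The 
RBF kernel, the worst-violator selection rule, and all names and parameters are illustrative.

import numpy as np

def rbf(A, B, gamma=0.5):
    # k(a, b) = exp(-gamma * ||a - b||^2)
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def reduced_primal_svm(X, y, n_basis=10, lam=1e-2, gamma=0.5):
    # Keep f(x) = sum_j beta_j k(b_j, x) with at most n_basis basis points:
    # greedily add the worst margin violator as a new basis point, then
    # refit beta with Newton steps on the primal squared hinge loss.
    basis, f = [], np.zeros(len(y))
    beta = np.zeros(0)
    for _ in range(n_basis):
        viol = 1 - y * f
        viol[basis] = -np.inf                # do not re-pick a basis point
        cand = int(np.argmax(viol))
        if viol[cand] <= 0:                  # no margin violators remain
            break
        basis.append(cand)
        B = X[basis]
        K, Kbb = rbf(X, B, gamma), rbf(B, B, gamma)
        beta = np.zeros(len(basis))
        for _ in range(30):                  # Newton refit of beta
            m = y * (K @ beta)               # margins
            sv = m < 1
            g = lam * (Kbb @ beta) - 2 * K[sv].T @ (y[sv] * (1 - m[sv]))
            H = lam * Kbb + 2 * K[sv].T @ K[sv] + 1e-10 * np.eye(len(basis))
            step = np.linalg.solve(H, g)
            beta -= step
            if np.linalg.norm(step) < 1e-8:
                break
        f = K @ beta
    return X[basis], beta

The returned basis points and coefficients define the reduced classifier 
f(x) = sum_j beta_j k(b_j, x).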

Learning kernel parameters
--------------------------

Some code based on the work I did during my PhD is available at: 
http://www.kyb.tuebingen.mpg.de/bs/people/chapelle/ams/

It includes:
  - learning the kernel parameters of an SVM (classification) or a GP (regression) by 
gradient descent on either the leave-one-out error, the radius/margin bound, a validation 
error, or the marginal likelihood (a minimal sketch of this item follows the list);
  - learning a linear combination of kernels (this is a convex problem);
  - efficiently estimating the leave-one-out error of an SVM.
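
As a sketch of the first item, the following Python/NumPy fragment (my own illustration, 
not the distributed code) tunes the lengthscale of an RBF kernel for GP regression by 
gradient descent on the negative log marginal likelihood. The noise level, step size, and 
function names are assumptions.

import numpy as np

def nll_and_grad(log_ell, X, y, noise=0.1):
    # Negative log marginal likelihood of a GP regression model with RBF kernel
    # K_ij = exp(-||x_i - x_j||^2 / (2 ell^2)), and its gradient w.r.t. log(ell).
    n = len(y)
    ell = np.exp(log_ell)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Krbf = np.exp(-sq / (2 * ell ** 2))
    K = Krbf + noise ** 2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(K, y)
    nll = 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * n * np.log(2 * np.pi)
    dK = Krbf * sq / ell ** 2                # dK / d log(ell)
    grad = -0.5 * np.trace((np.outer(alpha, alpha) - np.linalg.inv(K)) @ dK)
    return nll, grad

def fit_lengthscale(X, y, log_ell=0.0, lr=0.1, n_iter=100):
    # Plain gradient descent on the negative log marginal likelihood.
    for _ in range(n_iter):
        _, g = nll_and_grad(log_ell, X, y)
        log_ell -= lr * g
    return np.exp(log_ell)

The same loop applies to the other criteria in the list once their gradients with respect 
to the kernel parameters are available.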

Olivier Chapelle

