Extensions of the SMO algorithm

S. Sathiya Keerthi mpessk at guppy.mpe.nus.edu.sg
Mon Apr 29 01:00:42 EDT 2002


Dear Connectionists:

We have recently completed two papers extending the SMO algorithm
to Least Squares SVM formulations and to Kernel Logistic Regression.
Gzipped PostScript files containing these papers can be downloaded
from: http://guppy.mpe.nus.edu.sg/~mpessk/svm.shtml
The titles and abstracts of these papers are given below.

S. Keerthi

--------------------------------------------------------------
SMO Algorithm for Least Squares SVM Formulations 
S.S. Keerthi and S.K. Shevade 

This paper extends the well-known SMO algorithm for Support Vector 
Machines (SVMs) to Least Squares SVM formulations, which include 
LS-SVM classification, Kernel Ridge Regression and a particular 
form of regularized Kernel Fisher Discriminant. The algorithm is 
shown to be asymptotically convergent, and it is extremely easy 
to implement. Computational experiments show that the algorithm 
is fast and that its running time scales well, growing roughly 
quadratically with the number of examples. 
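To give a feel for the style of update involved, here is a minimal
coordinate-descent sketch in Python for the unbiased kernel ridge
regression dual, min_a 0.5 a'(K + I/C)a - y'a. It illustrates the
one-variable-at-a-time flavor of SMO only; the algorithm in the paper
additionally handles a bias term and uses a more careful working-set
selection, so treat this as an illustration under those simplifying
assumptions, not the paper's method.

    import numpy as np

    def smo_style_krr(K, y, C=10.0, tol=1e-6, max_iter=10000):
        # Sketch: coordinate descent on the unbiased kernel ridge
        # regression dual,
        #     min_a  0.5 * a' (K + I/C) a - y' a.
        # Each step exactly minimizes the objective along one
        # coordinate, mirroring SMO's one-variable updates.
        n = len(y)
        Q_diag = K.diagonal() + 1.0 / C
        a = np.zeros(n)
        grad = -y.astype(float)            # dual gradient at a = 0
        for _ in range(max_iter):
            i = int(np.argmax(np.abs(grad)))   # most-violating coordinate
            if abs(grad[i]) < tol:
                break                          # approximate optimality
            delta = -grad[i] / Q_diag[i]       # exact 1-D minimizer
            a[i] += delta
            grad += delta * K[:, i]
            grad[i] += delta / C               # I/C term touches only entry i
        return a    # predictions: f(x) = sum_j a_j k(x, x_j)

At the solution a = (K + I/C)^{-1} y, so the sketch is easy to check
against a direct linear solve on small problems.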
--------------------------------------------------------------
A Fast Dual Algorithm for Kernel Logistic Regression 
S.S. Keerthi, K. Duan, S.K. Shevade and A.N. Poo 
(Accepted for presentation at ICML 2002)

This paper gives a new iterative algorithm for kernel logistic 
regression. It solves the dual problem using ideas similar to those 
of the SMO algorithm for Support Vector Machines. Asymptotic 
convergence of the algorithm is proved. Preliminary computational 
experiments show that the algorithm is robust and fast. The 
algorithmic ideas can also be used to give a fast dual algorithm 
for solving the optimization problem arising in the inner loop of 
Gaussian Process classifiers. 
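For context, the dual of L2-regularized logistic regression replaces
the SVM's box-constrained quadratic with entropy-like terms; a
standard form is min_a 0.5 a'Qa + sum_i [a_i log a_i +
(C - a_i) log(C - a_i)], 0 < a_i < C, with Q_ij = y_i y_j K(x_i, x_j).
The one-variable subproblem has no closed form, so each coordinate
can be handled with damped Newton steps. The Python sketch below
illustrates that idea only; the paper's algorithm differs in its
working-variable selection and convergence safeguards.

    import numpy as np

    def klr_dual_newton(K, y, C=1.0, sweeps=50, newton_steps=3, eps=1e-10):
        # Sketch: coordinate-wise damped Newton steps on the standard
        # dual of regularized (kernel) logistic regression,
        #     min_a 0.5 a'Qa + sum_i [a_i log a_i + (C-a_i) log(C-a_i)],
        #     0 < a_i < C,  Q_ij = y_i y_j K_ij.
        # Illustration only, not the algorithm proposed in the paper.
        n = len(y)
        Q = (y[:, None] * y[None, :]) * K
        a = np.full(n, 0.5 * C)        # start strictly inside (0, C)
        Qa = Q @ a                     # maintained: Qa = Q a
        for _ in range(sweeps):
            for i in range(n):
                for _ in range(newton_steps):
                    g = Qa[i] + np.log(a[i] / (C - a[i]))     # dJ/da_i
                    h = Q[i, i] + C / (a[i] * (C - a[i]))     # d2J/da_i2
                    new = min(max(a[i] - g / h, eps), C - eps)
                    Qa += (new - a[i]) * Q[:, i]
                    a[i] = new
        return a

The decision function is then f(x) = sum_i a_i y_i K(x, x_i), with
class probability given by the sigmoid of f(x).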
--------------------------------------------------------------