Bayesian Inference in Support Vector Machines

Chu Wei engp9354 at nus.edu.sg
Thu Jan 24 10:46:13 EST 2002


Dear Connectionists: 

We have recently completed two technical reports in which we apply popular
Bayesian techniques to support vector machines to implement hyperparameter
tuning. In a probabilistic framework, Bayesian inference is used to implement
model adaptation, while keeping the merits of support vector machines, such as
sparseness and convex quadratic programming. Another benefit is the
availability of probabilistic prediction. Numerical experiments verify that
the generalization capability of the Bayesian methods is competitive and that
it is feasible to tackle reasonably large data sets with this approach.

The pdf files of these reports can be accessed at:
For regression: http://guppy.mpe.nus.edu.sg/~mpessk/papers/bisvr.pdf
For classification: http://guppy.mpe.nus.edu.sg/~mpessk/papers/bitsvc.pdf 
We look forward to your comments on how to improve this work. Thanks.

We attach their abstracts in the following:

Title: Bayesian Inference in Support Vector Regression

Abstract: 
In this paper, we apply popular Bayesian techniques to support vector
regression. We describe a Bayesian framework in a function-space view with a
Gaussian process prior probability over the functions. A unified
non-quadratic loss function with the desirable characteristic of
differentiability, called the soft insensitive loss function, is used in
likelihood evaluation. In this framework, the maximum a posteriori estimate of
the functions results in an extended support vector regression problem.
Bayesian methods are used to implement model adaptation, while keeping the
merits of support vector regression, such as quadratic programming and
sparseness. Moreover, we put forward confidence intervals in making
predictions. Experimental results on simulated and real-world datasets
indicate that the approach works well even on large datasets.
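To make the abstract's "soft insensitive loss" concrete, here is a minimal sketch of a smoothed epsilon-insensitive loss of the kind described: zero inside the tube, quadratic in a narrow transition band, and linear outside, so the function is once-differentiable everywhere. The parameter names eps and beta and the exact piecewise boundaries are our assumptions for illustration; the report itself defines the precise form.

```python
def soft_insensitive_loss(delta, eps=0.1, beta=0.5):
    """Illustrative smoothed epsilon-insensitive loss (parameters assumed).

    delta : residual y - f(x)
    eps   : half-width of the insensitive tube
    beta  : fraction of the tube smoothed quadratically, 0 < beta <= 1

    Zero for |delta| <= (1-beta)*eps, quadratic on the transition band
    (1-beta)*eps < |delta| <= (1+beta)*eps, linear beyond it; the pieces
    match in value and slope at the boundaries, so the loss is C^1.
    """
    a = abs(delta)
    if a <= (1.0 - beta) * eps:
        return 0.0                                   # inside the tube
    if a <= (1.0 + beta) * eps:
        return (a - (1.0 - beta) * eps) ** 2 / (4.0 * beta * eps)  # smooth band
    return a - eps                                   # linear tail
```

The quadratic band is what gives the negative log-likelihood the differentiability the abstract highlights, while the flat interior preserves the sparseness of standard support vector regression.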

Title: Bayesian Inference in Trigonometric Support Vector Classifier

Abstract: 
In this report, we propose a novel classifier, known as the trigonometric
support vector classifier, to integrate popular Bayesian techniques with the
support vector classifier. We describe a Bayesian framework in a
function-space view with a Gaussian process prior probability over the
functions. A trigonometric likelihood function with the desirable
characteristics of normalization and differentiability is used in likelihood
evaluation. In this framework, the maximum a posteriori estimate of the
functions results in an extended support vector classifier problem. Bayesian
methods are used to implement model adaptation, while keeping the merits of
the support vector classifier, such as sparseness and convex programming.
Moreover, we put forward class probabilities in making predictions.
Experimental results on artificial and benchmark datasets indicate that the
approach works well even on large datasets.
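As an illustration of the "normalization in likelihood" property the abstract mentions, here is a sketch of a trigonometric likelihood for labels y in {-1, +1}: it saturates at 1 when the margin y*f exceeds 1, vanishes when it falls below -1, and uses a squared cosine in between. The exact form should be taken from the report; the version below is our assumed reading, chosen so that P(+1|f) + P(-1|f) = 1 for every latent value f.

```python
import math

def trig_likelihood(y, f):
    """Illustrative normalized trigonometric likelihood P(y|f) (form assumed).

    y : class label, -1 or +1
    f : latent function value at the input

    Since cos^2(t) + cos^2(pi/2 - t) = 1, the two class probabilities
    sum to one for any f, which is the normalization property.
    """
    z = y * f
    if z >= 1.0:
        return 1.0                                   # margin satisfied: certain
    if z <= -1.0:
        return 0.0                                   # margin violated: impossible
    return math.cos(math.pi / 4.0 * (1.0 - z)) ** 2  # smooth transition band
```

The corresponding loss, -ln P(y|f), is zero for margins beyond 1 and differentiable in the transition band, which is what lets MAP estimation retain a sparse, convex-programming solution while still yielding class probabilities at prediction time.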

Sincerely,
Wei Chu (engp9354 at nus.edu.sg)
S. Sathiya Keerthi (mpessk at nus.edu.sg)
Chong Jin Ong (mpeongcj at nus.edu.sg)




