Learning with Rigorous Support Vector Machines

Jinbo Bi
Rensselaer Polytechnic Institute

Vladimir N. Vapnik
NEC Labs America, Inc.

Abstract. We examine the so-called rigorous support vector machine (RSVM) approach proposed by Vapnik (1998). The RSVM formulation is derived by explicitly implementing the structural risk minimization principle, with a parameter H used to directly control the VC dimension of the set of separating hyperplanes. By optimizing the dual problem, RSVM finds the optimal separating hyperplane from a set of functions with VC dimension approximately equal to H^2 + 1. RSVM produces classifiers equivalent to those obtained by classic SVMs for appropriate parameter choices, but the use of the parameter H facilitates model selection, so VC bounds on the generalization risk can be minimized more effectively. In our empirical studies, good models are achieved for an appropriate H^2 in [5%L, 30%L], where L is the size of the training data.
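For concreteness, here is a sketch (in LaTeX, using our own notation) of the optimization problem the abstract describes. It assumes the training data (x_i, y_i), with y_i in {-1, +1} and i = 1, ..., L, have been scaled into the unit ball, which is the normalization under which the set {w . x + b : ||w|| <= H} has VC dimension of roughly H^2 + 1; the exact formulation in the paper may differ in detail.

\begin{align*}
\text{(Primal)}\quad \min_{w,\,b,\,\xi}\ & \sum_{i=1}^{L} \xi_i \\
\text{subject to}\quad & y_i\,(w \cdot x_i + b) \ge 1 - \xi_i,\quad \xi_i \ge 0,\quad i = 1,\dots,L, \\
& w \cdot w \le H^2.
\end{align*}

Eliminating w, b, and \xi by Lagrangian duality, and maximizing analytically over the multiplier of the norm constraint (which produces the square-root term), gives

\begin{align*}
\text{(Dual)}\quad \max_{\alpha}\ & \sum_{i=1}^{L} \alpha_i \;-\; H \sqrt{\sum_{i,j=1}^{L} \alpha_i \alpha_j\, y_i y_j\, (x_i \cdot x_j)} \\
\text{subject to}\quad & \sum_{i=1}^{L} \alpha_i y_i = 0,\qquad 0 \le \alpha_i \le 1,\quad i = 1,\dots,L.
\end{align*}

Note that, unlike the classic soft-margin SVM dual, the box constraint on \alpha is fixed at [0, 1] and the capacity parameter H appears only in the objective, which is what makes sweeping H (and hence the VC dimension) convenient during model selection.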

The RSVM package is written in C++.
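The package code itself is not reproduced here, but as a small, hypothetical illustration (not the actual package source), the following C++ snippet evaluates the dual objective sketched above for a candidate multiplier vector alpha; the names dualObjective and kernel are our own.

#include <cmath>
#include <cstdio>
#include <vector>

// Linear kernel x_i . x_j; any positive-definite kernel could be substituted.
double kernel(const std::vector<double>& a, const std::vector<double>& b) {
    double dot = 0.0;
    for (size_t k = 0; k < a.size(); ++k) dot += a[k] * b[k];
    return dot;
}

// RSVM dual objective:
//   W(alpha) = sum_i alpha_i - H * sqrt(sum_{i,j} alpha_i alpha_j y_i y_j K(x_i, x_j)),
// to be maximized subject to sum_i alpha_i y_i = 0 and 0 <= alpha_i <= 1.
double dualObjective(const std::vector<std::vector<double>>& X,
                     const std::vector<int>& y,
                     const std::vector<double>& alpha,
                     double H) {
    const size_t L = X.size();
    double sumAlpha = 0.0, quad = 0.0;
    for (size_t i = 0; i < L; ++i) {
        sumAlpha += alpha[i];
        for (size_t j = 0; j < L; ++j)
            quad += alpha[i] * alpha[j] * y[i] * y[j] * kernel(X[i], X[j]);
    }
    return sumAlpha - H * std::sqrt(quad);
}

int main() {
    // Two toy points, one per class, scaled into the unit ball.
    std::vector<std::vector<double>> X = {{0.5, 0.2}, {-0.4, -0.3}};
    std::vector<int> y = {+1, -1};
    std::vector<double> alpha = {0.5, 0.5};  // feasible: sum_i alpha_i y_i = 0
    double H = 1.0;                          // H^2 + 1 caps the VC dimension
    std::printf("dual objective: %f\n", dualObjective(X, y, alpha, H));
    return 0;
}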

Bug reports are welcome, and I would appreciate hearing from anyone who implements a more efficient solver for RSVM. This paper was accepted at COLT 2003.

Contact Jinbo Bi (jinbo@engr.uconn.edu) for information about this page.