On the generalization ability of support vector machines

by I. Steinwart

Preprint series: 01-08, Reports on Analysis and Computer Science


MSC:
68T10 Pattern recognition, speech recognition {For cluster analysis, see 62H30}

Abstract: In this article we study the generalization ability
of the 1-norm soft margin classifier. We show for several
standard kernels, such as the Gaussian RBF kernel, that this
algorithm yields arbitrarily good generalization results
provided that the factor weighting the sum of the slack
variables is chosen suitably.
A result of this kind is completely new. Indeed, for the first
time it can now be explained, without a-priori assumptions on
the classification problem, why the support vector approach
may provide good generalization performance. Our
considerations are based, firstly, on an approximation
property of the kernels used, which also gives new insight
into the role of kernels in these and other algorithms and
may therefore be of independent interest. Secondly, the result
is achieved by a precise investigation of the optimization
problem underlying the 1-norm soft margin algorithm.
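The algorithm the abstract refers to minimizes (1/2)||w||^2 + C * sum_i xi_i subject to the margin constraints, where C is the factor weighting the sum of the slack variables xi_i. A minimal sketch of this setup, using scikit-learn's SVC with a Gaussian RBF kernel (the toy data set and the particular values of C and gamma below are illustrative choices, not taken from the preprint):

```python
# Hedged sketch: 1-norm soft margin SVM with a Gaussian RBF kernel.
# The parameter C is the slack-weighting factor from the abstract;
# the data set, gamma, and C are illustrative assumptions only.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy two-class problem: two concentric circles. Not linearly separable
# in the plane, but separable under the Gaussian feature map.
n = 200
angles = rng.uniform(0.0, 2.0 * np.pi, size=n)
radii = np.where(np.arange(n) < n // 2, 1.0, 3.0)
X = np.column_stack([radii * np.cos(angles), radii * np.sin(angles)])
X += rng.normal(scale=0.1, size=X.shape)
y = np.where(np.arange(n) < n // 2, -1, 1)

# Gaussian RBF kernel k(x, x') = exp(-gamma * ||x - x'||^2).
# C weighs the sum of the slack variables in the soft margin objective.
clf = SVC(kernel="rbf", gamma=1.0, C=10.0)
clf.fit(X, y)
print(clf.score(X, y))
```

Varying C trades margin width against training errors: small C tolerates many margin violations, while large C approaches the hard margin classifier. The preprint's result concerns how this factor should be chosen as the sample size grows.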

Keywords: Computational learning theory, pattern recognition, PAC model, maximal margin hyperplane, support vector machines, kernel methods

Upload: 2001-05-11

Update: 2001-06-21


The author(s) agree, that this abstract may be stored as full text and distributed as such by abstracting services.