Support vector machines and regularization

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Recently, there has been growing interest in Statistical Learning Theory, also known as VC theory, due to many successful applications of Support Vector Machines (SVMs). Even though most theoretical results in VC theory (including all main concepts underlying SVM methodology) were developed over 25 years ago, these concepts are occasionally misunderstood in the research community. This paper compares standard SVM regression and regularization for learning dependencies from data. We point out that the SVM approach was developed in VC theory under the risk minimization setting, whereas the regularization approach was developed under the function approximation setting. This distinction is especially important because regularization-based learning is often presented as a purely constructive methodology (with no clearly stated problem setting), even though the original regularization theory was introduced under a clearly stated function approximation setting. Further, we present empirical comparisons illustrating the effect of different mechanisms for complexity control (i.e., ε-insensitive loss vs. standard ridge regression) on generalization performance, under very simple settings using synthetic data sets. These comparisons suggest that the SVM approach to complexity control (via the ε-insensitive loss) is more appropriate for learning in sparse, high-dimensional settings.
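To make the comparison described in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' experimental protocol) contrasting the two complexity-control mechanisms: ε-insensitive loss via scikit-learn's SVR, and squared loss with a ridge penalty via Ridge, on a synthetic sparse high-dimensional regression problem. The data generator, sample sizes, and hyperparameter values are illustrative assumptions only.

```python
# Hypothetical illustration of epsilon-insensitive SVM regression vs. ridge
# regression on sparse high-dimensional synthetic data (assumed setup).
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Sparse high-dimensional setting: few training samples, many input features,
# only a handful of coordinates actually drive the target.
n_train, n_test, n_features = 30, 200, 50
w_true = np.zeros(n_features)
w_true[:3] = [2.0, -1.5, 1.0]  # only 3 relevant features

def make_data(n):
    X = rng.standard_normal((n, n_features))
    y = X @ w_true + 0.2 * rng.standard_normal(n)  # additive noise
    return X, y

X_train, y_train = make_data(n_train)
X_test, y_test = make_data(n_test)

# Complexity control via the epsilon-insensitive loss (SVM regression).
svr = SVR(kernel="linear", C=1.0, epsilon=0.1).fit(X_train, y_train)

# Complexity control via the squared-loss (ridge) penalty.
ridge = Ridge(alpha=1.0).fit(X_train, y_train)

print("SVR   test MSE:", mean_squared_error(y_test, svr.predict(X_test)))
print("Ridge test MSE:", mean_squared_error(y_test, ridge.predict(X_test)))
```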

Original language: English (US)
Title of host publication: Proceedings of the Seventh IASTED International Conference on Signal and Image Processing, SIP 2005
Editors: M.W. Marcellin
Pages: 166-171
Number of pages: 6
State: Published - 2005
Event: Seventh IASTED International Conference on Signal and Image Processing, SIP 2005 - Honolulu, HI, United States
Duration: Aug 15, 2005 – Aug 17, 2005

Publication series

Name: Proceedings of the Seventh IASTED International Conference on Signal and Image Processing, SIP 2005

Other

Other: Seventh IASTED International Conference on Signal and Image Processing, SIP 2005
Country/Territory: United States
City: Honolulu, HI
Period: 8/15/05 – 8/17/05

Keywords

  • Function approximation
  • Regularization
  • Structural risk minimization
