Neural Information Processing - Letters and Reviews

Vol. 10, No. 10, October 2007

pp. 203-224

Support Vector Regression

Debasish Basak
Electrical Laboratory,
Central Institute of Mining and Fuel Research,
Barwa Road, Dhanbad-826001, INDIA
E-mail: deba65@yahoo.com

Srimanta Pal
Electronics & Communication Sciences Unit,
Indian Statistical Institute, 203 B.T. Road,
Kolkata-700108, INDIA
E-mail: srimanta@isical.ac.in

Dipak Chandra Patranabis
Department of Instrumentation and Electronics Engineering,
Jadavpur University, Salt Lake Campus, Kolkata-700098, INDIA
Heritage Institute of Technology, Kolkata-700107, INDIA
E-mail: dcp@iee.jusl.ac.in

Abstract

Instead of minimizing the observed training error, Support Vector Regression (SVR) attempts to minimize a bound on the generalization error, so as to achieve good generalization performance. SVR computes a linear regression function in a high-dimensional feature space into which the input data are mapped via a nonlinear function. SVR has been applied in various fields: time series and financial (noisy and risky) prediction, approximation of complex engineering analyses, convex quadratic programming formulations with different choices of loss function, and so on. This paper reviews the existing theory, methods, and recent developments of SVR, and outlines its scope.
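The core idea above, a linear regression fitted in a kernel-induced feature space, can be sketched in a few lines of NumPy. The sketch below uses kernel ridge regression (cf. the keyword "Ridge regression"), which shares the kernel machinery with SVR but has a closed-form dual solution instead of the SVR quadratic program; the epsilon-insensitive loss characteristic of SVR is shown separately. All function names, parameter values, and the toy sinusoidal data are illustrative assumptions, not from the paper.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Gram matrix of the RBF kernel k(x, y) = exp(-gamma * (x - y)^2),
    # for 1-D inputs (kept scalar-valued for brevity).
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def fit_kernel_ridge(x, y, gamma=1.0, lam=1e-3):
    # Closed-form dual solution: alpha = (K + lam * I)^{-1} y.
    # SVR instead obtains sparse dual coefficients by solving a QP.
    K = rbf_kernel(x, x, gamma)
    return np.linalg.solve(K + lam * np.eye(len(x)), y)

def predict(x_train, alpha, x_new, gamma=1.0):
    # Regression function f(x) = sum_i alpha_i * k(x, x_i):
    # linear in feature space, nonlinear in input space.
    return rbf_kernel(x_new, x_train, gamma) @ alpha

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    # SVR's epsilon-insensitive loss: deviations inside the
    # eps-tube cost nothing, larger ones are penalized linearly.
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

# Illustrative noisy sinusoid.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 40)
y = np.sin(x) + 0.05 * rng.standard_normal(40)

alpha = fit_kernel_ridge(x, y)
y_hat = predict(x, alpha, x)
```

Swapping the squared-error objective here for the epsilon-insensitive loss, subject to the SVR constraints, recovers the QP formulation reviewed later in the paper.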

Keywords: SVR, Ridge regression, Kernel methods, Quadratic programming (QP)