Basak, D. (2007) Support Vector Regression. International Journal of Neural Information Processing – Letters and Reviews, 11 (10). pp. 203-224.
Abstract
Instead of minimizing the observed training error, Support Vector Regression (SVR) attempts to minimize a bound on the generalization error so as to achieve good generalization performance. The idea of SVR is based on the computation of a linear regression function in a high-dimensional feature space into which the input data are mapped via a nonlinear function. SVR has been applied in various fields – time series and financial (noisy and risky) prediction, approximation of complex engineering analyses, convex quadratic programming and choices of loss functions, etc. In this paper, an attempt has been made to review the existing theory, methods, recent developments and scope of SVR.
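As a brief illustration of the idea summarized in the abstract (a regression function that is linear in a kernel-induced feature space, trained with an ε-insensitive loss), a minimal sketch is given below. The scikit-learn `SVR` estimator, the RBF kernel choice, and the toy sine data are assumptions for illustration only and are not taken from the paper.

```python
# Minimal SVR sketch (illustrative only, not the paper's implementation).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))                     # 1-D inputs
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)    # noisy targets

# The RBF kernel implicitly maps inputs to a high-dimensional feature
# space; the learned regression function is linear in that space, and
# epsilon sets the width of the insensitive tube around the targets.
model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
model.fit(X, y)

X_test = np.linspace(-3, 3, 50).reshape(-1, 1)
y_pred = model.predict(X_test)
print(y_pred[:5])
```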
| Item Type: | Article |
|---|---|
| Subjects: | Electrical Testing |
| Divisions: | UNSPECIFIED |
| Depositing User: | Dr. Satyendra Kumar Singh |
| Date Deposited: | 18 Nov 2011 13:00 |
| Last Modified: | 21 Jan 2012 10:02 |
| URI: | http://cimfr.csircentral.net/id/eprint/38 |