
Zbl 1075.68632
Practical selection of SVM parameters and noise estimation for SVM regression.
(English)
[J] Neural Netw. 17, No. 1, 113-126 (2004). ISSN 0893-6080

Summary: We investigate the practical selection of hyper-parameters for support vector machine (SVM) regression (that is, the $\epsilon$-insensitive zone and the regularization parameter $C$). The proposed methodology advocates analytic parameter selection directly from the training data, rather than the re-sampling approaches commonly used in SVM applications. In particular, we describe a new analytical prescription for setting the value of the insensitive zone $\epsilon$ as a function of the training sample size. Good generalization performance of the proposed parameter selection is demonstrated empirically on several low- and high-dimensional regression problems. Further, we point out the importance of Vapnik's $\epsilon$-insensitive loss for regression problems with finite samples. To this end, we compare the generalization performance of SVM regression (using the proposed selection of $\epsilon$-values) with regression using the 'least-modulus' loss ($\epsilon = 0$) and the standard squared loss. These comparisons indicate superior generalization performance of SVM regression in sparse-sample settings, for various types of additive noise.
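The three loss functions compared in the summary can be sketched directly. The sketch below is illustrative only: the `eps_prescription` function encodes one plausible analytic choice of $\epsilon$ that shrinks with the training sample size $n$; its specific form (a multiple of the noise level scaled by $\sqrt{\ln n / n}$) is an assumption for illustration, not a formula taken from this summary.

```python
import math

def eps_insensitive(residual, eps):
    """Vapnik's epsilon-insensitive loss: residuals inside the tube cost nothing."""
    return max(0.0, abs(residual) - eps)

def least_modulus(residual):
    """'Least-modulus' (L1) loss, i.e. the epsilon-insensitive loss with eps = 0."""
    return eps_insensitive(residual, 0.0)

def squared_loss(residual):
    """Standard squared (L2) loss."""
    return residual ** 2

def eps_prescription(noise_std, n):
    """Illustrative analytic choice of eps from an estimated noise level and the
    sample size n. The factor 3 and the sqrt(ln n / n) scaling are assumptions
    made for this sketch; the summary only states that eps is set analytically
    as a function of training sample size."""
    return 3.0 * noise_std * math.sqrt(math.log(n) / n)
```

Note how the $\epsilon$-insensitive loss ignores residuals smaller than $\epsilon$ (e.g. `eps_insensitive(0.05, 0.1)` is `0.0`), which is what distinguishes it from the least-modulus ($\epsilon = 0$) and squared losses in the sparse-sample comparisons described above.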
MSC 2000:

Keywords: Complexity control; Loss function; Parameter selection; Prediction accuracy; Support vector machine regression; VC theory
