Localized Generalization Error: Recently Published Documents


TOTAL DOCUMENTS: 43 (five years: 2)
H-INDEX: 11 (five years: 0)

2014 ◽ Vol 146 ◽ pp. 104-112 ◽ Author(s): Wing W.Y. Ng, Xue-Ling Liang, Jincheng Li, Daniel S. Yeung, Patrick P.K. Chan

2014 ◽ Vol 27 (1) ◽ pp. 59-66 ◽ Author(s): Qiang Liu, Jianping Yin, Victor C. M. Leung, Jun-Hai Zhai, Zhiping Cai, ...

Author(s): Binbin Sun, Wing W. Y. Ng, Daniel S. Yeung, Patrick P. K. Chan

Sparse LS-SVM yields better generalization capability and shorter prediction time than the full dense LS-SVM. Both methods, however, require careful hyper-parameter selection (HPS) to achieve high generalization capability. Leave-One-Out Cross Validation (LOO-CV) and k-fold Cross Validation (k-CV) are the two most widely used HPS methods for LS-SVMs, yet both fail to select good hyper-parameters for sparse LS-SVM. In this paper we propose a new HPS method, LGEM-HPS, which selects hyper-parameters for LS-SVM by minimizing the Localized Generalization Error (L-GEM). The L-GEM consists of two major components: the empirical mean square error and a sensitivity measure. A new sensitivity measure is derived for LS-SVM so that LGEM-HPS selects hyper-parameters yielding an LS-SVM with both a small training error and minimum sensitivity to minor changes in its inputs. Experiments on eleven UCI data sets show the effectiveness of the proposed method for selecting hyper-parameters for sparse LS-SVM.
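The selection criterion described in the abstract can be sketched in a few lines of NumPy: fit an LS-SVM (dense, for simplicity), then score a hyper-parameter pair by training MSE plus a sensitivity term, and pick the pair with the smallest score. Note that the paper derives a closed-form sensitivity measure for LS-SVM; the sketch below instead approximates sensitivity by Monte-Carlo perturbation of the inputs, and all function names, the RBF kernel choice, and the perturbation width `q` are assumptions for illustration only.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian (RBF) kernel on pairwise squared Euclidean distances
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # Dense LS-SVM: solve the linear system
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X, Xtr, alpha, b, sigma):
    return rbf_kernel(X, Xtr, sigma) @ alpha + b

def lgem_score(X, y, gamma, sigma, q=0.1, n_perturb=20, rng=None):
    # L-GEM-style objective: empirical MSE + sensitivity of outputs to
    # small input perturbations drawn uniformly from [-q, q] per feature.
    # (Monte-Carlo stand-in for the paper's analytic sensitivity measure.)
    rng = np.random.default_rng(0) if rng is None else rng
    alpha, b = lssvm_fit(X, y, gamma, sigma)
    f = lssvm_predict(X, X, alpha, b, sigma)
    mse = np.mean((f - y) ** 2)
    sens = np.mean([
        np.mean((lssvm_predict(X + rng.uniform(-q, q, size=X.shape),
                               X, alpha, b, sigma) - f) ** 2)
        for _ in range(n_perturb)
    ])
    return mse + sens

def select_hyperparameters(X, y, gammas, sigmas):
    # Grid search: return the (gamma, sigma) pair minimizing the score
    best = min((lgem_score(X, y, g, s), g, s)
               for g in gammas for s in sigmas)
    return best[1], best[2]
```

A pair chosen this way trades the two terms off directly: a very large `gamma` drives the training MSE down but tends to inflate the sensitivity term, which is exactly the over-fitting behavior the L-GEM criterion is meant to penalize.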

