Support vector regression with the weighted absolute deviation error loss function

2018 · Vol. 29 (6) · pp. 1707-1719
Author(s): Kang-Mo Jung

2014 · Vol. 2014 · pp. 1-8
Author(s): Kuaini Wang, Jingjing Zhang, Yanyan Chen, Ping Zhong

Least squares support vector machine (LS-SVM) is a powerful tool for pattern classification and regression estimation. However, LS-SVM is sensitive to large noise and outliers because it employs the squared loss function. To address this problem, we propose an absolute deviation loss function to reduce the effect of outliers and derive a robust regression model termed least absolute deviation support vector regression (LAD-SVR). Because the proposed loss function is not differentiable, we approximate it with a smooth function and develop a Newton algorithm to solve the robust model. Numerical experiments on both artificial datasets and benchmark datasets demonstrate the robustness and effectiveness of the proposed method.
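
The abstract describes fitting a kernel regression model under a smoothed absolute-deviation loss with a Newton solver. The sketch below is a minimal illustration of that idea rather than the authors' exact algorithm: the smoothing h(r) = sqrt(r^2 + eps^2), the Gaussian kernel, the regularizer (lam/2) * alpha^T K alpha, and all hyperparameter values are assumptions introduced here for demonstration.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * d2)

def fit_lad_svr(X, y, lam=1.0, gamma=1.0, eps=1e-2, n_iter=50, tol=1e-8):
    """Kernel regression with a smoothed absolute-deviation loss,
    fitted by Newton iterations.

    Assumed (illustrative) formulation:
      model      f(x) = sum_j alpha_j K(x_j, x) + b
      objective  (lam/2) * alpha^T K alpha + sum_i h(y_i - f(x_i)),
      smoothing  h(r) = sqrt(r^2 + eps^2)  (the paper's smoothing may differ).
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(n)
    b = 0.0

    for _ in range(n_iter):
        r = y - K @ alpha - b                      # residuals
        s = np.sqrt(r**2 + eps**2)
        h1 = r / s                                 # h'(r)
        h2 = eps**2 / s**3                         # h''(r) > 0

        # Gradient of the objective w.r.t. (alpha, b)
        g_alpha = lam * (K @ alpha) - K @ h1
        g_b = -h1.sum()
        g = np.concatenate([g_alpha, [g_b]])

        # Hessian blocks
        D = np.diag(h2)
        H_aa = lam * K + K @ D @ K
        H_ab = K @ h2
        H_bb = h2.sum()
        H = np.block([[H_aa, H_ab[:, None]],
                      [H_ab[None, :], np.array([[H_bb]])]])
        H += 1e-8 * np.eye(n + 1)                  # ridge for numerical stability

        step = np.linalg.solve(H, -g)              # Newton step
        alpha += step[:n]
        b += step[n]
        if np.linalg.norm(step) < tol:
            break

    def predict(X_new):
        return rbf_kernel(X_new, X, gamma) @ alpha + b
    return predict

# Toy usage: noisy sine curve with a few injected outliers
rng = np.random.default_rng(0)
X = np.linspace(0, 4, 80)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(80)
y[::15] += 3.0
predict = fit_lad_svr(X, y, lam=0.5, gamma=2.0)
print(np.abs(predict(X) - np.sin(X).ravel()).mean())
```

Under these assumptions the smoothed loss is twice differentiable with a positive semidefinite Hessian, so plain Newton steps with a small ridge term are enough for a small toy problem; the absolute-deviation shape keeps the injected outliers from dominating the fit the way a squared loss would.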


2017 · Vol. 131 · pp. 183-194
Author(s): Chuanfa Chen, Yanyan Li, Changqing Yan, Jinyun Guo, Guolin Liu
