Least absolute deviation-based robust support vector regression

2017 ◽  
Vol 131 ◽  
pp. 183-194 ◽  
Author(s):  
Chuanfa Chen ◽  
Yanyan Li ◽  
Changqing Yan ◽  
Jinyun Guo ◽  
Guolin Liu

2014 ◽  
Vol 2014 ◽  
pp. 1-8
Author(s):  
Kuaini Wang ◽  
Jingjing Zhang ◽  
Yanyan Chen ◽  
Ping Zhong

Least squares support vector machine (LS-SVM) is a powerful tool for pattern classification and regression estimation. However, LS-SVM is sensitive to large noise and outliers because it employs the squared loss function. To address this problem, we propose an absolute deviation loss function to reduce the effect of outliers and derive a robust regression model termed least absolute deviation support vector regression (LAD-SVR). Since the proposed loss function is not differentiable, we approximate it by constructing a smooth function and develop a Newton algorithm to solve the robust model. Numerical experiments on both artificial and benchmark datasets demonstrate the robustness and effectiveness of the proposed method.
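The core smoothing idea in the abstract — replace the non-differentiable absolute loss |r| with the differentiable surrogate sqrt(r² + ε²) — can be sketched in a few lines. The solver below is an illustrative stand-in, not the paper's method: it fits a regularised linear model (omitting the SVR machinery) and uses iteratively reweighted least squares on the smoothed loss, a simpler majorise-minimise scheme than the Newton algorithm the authors develop; all parameter names and defaults are assumptions.

```python
import numpy as np

def lad_fit(X, y, lam=1e-2, eps=1e-3, iters=100):
    # Minimise  lam/2 * ||w||^2 + sum_i sqrt(r_i^2 + eps^2),  r = y - X w.
    # sqrt(r^2 + eps^2) is the smooth surrogate for |r|; the update below is
    # iteratively reweighted least squares (a majorise-minimise scheme), used
    # here as a simpler stand-in for the paper's Newton solver.
    d = X.shape[1]
    w = np.zeros(d)
    for _ in range(iters):
        r = y - X @ w
        s = np.sqrt(r * r + eps * eps)      # smoothed |r|, never zero
        Wt = 1.0 / s                        # majoriser weights
        A = X.T @ (X * Wt[:, None]) + lam * np.eye(d)
        w_new = np.linalg.solve(A, X.T @ (Wt * y))
        if np.linalg.norm(w_new - w) < 1e-9:
            return w_new
        w = w_new
    return w
```

Because each point's weight 1/s saturates once its residual is large, gross outliers contribute almost nothing to the normal equations, which is exactly the robustness the absolute-deviation loss buys over the squared loss.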


2017 ◽  
Vol 11 (8) ◽  
pp. 92
Author(s):  
Waleed Dhhan ◽  
Habshah Midi ◽  
Thaera Alameer

Support vector regression is used to evaluate linear and non-linear relationships among variables. Although it is a non-parametric technique, it is still affected by outliers, because they may be selected as support vectors. In this article, we propose a robust support vector regression for linear and nonlinear target functions. To achieve this, a support vector regression model with fixed parameters is used to detect abnormal points in the data set and minimise their effects. The efficiency of the proposed method is investigated using real and simulated examples.
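The two-stage idea described here — a preliminary fit with fixed parameters, then residual-based screening of abnormal points before refitting — can be sketched as follows. This is an illustrative version only: RBF kernel ridge regression stands in for SVR to keep the example dependency-free, and the function names, the MAD-based threshold, and all parameter values are assumptions, not the authors' exact procedure.

```python
import numpy as np

def krr_fit_predict(X, y, Xq, gamma=5.0, lam=1.0):
    # RBF kernel ridge regression, used here as a lightweight stand-in
    # for an SVR model with fixed hyperparameters.
    K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    Kq = np.exp(-gamma * ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return Kq @ alpha

def screen_outliers(X, y, k=3.0, **kw):
    # 1) preliminary fit with fixed parameters;
    # 2) flag points whose residual deviates from the median residual by
    #    more than k robust standard deviations (MAD scaled to sigma);
    # 3) return the clean subset, ready for refitting.
    r = y - krr_fit_predict(X, y, X, **kw)
    med = np.median(r)
    mad = np.median(np.abs(r - med))
    keep = np.abs(r - med) <= k * 1.4826 * mad
    return X[keep], y[keep], keep
```

After screening, the model is refit on the returned clean subset; using the median and MAD rather than the mean and standard deviation keeps the threshold itself from being inflated by the very outliers being hunted.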


2016 ◽  
Vol 67 (5) ◽  
pp. 735-742 ◽  
Author(s):  
Dohyun Kim ◽  
Chungmok Lee ◽  
Sangheum Hwang ◽  
Myong K Jeong
