Least Absolute Deviation Support Vector Regression

2014, Vol. 2014, pp. 1-8
Author(s): Kuaini Wang, Jingjing Zhang, Yanyan Chen, Ping Zhong

Least squares support vector machine (LS-SVM) is a powerful tool for pattern classification and regression estimation. However, LS-SVM is sensitive to heavy noise and outliers because it employs the squared loss function. To address this problem, we propose an absolute deviation loss function to reduce the effect of outliers and derive a robust regression model, termed least absolute deviation support vector regression (LAD-SVR). Since the proposed loss function is not differentiable, we approximate it by constructing a smooth function and develop a Newton algorithm to solve the robust model. Numerical experiments on both artificial and benchmark datasets demonstrate the robustness and effectiveness of the proposed method.
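The abstract's core idea, smoothing the non-differentiable absolute-deviation loss and then minimizing with Newton's method, can be sketched for a linear model as follows. This is an illustrative sketch, not the authors' exact algorithm: the smoothing sqrt(r^2 + eps^2), the ridge parameter `lam`, and the backtracking line search are all assumptions.

```python
import numpy as np

def lad_svr_fit(X, y, lam=1e-3, eps=0.05, n_iter=100):
    """Robust linear fit via a smoothed absolute-deviation loss.

    Minimizes  0.5*lam*||w||^2 + sum_i sqrt(r_i^2 + eps^2)  with Newton's
    method, where r = y - [X, 1] @ w.  sqrt(r^2 + eps^2) is a smooth,
    differentiable surrogate for |r| (an assumed smoothing choice).
    """
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    w = np.zeros(Xb.shape[1])

    def objective(w):
        r = y - Xb @ w
        return 0.5 * lam * w @ w + np.sum(np.sqrt(r**2 + eps**2))

    for _ in range(n_iter):
        r = y - Xb @ w
        s = np.sqrt(r**2 + eps**2)              # smoothed |r|
        grad = lam * w - Xb.T @ (r / s)
        # Hessian: d(r/s)/dr = eps^2 / s^3, always positive -> H is PD
        H = lam * np.eye(len(w)) + Xb.T @ ((eps**2 / s**3)[:, None] * Xb)
        step = np.linalg.solve(H, grad)
        # Backtracking line search keeps the smoothed objective decreasing
        t, J0 = 1.0, objective(w)
        while objective(w - t * step) > J0 and t > 1e-8:
            t *= 0.5
        w = w - t * step
        if t * np.linalg.norm(step) < 1e-8:
            break
    return w

# Demo: a line with one grossly corrupted target; the bounded influence of
# the absolute-deviation loss keeps the fit close to the true coefficients.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (50, 1))
y = 2.0 * X[:, 0] + 0.5 + 0.05 * rng.standard_normal(50)
y[0] += 20.0                        # inject an outlier
w = lad_svr_fit(X, y)               # w[0] ~ slope 2, w[1] ~ intercept 0.5
```

A squared-loss fit on the same data would be pulled visibly toward the outlier; here its gradient contribution is capped near sign(r), which is exactly the robustness the abstract claims.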

2017, Vol. 131, pp. 183-194
Author(s): Chuanfa Chen, Yanyan Li, Changqing Yan, Jinyun Guo, Guolin Liu

2021, Vol. 19 (3), pp. 85-109
Author(s): Jingying Lin, Caio Almeida

Pricing American options accurately is of great theoretical and practical importance. We propose using machine learning methods, including support vector regression (SVR) and classification and regression trees, to extend the traditional Longstaff-Schwartz approach by replacing the OLS regression step in its Monte Carlo simulation. We apply our approach to both simulated data and market data from the S&P 500 Index option market in 2019. Our results suggest that support vector regression can be an alternative to the existing OLS-based pricing method, requiring fewer simulations and reducing the vulnerability to misspecification of the basis functions.
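The structure the abstract describes, Longstaff-Schwartz backward induction with a swappable regression step, can be sketched as below. The GBM benchmark parameters (S0=36, K=40, r=0.06, sigma=0.2, T=1) are the classic Longstaff-Schwartz test case; the quadratic-polynomial default and the commented SVR drop-in (with its hyperparameters) are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def poly_ols_regressor(X, y):
    # Default regression step: OLS on a quadratic polynomial basis,
    # the classic Longstaff-Schwartz choice.
    coef = np.polyfit(X, y, 2)
    return lambda x: np.polyval(coef, x)

def american_put_lsm(S0=36.0, K=40.0, r=0.06, sigma=0.2, T=1.0,
                     n_steps=50, n_paths=20000,
                     regressor=poly_ols_regressor, seed=0):
    """Least-squares Monte Carlo price of an American put under GBM.

    `regressor(X, y) -> predict` is the pluggable step the paper studies:
    it estimates the continuation value from current prices.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    # Antithetic GBM paths; column t holds prices at time (t + 1) * dt.
    z = rng.standard_normal((n_paths // 2, n_steps))
    z = np.vstack([z, -z])
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    disc = np.exp(-r * dt)
    cash = np.maximum(K - S[:, -1], 0.0)        # payoff at maturity
    for t in range(n_steps - 2, -1, -1):
        cash *= disc                            # discount cashflows to time t
        itm = S[:, t] < K                       # regress on in-the-money paths only
        predict = regressor(S[itm, t], cash[itm])
        exercise = K - S[itm, t]
        stop = exercise > predict(S[itm, t])    # exercise if payoff beats continuation
        idx = np.flatnonzero(itm)[stop]
        cash[idx] = exercise[stop]
    return disc * cash.mean()                   # discount first exercise date to today

# Drop-in SVR step (assumes scikit-learn; hyperparameters are illustrative):
# from sklearn.svm import SVR
# def svr_regressor(X, y):
#     model = SVR(kernel="rbf", C=10.0).fit(X[:, None], y)
#     return lambda x: model.predict(x[:, None])

price = american_put_lsm()   # estimate for the benchmark American put
```

Because the regression step is just a function argument, swapping OLS for SVR (or a tree ensemble) changes one line, which is what makes the comparison in the paper clean to set up.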

