Robust Regression and Outlier Detection.


Author(s): Zhuang Qi, Dazhi Jiang, Xiaming Chen

In linear regression, outliers can severely distort the estimation of the model parameters and the prediction of the response, so outlier detection is a key step in data analysis. In this paper, we use a mean shift model and apply a penalty function to the mean shift parameters, which encourages a sparse parameter vector. We choose Sorted L1 regularization (SLOPE), which yields a convex objective and has good statistical properties for parameter selection. We apply an iterative procedure that combines a gradient descent step with parameter selection at each iteration. Because the computation of an inverse matrix is avoided, the algorithm has high computational efficiency. Finally, we use cross-validation (CV) and the Bayesian Information Criterion (BIC) to tune the regularization parameters, which helps the procedure identify outliers and obtain more robust regression coefficients. Experimental comparisons with other methods show that our procedure performs well across all aspects of outlier detection.
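Since the abstract describes the method only in words, the following sketch illustrates the idea in Python under stated assumptions; it is not the authors' implementation. It models y = X*beta + gamma + noise, where a nonzero component of gamma marks an outlier, applies the SLOPE (sorted-L1) proximal operator to gamma inside a proximal gradient loop, and avoids any matrix inversion. The function names prox_sorted_l1 and mean_shift_slope, the linearly decreasing lambda sequence, the fixed step size, and the iteration count are all illustrative choices; the CV/BIC tuning described in the abstract is omitted.

import numpy as np

def prox_sorted_l1(v, lam):
    # Proximal operator of the sorted-L1 (SLOPE) penalty
    # J(x) = sum_i lam[i] * |x|_(i), with lam non-increasing.
    # Stack-based pool-adjacent-violators scheme (Bogdan et al., 2015).
    sign = np.sign(v)
    absv = np.abs(v)
    order = np.argsort(-absv)            # indices of |v| in decreasing order
    w = absv[order] - lam                # shifted magnitudes to be projected
    blocks = []                          # (start, end, block mean)
    for i, wi in enumerate(w):
        start, end, val = i, i, wi
        # merge adjacent blocks that violate the non-increasing constraint
        while blocks and blocks[-1][2] <= val:
            s, e, m = blocks.pop()
            val = (val * (end - start + 1) + m * (e - s + 1)) / (end - s + 1)
            start = s
        blocks.append((start, end, val))
    x = np.zeros(len(w))
    for s, e, m in blocks:
        x[s:e + 1] = max(m, 0.0)         # clip at zero (magnitudes are nonnegative)
    out = np.empty(len(w))
    out[order] = x                       # undo the sort
    return sign * out

def mean_shift_slope(X, y, lam_max=2.0, n_iter=1000):
    # Proximal-gradient sketch for the mean-shift model
    #   y = X @ beta + gamma + noise,
    # with a SLOPE penalty on gamma; observations with nonzero
    # gamma are flagged as outliers.  No matrix inversion is used.
    n, p = X.shape
    # hypothetical non-increasing lambda sequence; the paper tunes it via CV/BIC
    lam = lam_max * np.linspace(1.0, 0.2, n)
    A = np.hstack([X, np.eye(n)])
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the smooth part
    beta, gamma = np.zeros(p), np.zeros(n)
    for _ in range(n_iter):
        r = y - X @ beta - gamma             # current residual
        beta = beta + step * (X.T @ r)       # plain gradient step on the coefficients
        gamma = prox_sorted_l1(gamma + step * r, step * lam)  # prox step on the shifts
    return beta, gamma, np.flatnonzero(gamma != 0)

# toy usage: 100 observations, the first 5 responses shifted upward
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
y[:5] += 8.0
beta_hat, gamma_hat, flagged = mean_shift_slope(X, y)
print(beta_hat, flagged)

Because the prox step produces exact zeros, the flagged observations can be read directly from the nonzero entries of gamma; in practice the lambda sequence would be selected with the CV or BIC rules mentioned in the abstract.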


