Inference robust to outliers with ℓ1-norm penalization

2020 ◽  
Vol 24 ◽  
pp. 688-702
Author(s):  
Jad Beyhum

This paper considers inference in a linear regression model with outliers, in which the number of outliers can grow with the sample size while their proportion goes to 0. We propose a square-root lasso ℓ1-norm penalized estimator. We derive rates of convergence and establish asymptotic normality. Our estimator has the same asymptotic variance as the OLS estimator in the standard linear model. This enables us to build tests and confidence sets in the usual and simple manner. The proposed procedure is also computationally advantageous: it amounts to solving a convex optimization program. Overall, the suggested approach offers a practical robust alternative to the ordinary least squares estimator.
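
A minimal sketch of the kind of convex program the abstract refers to, assuming a mean-shift formulation y = Xβ + α + ε in which a sparse vector α of per-observation shifts captures the outliers and is penalized through its ℓ1-norm inside a square-root-lasso-type objective. The penalty level and the use of cvxpy are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
import cvxpy as cp  # generic convex solver; used here purely for illustration

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
y = X @ np.ones(p) + rng.normal(size=n)
y[:10] += 8.0                                 # contaminate a small fraction of observations

beta = cp.Variable(p)                         # regression coefficients
alpha = cp.Variable(n)                        # sparse per-observation shifts (outliers)
lam = 1.1 * np.sqrt(2 * np.log(n) / n)        # illustrative penalty level (assumption)
# square-root-lasso-type objective: ||y - X beta - alpha||_2 + lam * ||alpha||_1
cp.Problem(cp.Minimize(cp.norm(y - X @ beta - alpha, 2) + lam * cp.norm(alpha, 1))).solve()

print("robust estimate:", np.round(beta.value, 3))
print("flagged outliers:", np.where(np.abs(alpha.value) > 1e-3)[0])
```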

2020 ◽  
Vol 2020 ◽  
pp. 1-24
Author(s):  
Adewale F. Lukman ◽  
Kayode Ayinde ◽  
B. M. Golam Kibria ◽  
Segun L. Jegede

The general linear regression model has been one of the most frequently used models over the years, with the ordinary least squares (OLS) estimator used to estimate its parameters. The OLS estimator, however, suffers from the problems of multicollinearity and outliers, which lead to unfavourable results. This study proposes a two-parameter ridge-type modified M-estimator (RTMME), based on the M-estimator, to deal with the combined problem of multicollinearity and outliers. Theoretical proofs, a Monte Carlo simulation, and a numerical example show that the proposed estimator outperforms the modified ridge-type estimator and several other existing estimators.
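
The abstract does not reproduce the RTMME formula, so the sketch below only illustrates the general idea of combining ridge shrinkage (against multicollinearity) with M-estimation (against outliers): a ridge-penalized Huber M-estimator solved by iteratively reweighted least squares. The tuning constants and the single ridge parameter k are assumptions; the authors' estimator uses two parameters and a different construction.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber psi(r)/r weights with the standard tuning constant c."""
    a = np.abs(r)
    w = np.ones_like(a)
    w[a > c] = c / a[a > c]
    return w

def ridge_huber_m_estimator(X, y, k=1.0, c=1.345, n_iter=50, tol=1e-8):
    """Illustrative ridge-penalized Huber M-estimator (not the authors' RTMME)."""
    n, p = X.shape
    beta = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)          # ridge start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12      # robust scale (MAD)
        w = huber_weights(r / s, c)
        Xw = X * w[:, None]                                           # row-weighted design
        beta_new = np.linalg.solve(X.T @ Xw + k * np.eye(p), Xw.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```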


2009 ◽  
Vol 15 (53) ◽  
pp. 1
Author(s):  
حازم منصور كوركيس

In this paper, the singular value decomposition is used to estimate the ridge parameter of the ridge regression estimator, which is an alternative to the ordinary least squares estimator when the general linear regression model suffers from near multicollinearity.
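
A hedged sketch of one way the SVD can be used to pick a ridge parameter: compute the canonical-form coefficients from the decomposition X = UDV' and plug them into a Hoerl–Kennard-type rule k = σ̂²/max α̂ᵢ². This specific rule is an assumption for illustration; the paper's own SVD-based estimator of the ridge parameter may differ.

```python
import numpy as np

def svd_ridge_parameter(X, y):
    """Hoerl-Kennard-type ridge parameter computed via the SVD of X (illustrative rule)."""
    n, p = X.shape
    U, d, Vt = np.linalg.svd(X, full_matrices=False)    # X = U diag(d) V'
    alpha = (U.T @ y) / d                               # canonical-form OLS coefficients
    beta_ols = Vt.T @ alpha
    sigma2 = np.sum((y - X @ beta_ols) ** 2) / (n - p)  # OLS error-variance estimate
    return sigma2 / np.max(alpha ** 2)

def ridge_fit(X, y, k):
    """Ordinary ridge estimator (X'X + kI)^{-1} X'y for the chosen k."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```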


2009 ◽  
Vol 25 (1) ◽  
pp. 298-301 ◽  
Author(s):  
Sung Jae Jun ◽  
Joris Pinkse

It is well known that in standard linear regression models with independent and identically distributed data and homoskedasticity, adding “irrelevant regressors” hurts (asymptotic) efficiency unless such irrelevant regressors are orthogonal to the remaining regressors. We show, however, that under (conditional) heteroskedasticity one can always find “irrelevant regressors” whose addition to the model attains the asymptotic variance of the generalized least squares estimator.


2014 ◽  
Vol 2014 ◽  
pp. 1-8 ◽  
Author(s):  
Jibo Wu

We introduce an unbiased two-parameter estimator based on prior information and the two-parameter estimator proposed by Özkale and Kaçıranlar (2007). We then discuss its properties; our results show that the new estimator is better than the two-parameter estimator, the ordinary least squares estimator, and the almost unbiased two-parameter estimator proposed by Wu and Yang (2013). Finally, we present a simulation study to illustrate the theoretical results.
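
For reference, the two-parameter estimator of Özkale and Kaçıranlar (2007) is commonly written as β̂(k, d) = (X'X + kI)⁻¹(X'y + k d β̂_OLS), which reduces to OLS when k = 0 and to ordinary ridge when d = 0. The sketch below computes this baseline estimator only, under the assumption that this is the intended form; the unbiased, prior-information-based modification introduced in the paper is not reproduced here.

```python
import numpy as np

def two_parameter_estimator(X, y, k, d):
    """Ozkale-Kaciranlar two-parameter estimator (form assumed in the lead-in above)."""
    p = X.shape[1]
    XtX = X.T @ X
    beta_ols = np.linalg.solve(XtX, X.T @ y)
    return np.linalg.solve(XtX + k * np.eye(p), X.T @ y + k * d * beta_ols)
```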


1982 ◽  
Vol 19 (A) ◽  
pp. 225-239 ◽  
Author(s):  
C. R. Heathcote

The standard linear regression model is analysed using a method called functional least squares which yields a family of estimators for the slope parameter indexed by a real variable t, |t| ≦ T. The choice t = 0 corresponds to ordinary least squares, non-zero values being appropriate if the error distribution is long-tailed, and it is argued that the approach is a natural extension of least squares methodology. It emerges that the asymptotic normal distribution of these estimators has a covariance matrix characterised by a scalar function of t, called the variance function, which is determined by the error distribution. Properties of this variance function suggest graphical criteria for detecting departures from normality.
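
A hedged sketch of a functional-least-squares-type fit under one common statement of the criterion: for a fixed t, minimise −t⁻² log|φₙ(t)|², where φₙ is the empirical characteristic function of the residuals. As t → 0 this criterion approaches the residual variance, consistent with the abstract's remark that t = 0 corresponds to ordinary least squares. The exact criterion, the treatment of the intercept (the modulus is location-invariant, so only slope parameters are identified), and the choice of t may differ from Heathcote's paper.

```python
import numpy as np
from scipy.optimize import minimize

def fls_criterion(beta, X, y, t):
    """-t^{-2} log |phi_n(t)|^2 for the empirical characteristic function of residuals."""
    r = y - X @ beta
    phi = np.mean(np.exp(1j * t * r))
    return -np.log(np.abs(phi) ** 2) / t ** 2

def functional_least_squares(X, y, t=0.5):
    """Slope estimate minimising the criterion at a fixed t; X should carry no constant column."""
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]       # OLS starting value (t = 0 analogue)
    res = minimize(fls_criterion, beta0, args=(X, y, t), method="Nelder-Mead")
    return res.x
```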


Stats ◽  
2020 ◽  
Vol 3 (4) ◽  
pp. 526-541
Author(s):  
Issam Dawoud ◽  
B. M. Golam Kibria

In a multiple linear regression model, the ordinary least squares estimator is inefficient when the multicollinearity problem exists. Many authors have proposed different estimators to overcome the multicollinearity problem for linear regression models. This paper introduces a new regression estimator, called the Dawoud–Kibria estimator, as an alternative to the ordinary least squares estimator. Theory and simulation results show that this estimator performs better than other regression estimators under some conditions, according to the mean squared error criterion. Real-life datasets are used to illustrate the findings of the paper.


Author(s):  
Masayuki Hirukawa ◽  
Di Liu ◽  
Artem Prokhorov

Economists often use matched samples, especially when dealing with earnings data where some observations are missing in one sample and need to be imputed from another sample. Hirukawa and Prokhorov (2018, Journal of Econometrics 203: 344–358) show that the ordinary least-squares estimator using matched samples is inconsistent and propose two consistent estimators. We describe a new command, msreg, that implements these two consistent estimators based on two samples. The estimators attain the parametric convergence rate if the number of continuous matching variables is no greater than four.

