A new version of unbiased ridge regression estimator under the stochastic restricted linear regression model

Author(s):  
Mustafa I. Alheety ◽  
B. M. Golam Kibria
2021 ◽  
Vol 26 (2) ◽  
Author(s):  
Bader Aboud ◽  
Mustafa Ismaeel Naif

In the linear regression model, restricted biased estimation is one of the important methods for addressing high variance and multicollinearity. In this paper, we carry out a simulation study of some restricted biased estimators. The mean squared error (MSE) criterion is used to compare them. The simulation study shows that the restricted modified unbiased ridge regression (RMUR) estimator proposed by Bader and Alheety (2020) performs better than the other estimators. A numerical example is also considered to illustrate the performance of the estimators.
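The Monte Carlo MSE comparison described in the abstract can be illustrated with a generic sketch. The RMUR estimator's formula is not given above, so plain ridge regression stands in for the restricted estimators; the design, correlation level, and biasing parameter below are assumptions for illustration only.

```python
import numpy as np

def simulate_mse(n=50, p=4, rho=0.95, k=1.0, reps=200, seed=0):
    """Monte Carlo MSE of OLS vs. ridge under multicollinearity.

    Hypothetical setup: equicorrelated regressors (pairwise
    correlation rho) induce near-multicollinearity; the RMUR
    estimator itself is not reproduced here.
    """
    rng = np.random.default_rng(seed)
    beta = np.ones(p)
    cov = np.full((p, p), rho) + (1 - rho) * np.eye(p)
    L = np.linalg.cholesky(cov)
    mse_ols = mse_ridge = 0.0
    for _ in range(reps):
        X = rng.standard_normal((n, p)) @ L.T
        y = X @ beta + rng.standard_normal(n)
        XtX = X.T @ X
        b_ols = np.linalg.solve(XtX, X.T @ y)
        b_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
        mse_ols += np.sum((b_ols - beta) ** 2)
        mse_ridge += np.sum((b_ridge - beta) ** 2)
    return mse_ols / reps, mse_ridge / reps
```

With strongly correlated regressors, the averaged squared estimation error of ridge is typically well below that of OLS, which is the kind of comparison the simulation studies above report.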


2020 ◽  
Vol 2020 ◽  
pp. 1-17
Author(s):  
Adewale F. Lukman ◽  
B. M. Golam Kibria ◽  
Kayode Ayinde ◽  
Segun L. Jegede

Motivated by the ridge regression (Hoerl and Kennard, 1970) and Liu (1993) estimators, this paper proposes a modified Liu estimator to solve the multicollinearity problem for the linear regression model. This modification places this estimator in the class of the ridge and Liu estimators with a single biasing parameter. Theoretical comparisons, real-life application, and simulation results show that it consistently dominates the usual Liu estimator. Under some conditions, it performs better than the ridge regression estimators in the smaller MSE sense. Two real-life data sets are analyzed to illustrate the findings of the paper, and the performances of the estimators are assessed by the MSE and the mean squared prediction error. The application results agree with the theoretical and simulation results.
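Since the abstract builds on the Liu (1993) estimator, a minimal sketch of that baseline may be useful; the paper's modified Liu variant is not reproduced here, only the usual Liu estimator it compares against.

```python
import numpy as np

def liu_estimator(X, y, d):
    """Liu (1993) estimator: (X'X + I)^{-1} (X'y + d * b_ols).

    Equivalent to (X'X + I)^{-1} (X'X + d I) b_ols; it reduces to
    OLS at d = 1 and shrinks toward zero as d decreases.
    """
    p = X.shape[1]
    XtX = X.T @ X
    b_ols = np.linalg.solve(XtX, X.T @ y)
    return np.linalg.solve(XtX + np.eye(p), X.T @ y + d * b_ols)
```

In the eigenbasis of X'X each OLS component is multiplied by (λ + d)/(λ + 1), which is why d acts as a single biasing parameter interpolating between a shrunken estimate and OLS.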


Scientifica ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-16 ◽  
Author(s):  
B. M. Golam Kibria ◽  
Adewale F. Lukman

The ridge regression-type (Hoerl and Kennard, 1970) and Liu-type (Liu, 1993) estimators are consistently attractive shrinkage methods for reducing the effects of multicollinearity in both linear and nonlinear regression models. This paper proposes a new estimator to solve the multicollinearity problem for the linear regression model. Theory and simulation results show that, under some conditions, it performs better than both the Liu and ridge regression estimators in the smaller MSE sense. Two real-life (chemical and economic) data sets are analyzed to illustrate the findings of the paper.
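For context, here is a hedged sketch of a one-parameter ridge/Liu-type combination of the kind this line of work studies. The exact form of the paper's new estimator is an assumption here; the form below is the one commonly cited as the Kibria–Lukman (KL) estimator, not taken verbatim from the abstract.

```python
import numpy as np

def kl_estimator(X, y, k):
    """Hedged sketch of a one-parameter ridge/Liu-type estimator:
    (X'X + kI)^{-1} (X'X - kI) b_ols.

    Each OLS component (in the eigenbasis of X'X) is multiplied by
    (lambda - k)/(lambda + k), whose magnitude is below 1 for any
    k > 0, so the estimate always shrinks relative to OLS.
    """
    p = X.shape[1]
    XtX = X.T @ X
    b_ols = np.linalg.solve(XtX, X.T @ y)
    return np.linalg.solve(XtX + k * np.eye(p), (XtX - k * np.eye(p)) @ b_ols)
```

At k = 0 the estimator reduces to OLS, mirroring how ridge and Liu estimators recover OLS at their boundary parameter values.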


2021 ◽  
Vol 2099 (1) ◽  
pp. 012024
Author(s):  
V N Lutay ◽  
N S Khusainov

Abstract This paper discusses constructing a linear regression model with regularization of the system matrix of the normal equations. In contrast to conventional ridge regression, where a positive parameter is added to all diagonal terms of the matrix, the proposed method increases only those diagonal entries that correspond to highly correlated data. This reduces the condition number of the matrix and, consequently, the magnitudes of the corresponding coefficients of the regression equation. The selection of the entries to be increased is based on the triangular decomposition of the correlation matrix of the original dataset. The effectiveness of the method is tested on a known dataset, and it is compared not only with ridge regression but also with the results of the widely used LARS and Lasso algorithms.
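A minimal sketch of the idea, under stated assumptions: columns that are nearly linear combinations of preceding ones produce small pivots in the Cholesky (triangular) factor of the correlation matrix, and only those columns receive a diagonal penalty. The threshold `tol`, penalty `alpha`, and the exact selection rule are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def selective_ridge(X, y, alpha=1.0, tol=0.3):
    """Selective regularization sketch: penalize only the diagonal
    entries of X'X whose columns are highly correlated with earlier
    columns, flagged via small pivots of the Cholesky factor of the
    correlation matrix (hypothetical selection rule)."""
    R = np.corrcoef(X, rowvar=False)
    L = np.linalg.cholesky(R)
    # A small Cholesky pivot means the column is nearly explained
    # by the columns before it (high multicollinearity).
    flagged = np.diag(L) < tol
    XtX = X.T @ X
    penalty = np.diag(alpha * flagged.astype(float))
    return np.linalg.solve(XtX + penalty, X.T @ y)
```

When no column is flagged the penalty matrix is zero and the fit coincides with OLS, so — unlike ordinary ridge — well-conditioned directions are left unshrunk.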


2009 ◽  
Vol 15 (53) ◽  
pp. 1
Author(s):  
حازم منصور كوركيس

In this paper, the method of singular value decomposition is used to estimate the ridge parameter of the ridge regression estimator, which is an alternative to the ordinary least squares estimator when the general linear regression model suffers from near-multicollinearity.
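A hedged sketch of how the SVD can drive both the ridge fit and the choice of the ridge parameter. The specific rule below (a Hoerl–Kennard style choice, k = σ̂² / max α̂ⱼ²) is a common textbook option assumed for illustration; the paper's own SVD-based estimator of k is not reproduced here.

```python
import numpy as np

def svd_ridge(X, y, k=None):
    """Ridge regression via the SVD X = U S V'.

    If k is None, pick it by the Hoerl-Kennard style rule
    k = sigma^2 / max(alpha_j^2), where alpha = V' b_ols are the
    OLS coefficients in the rotated (principal-axis) basis.
    This choice of rule is an assumption, not the paper's method.
    """
    n, p = X.shape
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    alpha = (U.T @ y) / s          # OLS coefficients in rotated basis
    if k is None:
        resid = y - U @ (U.T @ y)  # residual of the OLS projection
        sigma2 = resid @ resid / (n - p)
        k = sigma2 / np.max(alpha ** 2)
    # Ridge shrinks each SVD component by s^2 / (s^2 + k).
    shrink = s ** 2 / (s ** 2 + k)
    return Vt.T @ (shrink * alpha), k
```

The SVD route makes the multicollinearity explicit: small singular values are exactly the directions the shrinkage factor s²/(s² + k) damps most.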

