Feasible Generalized Stein-Rule Restricted Ridge Regression Estimators

2017 ◽  
Vol 13 (1) ◽  
pp. 77-97
Author(s):  
Nimet Özbay ◽  
Issam Dawoud ◽  
Selahattin Kaçıranlar

Several versions of the Stein-rule estimator of the coefficient vector in a linear regression model have been proposed in the literature. In the present paper, we propose new feasible generalized Stein-rule restricted ridge regression estimators to address the multicollinearity and autocorrelation problems simultaneously in the general linear regression model, when certain exact restrictions are placed on the coefficients. Moreover, a Monte Carlo simulation experiment is performed to investigate the performance of the proposed estimator relative to the others.
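As background for the abstract above, a minimal sketch of a textbook restricted ridge estimator under exact restrictions R·beta = r may be helpful. This is the generic restricted-ridge form only, not the feasible generalized Stein-rule version the paper proposes; the data, restriction matrix, and biasing parameter k below are illustrative assumptions.

```python
import numpy as np

def restricted_ridge(X, y, R, r, k):
    """Generic restricted ridge estimator: the ridge solution projected
    onto the exact restriction set R @ beta = r. A textbook sketch, not
    the feasible generalized Stein-rule estimator of the paper."""
    p = X.shape[1]
    A = X.T @ X + k * np.eye(p)            # regularized normal matrix
    A_inv = np.linalg.inv(A)
    b_ridge = A_inv @ X.T @ y              # ordinary ridge estimate
    # Correction term that forces the estimate to satisfy R @ beta = r
    M = R @ A_inv @ R.T
    correction = A_inv @ R.T @ np.linalg.solve(M, r - R @ b_ridge)
    return b_ridge + correction

# Tiny illustration: the restriction forces the two coefficients to be equal
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=50)
R = np.array([[1.0, -1.0]])
r = np.array([0.0])
b = restricted_ridge(X, y, R, r, k=0.5)
```

By construction the correction term makes the estimate satisfy the restrictions exactly, whatever the value of k.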

2020 ◽  
Vol 2020 ◽  
pp. 1-24
Author(s):  
Adewale F. Lukman ◽  
Kayode Ayinde ◽  
B. M. Golam Kibria ◽  
Segun L. Jegede

The general linear regression model has been one of the most frequently used models over the years, with the ordinary least squares (OLS) estimator used to estimate its parameters. The OLS estimator, however, suffers in the presence of multicollinearity and outliers, which lead to unfavourable results. This study proposes a two-parameter ridge-type modified M-estimator (RTMME), based on the M-estimator, to deal with the combined problem of multicollinearity and outliers. Through theoretical proofs, a Monte Carlo simulation, and a numerical example, the proposed estimator is shown to outperform the modified ridge-type estimator and the other existing estimators considered.
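To illustrate the general idea of combining a ridge penalty with M-estimation, the sketch below pairs a standard ridge term with Huber's M-estimator via iteratively reweighted least squares (IRLS). This is a generic illustration of the technique, assuming Huber weights and the usual MAD scale estimate; it is not the RTMME proposed in the paper.

```python
import numpy as np

def huber_ridge(X, y, k=1.0, c=1.345, n_iter=50):
    """Sketch: ridge-penalized Huber M-estimation by IRLS.
    Generic illustration only, not the paper's RTMME."""
    n, p = X.shape
    # Ridge estimate as the starting point
    beta = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    for _ in range(n_iter):
        resid = y - X @ beta
        # Robust scale via the median absolute deviation (MAD)
        scale = np.median(np.abs(resid)) / 0.6745 + 1e-12
        u = resid / scale
        # Huber weights: downweight residuals beyond the cutoff c
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
        XtW = X.T * w                       # X' W with diagonal weights
        beta = np.linalg.solve(XtW @ X + k * np.eye(p), XtW @ y)
    return beta
```

The ridge term handles ill-conditioning of X'WX while the weights limit the influence of outlying responses, which is the combined problem the abstract describes.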


Scientifica ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-16 ◽  
Author(s):  
B. M. Golam Kibria ◽  
Adewale F. Lukman

The ridge regression-type (Hoerl and Kennard, 1970) and Liu-type (Liu, 1993) estimators are popular shrinkage methods for reducing the effects of multicollinearity in both linear and nonlinear regression models. This paper proposes a new estimator to address the multicollinearity problem in the linear regression model. Theory and simulation results show that, under some conditions, it performs better than both the Liu and ridge regression estimators in the smaller-MSE sense. Two real-life (chemical and economic) datasets are analyzed to illustrate the findings of the paper.
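The two baseline estimators named in the abstract have standard closed forms, sketched below: the Hoerl-Kennard ridge estimator with biasing parameter k, and the Liu (1993) estimator with biasing parameter d. The new estimator the paper proposes is not reproduced here.

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Hoerl-Kennard ridge estimator: (X'X + kI)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def liu_estimator(X, y, d):
    """Liu (1993) estimator: (X'X + I)^{-1} (X'y + d * beta_OLS)."""
    p = X.shape[1]
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.linalg.solve(X.T @ X + np.eye(p), X.T @ y + d * b_ols)
```

Both reduce to OLS at the boundary of their biasing parameter (k = 0 for ridge, d = 1 for Liu), which is a quick sanity check for any implementation.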


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Jibo Wu ◽  
Chaolin Liu

This paper considers several stochastic restricted ridge regression estimators for the linear regression model. A simulation study has been conducted to compare the performance of the estimators; its results show that the stochastic restricted ridge regression estimators outperform the mixed estimator. A numerical example is also given to illustrate the performance of the estimators.
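For context, the mixed estimator mentioned in the abstract is the Theil-Goldberger estimator for stochastic restrictions r = R·beta + v with Var(v) = W, and one common stochastic restricted ridge form simply adds a ridge term to it. The sketch below shows these generic forms (with unit error variance assumed); the specific estimators compared in the paper may differ.

```python
import numpy as np

def mixed_estimator(X, y, R, r, W):
    """Theil-Goldberger mixed estimator for stochastic restrictions
    r = R @ beta + v with Var(v) = W (unit error variance assumed)."""
    Wi = np.linalg.inv(W)
    return np.linalg.solve(X.T @ X + R.T @ Wi @ R,
                           X.T @ y + R.T @ Wi @ r)

def stochastic_restricted_ridge(X, y, R, r, W, k):
    """One common stochastic restricted ridge form: the mixed
    estimator augmented with a ridge term k*I. A generic sketch."""
    p = X.shape[1]
    Wi = np.linalg.inv(W)
    return np.linalg.solve(X.T @ X + R.T @ Wi @ R + k * np.eye(p),
                           X.T @ y + R.T @ Wi @ r)
```

Setting k = 0 recovers the mixed estimator exactly, which makes the relationship between the two explicit.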


2020 ◽  
Vol 2020 ◽  
pp. 1-17
Author(s):  
Adewale F. Lukman ◽  
B. M. Golam Kibria ◽  
Kayode Ayinde ◽  
Segun L. Jegede

Motivated by the ridge regression (Hoerl and Kennard, 1970) and Liu (1993) estimators, this paper proposes a modified Liu estimator to solve the multicollinearity problem in the linear regression model. This modification places the estimator in the class of ridge and Liu estimators with a single biasing parameter. Theoretical comparisons, a real-life application, and simulation results show that it consistently dominates the usual Liu estimator. Under some conditions, it performs better than the ridge regression estimators in the smaller-MSE sense. Two real-life datasets are analyzed to illustrate the findings of the paper, with the performance of the estimators assessed by the MSE and the mean squared prediction error. The application results agree with the theoretical and simulation results.


2008 ◽  
Vol 31 (1) ◽  
pp. 71-79 ◽  
Author(s):  
Robert W. Simmons ◽  
Andrew D. Noble ◽  
P. Pongsakul ◽  
O. Sukreeyapongse ◽  
N. Chinabut

2021 ◽  
Vol 2099 (1) ◽  
pp. 012024
Author(s):  
V N Lutay ◽  
N S Khusainov

This paper discusses constructing a linear regression model with regularization of the system matrix of the normal equations. In contrast to conventional ridge regression, where a positive parameter is added to all diagonal terms of the matrix, the proposed method increases only those diagonal entries that correspond to highly correlated data. This lowers the condition number of the matrix and, therefore, reduces the corresponding coefficients of the regression equation. The entries to be increased are selected using the triangular decomposition of the correlation matrix of the original dataset. The effectiveness of the method is tested on a well-known dataset, and the results are compared not only with ridge regression but also with those of the widely used LARS and Lasso algorithms.
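One simplified reading of this selective scheme can be sketched as follows: flag near-dependent columns from small pivots in the Cholesky (triangular) factor of the columns' correlation matrix, and add the ridge parameter only to the flagged diagonal entries of X'X. The threshold and selection rule below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def selective_ridge(X, y, k=1.0, tol=0.3):
    """Sketch of selective diagonal regularization: add k only to the
    diagonal entries of X'X whose columns look near-dependent, judged
    by small pivots in the Cholesky factor of the correlation matrix.
    A simplified reading of the abstract's selection rule."""
    C = np.corrcoef(X, rowvar=False)         # correlation matrix of columns
    L = np.linalg.cholesky(C)                # triangular decomposition
    flagged = np.diag(L) < tol               # small pivot => near-dependence
    D = np.diag(k * flagged.astype(float))   # selective regularization
    return np.linalg.solve(X.T @ X + D, X.T @ y)
```

Columns that are nearly linear combinations of earlier columns produce small Cholesky pivots, so only their normal-equation diagonal entries get inflated, leaving well-conditioned directions untouched.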

