The selective regularization of a linear regression model

2021 ◽  
Vol 2099 (1) ◽  
pp. 012024
Author(s):  
V N Lutay ◽  
N S Khusainov

Abstract This paper discusses constructing a linear regression model with regularization of the system matrix of the normal equations. In contrast to conventional ridge regression, where a positive parameter is added to every diagonal term of the matrix, the proposed method increases only those diagonal entries that correspond to highly correlated data. This reduces the condition number of the matrix and, therefore, shrinks the corresponding coefficients of the regression equation. The entries to be increased are selected via the triangular decomposition of the correlation matrix of the original dataset. The effectiveness of the method is tested on a well-known dataset, and it is compared not only with ridge regression but also with the results of the widely used LARS and Lasso algorithms.
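The idea of loading only selected diagonal entries can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the selection rule here is a simple pairwise correlation threshold, whereas the paper selects entries via a triangular decomposition of the correlation matrix; the function name and threshold are assumptions.

```python
import numpy as np

def selective_ridge(X, y, lam=1.0, corr_threshold=0.9):
    """Ridge-like fit that penalizes only coefficients of highly
    correlated predictors. A pairwise correlation threshold stands in
    for the paper's triangular-decomposition selection rule."""
    X = np.asarray(X, dtype=float)
    p = X.shape[1]
    R = np.corrcoef(X, rowvar=False)
    # Flag columns whose absolute correlation with any other column
    # exceeds the threshold (diagonal excluded).
    mask = (np.abs(R - np.eye(p)) > corr_threshold).any(axis=1)
    D = np.diag(lam * mask.astype(float))  # selective diagonal loading
    return np.linalg.solve(X.T @ X + D, X.T @ y)

# Toy data: x2 is a near-copy of x1, x3 is independent.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.01 * rng.normal(size=200)
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])
y = x1 + 2.0 * x3 + 0.1 * rng.normal(size=200)
beta = selective_ridge(X, y, lam=10.0)
```

Only the coefficients of the collinear pair (x1, x2) are shrunk; the coefficient of the independent x3 is left essentially unregularized.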

2020 ◽  
Vol 2020 ◽  
pp. 1-17
Author(s):  
Adewale F. Lukman ◽  
B. M. Golam Kibria ◽  
Kayode Ayinde ◽  
Segun L. Jegede

Motivated by the ridge regression (Hoerl and Kennard, 1970) and Liu (1993) estimators, this paper proposes a modified Liu estimator to address the multicollinearity problem in the linear regression model. The modification places this estimator in the class of ridge and Liu estimators with a single biasing parameter. Theoretical comparisons, a real-life application, and simulation results show that it consistently dominates the usual Liu estimator and, under some conditions, performs better than the ridge regression estimator in the smaller-MSE sense. Two real-life datasets are analyzed to illustrate the findings of the paper, with the performance of the estimators assessed by the MSE and the mean squared prediction error. The applied results agree with the theoretical and simulation results.
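For reference, the two classical estimators the paper builds on can be written down directly. The sketch below implements the standard Hoerl-Kennard ridge and Liu (1993) estimators only; the paper's modified Liu estimator is not reproduced here.

```python
import numpy as np

def ols(X, y):
    # Ordinary least squares: (X'X)^{-1} X'y
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    # Hoerl-Kennard ridge: (X'X + kI)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def liu(X, y, d):
    # Liu (1993): (X'X + I)^{-1} (X'y + d * beta_OLS); d = 1 recovers OLS
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + np.eye(p), X.T @ y + d * ols(X, y))

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -1.0, 0.5]) + 0.1 * rng.normal(size=30)
```

Both estimators shrink toward zero as their biasing parameter moves away from the OLS limit (k = 0 for ridge, d = 1 for Liu), which is the mechanism behind the smaller-MSE comparisons in the abstract.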


Scientifica ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-16 ◽  
Author(s):  
B. M. Golam Kibria ◽  
Adewale F. Lukman

The ridge regression-type (Hoerl and Kennard, 1970) and Liu-type (Liu, 1993) estimators are consistently attractive shrinkage methods for reducing the effects of multicollinearity in both linear and nonlinear regression models. This paper proposes a new estimator to address the multicollinearity problem in the linear regression model. Theory and simulation results show that, under some conditions, it performs better than both the Liu and ridge regression estimators in the smaller-MSE sense. Two real-life datasets (chemical and economic) are analyzed to illustrate the findings of the paper.
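The "smaller-MSE" comparison that these abstracts refer to is easy to reproduce in miniature. The Monte Carlo sketch below compares OLS and ridge under strong collinearity; the collinearity structure, biasing parameter k, and trial count are ad hoc choices for illustration, not the papers' simulation designs.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k, trials = 50, 3, 0.5, 200
beta_true = np.array([1.0, 1.0, 1.0])
mse_ols = mse_ridge = 0.0
for _ in range(trials):
    # Highly collinear design: all columns are noisy copies of z.
    z = rng.normal(size=(n, 1))
    X = z + 0.05 * rng.normal(size=(n, p))
    y = X @ beta_true + rng.normal(size=n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    # Accumulate estimation MSE (squared distance to the true vector).
    mse_ols += np.sum((b_ols - beta_true) ** 2) / trials
    mse_ridge += np.sum((b_ridge - beta_true) ** 2) / trials
```

Under this design the OLS variance blows up along the near-null directions of X'X, so the biased ridge estimator wins on total MSE, which is the pattern the theoretical comparisons formalize.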


2021 ◽  
Vol 20 (3) ◽  
pp. 425-449
Author(s):  
Haruka Murayama ◽  
Shota Saito ◽  
Yuji Iikubo ◽  
Yuta Nakahara ◽  
Toshiyasu Matsushima

Abstract Prediction based on a single linear regression model is one of the most common approaches in many fields of study. It helps us understand the structure of data, but it may not be suitable for data whose structure is complex. To express such structure more accurately, we assume that the data can be divided into clusters, each with its own linear regression model. Under this assumption, each explanatory variable plays its own role: explaining the assignment to the clusters, explaining the regression on the target variable, or both. By introducing a probabilistic structure for the data-generating process, we derive the optimal prediction under the Bayes criterion and an algorithm that computes it sub-optimally via variational inference. One advantage of our algorithm is that it automatically weights the probabilities of each possible number of clusters during inference, thereby resolving the usual concern about selecting the number of clusters. Experiments on both synthetic and real data demonstrate these advantages and reveal some behaviors and tendencies of the algorithm.
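The cluster-wise linear regression model can be sketched with a plain EM fit. This is a simplified stand-in, not the paper's method: it fixes the number of clusters K and uses maximum-likelihood EM, whereas the paper uses variational Bayes and averages over the number of clusters; all names and settings here are assumptions.

```python
import numpy as np

def em_mixreg(X, y, K=2, iters=100, seed=0):
    """EM for a K-component mixture of linear regressions with a
    shared noise variance (simplified stand-in for variational Bayes)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    betas = rng.normal(size=(K, p))          # random initial coefficients
    sigma2, pis = 1.0, np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: posterior responsibility of each cluster per point.
        resid = y[:, None] - X @ betas.T
        logp = -0.5 * resid ** 2 / sigma2 + np.log(pis)
        logp -= logp.max(axis=1, keepdims=True)
        resp = np.exp(logp)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: weighted least squares within each cluster.
        for k in range(K):
            W = resp[:, k]
            betas[k] = np.linalg.solve(
                X.T @ (W[:, None] * X) + 1e-8 * np.eye(p), X.T @ (W * y))
        sigma2 = np.sum(resp * (y[:, None] - X @ betas.T) ** 2) / n
        pis = resp.mean(axis=0)
    return betas, resp

# Synthetic data: two clusters sharing x but with slopes +2 and -2.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(300), rng.uniform(-1.0, 1.0, 300)])
z = rng.integers(0, 2, 300)
y = np.where(z == 0, 2.0, -2.0) * X[:, 1] + 0.1 * rng.normal(size=300)
betas, resp = em_mixreg(X, y)
```

With well-separated slopes the soft assignments recover the two regression lines; the paper's Bayesian treatment would additionally average predictions over cluster counts rather than committing to K = 2.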


2017 ◽  
Vol 13 (1) ◽  
pp. 77-97
Author(s):  
Nimet Özbay ◽  
Issam Dawoud ◽  
Selahattin Kaçıranlar

Abstract Several versions of the Stein-rule estimator of the coefficient vector in a linear regression model have been proposed in the literature. In the present paper, we propose new feasible generalized Stein-rule restricted ridge regression estimators that address the multicollinearity and autocorrelation problems simultaneously in the general linear regression model, when certain additional exact restrictions are placed on the coefficients. Moreover, a Monte Carlo simulation experiment is performed to compare the performance of the proposed estimator with that of the others.
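The "restricted ridge" building block, a ridge fit forced to satisfy exact linear restrictions R @ beta = r, can be sketched as below. This is only that one ingredient under simplified assumptions: the paper's feasible generalized Stein-rule estimator additionally applies Stein-type shrinkage and GLS weighting for autocorrelated errors, which are omitted here.

```python
import numpy as np

def restricted_ridge(X, y, R, r, k=0.0):
    """Ridge estimator constrained to satisfy R @ beta = r exactly.
    With k = 0 this reduces to classical restricted least squares."""
    p = X.shape[1]
    S = X.T @ X + k * np.eye(p)
    b = np.linalg.solve(S, X.T @ y)          # unconstrained ridge fit
    # Project the unconstrained fit onto the restriction set.
    Sinv_Rt = np.linalg.solve(S, R.T)
    correction = Sinv_Rt @ np.linalg.solve(R @ Sinv_Rt, R @ b - r)
    return b - correction

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 3))
y = X @ np.array([1.0, 2.0, 3.0]) + 0.1 * rng.normal(size=40)
R = np.array([[1.0, 1.0, 0.0]])   # restriction: beta_1 + beta_2 = 3
r = np.array([3.0])
b = restricted_ridge(X, y, R, r, k=0.5)
```

By construction the returned estimate satisfies the restrictions exactly for any k, which is the property the Monte Carlo comparison in the paper exploits when the restrictions are correct.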

