Using Robust Ridge Regression Diagnostic Method to Handle Multicollinearity Caused High Leverage Points

2021 ◽  
Vol 10 (1) ◽  
pp. 326
Author(s):  
Kafi Dano Pati

Statistics practitioners have depended on the ordinary least squares (OLS) method in the linear regression model for generations because of its optimal properties and simplicity of calculation. However, OLS estimators can be strongly affected by multicollinearity, a near linear dependency between two or more independent variables in the regression model. Even though the OLS estimates remain unbiased in the presence of multicollinearity, predictions of the dependent variable become inaccurate, with inflated standard errors of the estimated parameter coefficients of the regression model. It is now evident that high leverage points, the outliers in the x-direction, are a prime source of collinearity-influential observations. In this paper, we propose an alternative regression method for estimating the regression coefficients in the presence of multiple high leverage points which cause the multicollinearity problem. The procedure uses the ordinary least squares estimates of the parameters as the initial estimate, followed by a ridge regression estimate. We incorporate the least trimmed squares (LTS) robust regression estimate to downweight the effects of multiple high leverage points, which in turn reduces the effects of multicollinearity. The results suggest that the resulting RLTS estimator gives a substantial improvement over ridge regression.
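A minimal numpy sketch of the general two-stage idea (flag high leverage points, then fit ridge on the downweighted data). This uses a hat-value cutoff to flag leverage points rather than the authors' LTS-based weighting, so the cutoff and the ridge constant are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x1[:4] += 10.0          # a few high leverage points in the x-direction;
x2[:4] += 10.0          # they also induce collinearity between x1 and x2
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.5, size=n)

def ridge(X, y, k):
    """Ridge estimator: (X'X + kI)^(-1) X'y."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

# flag high leverage points with the hat-matrix diagonals
h = np.diag(X @ np.linalg.solve(X.T @ X, X.T))
keep = h <= 2 * X.shape[1] / n            # common 2p/n cutoff
beta = ridge(X[keep], y[keep], k=0.5)     # ridge fit on the cleaned data
```

Once the extreme x-direction points are removed, the collinearity they induced disappears and the ridge fit stabilizes.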

2019 ◽  
Vol 8 (1) ◽  
pp. 24-34
Author(s):  
Eka Destiyani ◽  
Rita Rahmawati ◽  
Suparti Suparti

Ordinary least squares (OLS) is one of the most commonly used methods to estimate linear regression parameters. If multicollinearity exists among the predictor variables, especially coupled with outliers, regression analysis with OLS is no longer appropriate. One method that can be used to solve the multicollinearity and outlier problems is ridge robust-MM regression, a modification of ridge regression based on the MM-estimator of robust regression. The case study in this research is AKB (the infant mortality rate) in Central Java in 2017, influenced by population density, the percentage of households practicing a clean and healthy lifestyle, the number of babies born with low birth weight, the number of babies given exclusive breastfeeding, the number of babies receiving a neonatal visit once, and the number of babies receiving health services. The estimation results using OLS show both multicollinearity and the presence of outliers. Applying ridge robust-MM regression to the case study shows that robust ridge regression improves the parameter estimates. Based on a t test at the 5% significance level, most of the predictor variables have a significant effect on AKB. The predictor variables account for 47.68% of the variation in AKB, and the MSE value is 0.01538. Keywords: Ordinary Least Squares (OLS), Multicollinearity, Outliers, Ridge Regression, Robust Regression, AKB.
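The flavor of combining a ridge penalty with robust downweighting can be sketched as follows. This is a minimal numpy illustration with Huber weights and iteratively reweighted ridge, not the MM-estimator itself, whose high-breakdown scale step is more involved; the penalty constant and tuning constant are assumptions:

```python
import numpy as np

def huber_weights(u, c=1.345):
    """Huber weight function: 1 for |u| <= c, c/|u| beyond."""
    a = np.abs(u)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def robust_ridge(X, y, k=1.0, n_iter=25):
    p = X.shape[1]
    beta = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)  # ridge start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12  # MAD scale
        w = huber_weights(r / s)
        Xw = X * w[:, None]                     # weighted design: W^(1/?) folded in
        beta = np.linalg.solve(Xw.T @ X + k * np.eye(p), Xw.T @ y)
    return beta

# demo: collinear predictors plus vertical outliers
rng = np.random.default_rng(1)
n = 80
z = rng.normal(size=n)
X = np.column_stack([np.ones(n), z + rng.normal(scale=0.1, size=n),
                     z + rng.normal(scale=0.1, size=n)])
y = 1.0 + 2.0 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(scale=0.3, size=n)
y[:8] += 25.0                                   # vertical outliers
beta = robust_ridge(X, y, k=0.1)
```

The outliers receive small Huber weights, so they no longer drag the intercept, while the ridge penalty tames the collinearity between the two predictors.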


2014 ◽  
Vol 71 (1) ◽  
Author(s):  
Bello Abdulkadir Rasheed ◽  
Robiah Adnan ◽  
Seyed Ehsan Saffari ◽  
Kafi Dano Pati

In a linear regression model, the ordinary least squares (OLS) method is considered the best method to estimate the regression parameters if the assumptions are met. However, if the data do not satisfy the underlying assumptions, the results will be misleading. Violation of the constant-variance assumption in least squares regression is caused by the presence of outliers and heteroscedasticity in the data. This assumption of constant variance (homoscedasticity) is very important in linear regression, where the least squares estimators enjoy the property of minimum variance. Therefore a robust regression method is required to handle the problem of outliers in the data. This research uses weighted least squares (WLS) techniques to estimate the regression coefficients when the assumption of constant error variance is violated. WLS estimation is the same as carrying out OLS on transformed variables. However, WLS can easily be affected by outliers. To remedy this, we suggest a robust technique for estimating regression parameters in the presence of heteroscedasticity and outliers. We apply robust M-estimation using iteratively reweighted least squares (IRWLS) with the Huber and Tukey bisquare functions, and the resistant least trimmed squares regression estimator, to estimate a model of state-wide crime in the United States in 1993. The outcomes of the study indicate that the estimators obtained from the M-estimation techniques and the least trimmed squares method are more effective than those obtained from OLS.
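The claim that WLS estimation equals OLS on transformed variables can be verified directly. A minimal numpy sketch with simulated heteroscedastic data; the weights 1/x², matching an error standard deviation proportional to x, are an assumption for this illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(1.0, 10.0, size=n)
X = np.column_stack([np.ones(n), x])
y = 2.0 + 0.5 * x + rng.normal(scale=0.3 * x)   # error s.d. grows with x

def wls(X, y, w):
    """Weighted least squares: (X'WX)^(-1) X'Wy with W = diag(w)."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

beta_ols = wls(X, y, np.ones(n))        # OLS is WLS with unit weights
beta_wls = wls(X, y, 1.0 / x**2)        # weights proportional to 1/Var(e_i)

# WLS equals OLS on the transformed variables X/x, y/x
beta_t, *_ = np.linalg.lstsq(X / x[:, None], y / x, rcond=None)
```

Dividing each row by x makes the transformed errors homoscedastic, which is exactly why the two fits coincide.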


2018 ◽  
Vol 7 (4.10) ◽  
pp. 543
Author(s):  
B. Mahaboob ◽  
B. Venkateswarlu ◽  
C. Narayana ◽  
J. Ravi sankar ◽  
P. Balasiddamuni

This research article uses matrix calculus techniques to study the least squares estimation of nonlinear regression models, the sampling distributions of the nonlinear least squares estimators of the regression parameter vector and error variance, and the testing of general nonlinear hypotheses on the parameters of a nonlinear regression model. Arthipova Irina et al. [1] discussed some examples of different nonlinear models and the application of OLS (ordinary least squares). M. A. Tabati et al. [2] proposed a robust alternative to the OLS nonlinear regression method that provides accurate parameter estimates when outliers and/or influential observations are present. Xu Zheng et al. [3] presented new parametric tests for heteroscedasticity in nonlinear and nonparametric models.
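As a concrete instance of this machinery, the nonlinear least squares estimator can be computed by Gauss-Newton iteration, linearizing the model through its Jacobian at each step. A hypothetical numpy sketch for the model y = a·exp(b·x), with a simple step-halving safeguard added as an assumption:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(1.5 * x) + rng.normal(scale=0.1, size=x.size)

def gauss_newton(x, y, theta, n_iter=50):
    """Gauss-Newton for y = a*exp(b*x): solve J'J step = J'r each iteration."""
    theta = np.asarray(theta, dtype=float)

    def resid(t):
        return y - t[0] * np.exp(t[1] * x)

    for _ in range(n_iter):
        a, b = theta
        e = np.exp(b * x)
        J = np.column_stack([e, a * x * e])      # df/da, df/db
        r = resid(theta)
        step = np.linalg.solve(J.T @ J, J.T @ r)
        t = 1.0                                  # halve step until RSS drops
        while np.sum(resid(theta + t * step) ** 2) > np.sum(r ** 2) and t > 1e-8:
            t /= 2.0
        theta = theta + t * step
    return theta

theta_hat = gauss_newton(x, y, [1.0, 1.0])
# the usual error-variance estimate: RSS / (n - p)
sigma2_hat = np.sum((y - theta_hat[0] * np.exp(theta_hat[1] * x)) ** 2) / (x.size - 2)
```

The matrix J'J that appears in each step is also the basis for the asymptotic sampling distribution of the estimator that the article studies.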


1984 ◽  
Vol 21 (3) ◽  
pp. 268-277 ◽  
Author(s):  
Vijay Mahajan ◽  
Subhash Sharma ◽  
Yoram Wind

In marketing models, the presence of aberrant response values, or outliers, in data can distort the parameter estimates (regression coefficients) obtained by means of ordinary least squares. The authors demonstrate the potential usefulness of robust regression analysis in treating influential response values in marketing data.


Author(s):  
A. J. Rook ◽  
M. Gill ◽  
M. S. Dhanoa

Due to collinearity among the independent variates, intake prediction models based on least squares multiple regression are likely to predict poorly with independent data. In addition, the regression coefficients are sensitive to small changes in the estimation data and tend not to reflect the causal relationships expected from the results of controlled experimentation. Ridge regression (Hoerl and Kennard, 1970) allows the estimation of new coefficients for the independent variables that overcome these effects of collinearity. In order to assess the usefulness of the method for intake prediction, ordinary least squares (OLS) models, obtained using backward elimination of variables, and ridge regression models were constructed from the same data and then tested with independent data.

Estimation data consisted of the results of experiments at IGAP, Hurley and Greenmount College of Agriculture in which growing cattle were individually fed grass silage ad libitum, with or without supplementary feeds. Two subsets of the estimation data were used: subset A included 395 animals and 36 silages; subset B included 192 animals and 16 silages and comprised Hurley data only.
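The estimate-then-test-on-independent-data design can be sketched in numpy. This is a hypothetical illustration on simulated collinear data, not the silage data, and k = 0.5 is an arbitrary ridge constant:

```python
import numpy as np

rng = np.random.default_rng(3)

def make_data(n):
    """Simulated intake-style data with two nearly collinear predictors."""
    z = rng.normal(size=n)
    X = np.column_stack([np.ones(n),
                         z + rng.normal(scale=0.05, size=n),
                         z + rng.normal(scale=0.05, size=n)])
    y = 1.0 + X[:, 1] + X[:, 2] + rng.normal(scale=1.0, size=n)
    return X, y

def ridge(X, y, k):
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

X_est, y_est = make_data(40)            # estimation data
X_new, y_new = make_data(40)            # independent test data

beta_ols = ridge(X_est, y_est, 0.0)     # k = 0 recovers OLS
beta_ridge = ridge(X_est, y_est, 0.5)
mse_ols = np.mean((y_new - X_new @ beta_ols) ** 2)
mse_ridge = np.mean((y_new - X_new @ beta_ridge) ** 2)
```

Ridge shrinks the coefficient vector, which is exactly the stabilization against small changes in the estimation data that the abstract describes.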


2021 ◽  
Vol 48 (3) ◽  
Author(s):  
Shokrya Saleh Alshqaq

Least trimmed squares (LTS) estimation has been used successfully in robust linear regression models. This article extends LTS estimation to the Jammalamadaka and Sarma (JS) circular regression model. The robustness of the proposed estimator is studied, and the algorithm used for its computation is discussed. Simulation studies and real data show that the proposed robust circular estimator effectively fits JS circular models in the presence of vertical outliers and leverage points.
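In the ordinary linear setting, LTS minimizes the sum of the h smallest squared residuals. A basic random-starts-plus-concentration sketch in numpy (hypothetical code; the JS circular extension replaces the squared residual with a circular distance, which this sketch does not implement):

```python
import numpy as np

def lts_fit(X, y, h=None, n_starts=200, seed=0):
    """LTS: minimize the sum of the h smallest squared residuals."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    if h is None:
        h = (n + p + 1) // 2                     # maximal-breakdown choice
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)
        try:
            beta = np.linalg.solve(X[idx], y[idx])   # elemental start
        except np.linalg.LinAlgError:
            continue
        for _ in range(10):                      # concentration (C-) steps
            sub = np.argsort((y - X @ beta) ** 2)[:h]
            beta, *_ = np.linalg.lstsq(X[sub], y[sub], rcond=None)
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta

# demo: 20% vertical outliers
rng = np.random.default_rng(4)
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)
y[:20] += 30.0
beta = lts_fit(X, y)
```

Because only the h best-fitting points enter the objective, the 20 shifted observations are simply trimmed away rather than dragging the fit.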


1977 ◽  
Vol 14 (4) ◽  
pp. 586-591 ◽  
Author(s):  
Vijay Mahajan ◽  
Arun K. Jain ◽  
Michel Bergier

In the presence of multicollinearity in data, the estimation of parameters (regression coefficients) in marketing models by means of ordinary least squares may give inflated estimates with high variance and wrong signs. The authors demonstrate the potential usefulness of ridge regression analysis for handling multicollinearity in marketing data.
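The inflation can be quantified with the variance inflation factor, VIF_j = 1/(1 - R_j²), where R_j² comes from regressing x_j on the remaining predictors. A hypothetical numpy illustration on simulated data:

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor of column j: 1 / (1 - R_j^2)."""
    n = X.shape[0]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(n), others])        # regress x_j on the rest
    coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ coef
    r2 = 1.0 - resid.var() / X[:, j].var()
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(5)
n = 100
z = rng.normal(size=n)
X = np.column_stack([z + rng.normal(scale=0.1, size=n),
                     z + rng.normal(scale=0.1, size=n)])
# a VIF above 10 is a common rule-of-thumb flag for harmful multicollinearity
```

Each coefficient's sampling variance is multiplied by its VIF relative to the uncorrelated case, which is the inflation the authors describe.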

