A Comparative Analysis on Some Estimators of Parameters of Linear Regression Models in Presence of Multicollinearity

Author(s):  
Warha, Abdulhamid Audu ◽  
Yusuf Abbakar Muhammad ◽  
Akeyede, Imam

Linear regression measures the relationship between two or more variables, known as the dependent and independent variables. The classical least squares method for estimating regression models consists of minimising the sum of squared residuals. Among the assumptions of the ordinary least squares (OLS) method is that there is no correlation (multicollinearity) between the independent variables. Violation of this assumption arises often in regression analysis and can lead to inefficiency of the least squares method. This study therefore determined which of the Least Absolute Deviation (LAD) and Weighted Least Squares (WLS) estimators is more efficient for multiple linear regression models at different levels of multicollinearity among the explanatory variables. Simulations were conducted using the R statistical software to investigate the performance of the two estimators when the assumption of no multicollinearity is violated. Their performances were compared at different sample sizes. Finite-sample criteria, namely mean absolute error, absolute bias and mean squared error, were used to compare the methods. The best estimator was selected based on the minimum value of these criteria at a given level of multicollinearity and sample size. The results showed that LAD performed best across the levels of multicollinearity and is recommended as an alternative to OLS under this condition. The performance of both estimators decreased as the level of multicollinearity increased.
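A rough outline of such a simulation in R is sketched below. This is illustrative rather than the authors' code: the collinearity level, the true coefficients, the LAD fit via quantreg::rq, and the residual-based WLS weights are all assumptions, since the abstract does not specify them.

# Illustrative sketch (not the authors' code): compare LAD and WLS
# coefficient estimates under multicollinearity, judged by mean squared error.
library(MASS)      # mvrnorm: simulate correlated predictors
library(quantreg)  # rq: least absolute deviation (median) regression

set.seed(1)
n    <- 100                                # sample size (assumed)
rho  <- 0.9                                # level of multicollinearity (assumed)
beta <- c(1, 2, 3)                         # true intercept and slopes (assumed)
Sigma <- matrix(c(1, rho, rho, 1), 2, 2)   # correlation between the two predictors

R <- 500                                   # number of replications
mse_lad <- mse_wls <- 0
for (r in 1:R) {
  X <- mvrnorm(n, mu = c(0, 0), Sigma = Sigma)
  y <- as.numeric(beta[1] + X %*% beta[2:3] + rnorm(n))
  dat <- data.frame(y = y, x1 = X[, 1], x2 = X[, 2])

  fit_lad <- rq(y ~ x1 + x2, tau = 0.5, data = dat)     # LAD estimator
  fit0    <- lm(y ~ x1 + x2, data = dat)                # preliminary OLS fit
  w       <- 1 / (abs(resid(fit0)) + 0.1)^2             # assumed weighting scheme
  fit_wls <- lm(y ~ x1 + x2, data = dat, weights = w)   # WLS estimator

  mse_lad <- mse_lad + mean((coef(fit_lad) - beta)^2) / R
  mse_wls <- mse_wls + mean((coef(fit_wls) - beta)^2) / R
}
c(LAD = mse_lad, WLS = mse_wls)            # smaller value = better on this criterion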

Author(s):  
Aditio Putra G ◽  
Muhammad Arif Tiro ◽  
Muhammad Kasim Aidid

Abstract. The ordinary least squares method is the standard method for estimating the parameter values of a linear regression model. The method is built on the assumptions that the errors are identical and independent and are normally distributed; if these assumptions are not met, the method is not accurate. An alternative for overcoming this is to use resampling methods, and the resampling methods used in this study are the bootstrap and jackknife. Regression parameter values were first estimated for the analysis of 2017 poverty data for Makassar City, secondary data obtained from the BAPPEDA of Makassar City. The classical assumption tests showed that the model is not homoscedastic and the residuals are not normally distributed, so the regression model obtained cannot be relied upon. The bootstrap and jackknife methods introduced here use the R program to find the bias and standard error values. Regression models were obtained by estimating the parameters of the multiple linear regression model with the bootstrap resampling method using B = 200 and B = 500 and with the delete-1 jackknife resampling method. The results of this study show that the jackknife method is more efficient than the bootstrap method, which is supported by the smaller standard errors and bias values it produces. Keywords: regression, resampling, bootstrap, jackknife.
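A minimal sketch of the resampling scheme described above, not the authors' code: it bootstraps (with B resamples) and delete-1 jackknifes the coefficients of a multiple linear regression fitted with lm. The model formula and the simulated data are placeholders standing in for the Makassar poverty data, which is not available here.

# Illustrative sketch: bootstrap (B resamples) and delete-1 jackknife
# estimates of bias and standard error for regression coefficients.
boot_jack <- function(dat, B = 200) {
  full <- coef(lm(y ~ x1 + x2, data = dat))    # estimate on the full data
  n <- nrow(dat)

  # Bootstrap: resample rows with replacement B times and refit
  boot <- t(replicate(B, coef(lm(y ~ x1 + x2, data = dat[sample(n, replace = TRUE), ]))))
  boot_se   <- apply(boot, 2, sd)
  boot_bias <- colMeans(boot) - full

  # Delete-1 jackknife: refit leaving out one observation at a time
  jack <- t(sapply(1:n, function(i) coef(lm(y ~ x1 + x2, data = dat[-i, ]))))
  jack_mean <- colMeans(jack)
  jack_se   <- sqrt((n - 1) / n * colSums(sweep(jack, 2, jack_mean)^2))
  jack_bias <- (n - 1) * (jack_mean - full)

  list(bootstrap = cbind(se = boot_se, bias = boot_bias),
       jackknife = cbind(se = jack_se, bias = jack_bias))
}

# Example with simulated placeholder data (the study used Makassar poverty data):
set.seed(1)
d <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
d$y <- 1 + 2 * d$x1 - d$x2 + rnorm(50)
boot_jack(d, B = 200)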


2013 ◽  
Vol 278-280 ◽  
pp. 1323-1326
Author(s):  
Yan Hua Yu ◽  
Li Xia Song ◽  
Kun Lun Zhang

Fuzzy linear regression has been extensively studied since its inception, symbolized by the work of Tanaka et al. in 1982. As one of the main estimation methods, the fuzzy least squares approach is appealing because it corresponds, to some extent, to well-known statistical regression analysis. In this article, a restricted least squares method is proposed to fit fuzzy linear models with crisp inputs and symmetric fuzzy output. The paper puts forward a fuzzy linear regression model based on the structured element; the model has precise (crisp) input data and fuzzy output data. The regression coefficients and the fuzzy-degree function are determined using the least squares method, and the degree of approximation between the observed and predicted values is studied.
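For orientation, a least squares criterion of this general kind (a sketch of the idea, not necessarily the restricted or structured-element formulation used in the article) treats each symmetric fuzzy observation as a centre $y_i$ with spread $e_i$ and fits a centre line and a spread line jointly:

\min_{\beta,\gamma}\ \sum_{i=1}^{n}\Big[\big(y_i-\mathbf{x}_i^{\top}\beta\big)^{2}+\big(e_i-|\mathbf{x}_i|^{\top}\gamma\big)^{2}\Big]
\quad\text{subject to}\quad |\mathbf{x}_i|^{\top}\gamma \ge 0,\ i=1,\dots,n,

where $\beta$ collects the crisp regression coefficients, $\gamma$ the spread (fuzzy-degree) coefficients, and the restriction keeps every predicted spread non-negative.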


2014 ◽  
Vol 71 (1) ◽  
Author(s):  
Bello Abdulkadir Rasheed ◽  
Robiah Adnan ◽  
Seyed Ehsan Saffari ◽  
Kafi Dano Pati

In a linear regression model, the ordinary least squares (OLS) method is considered the best method for estimating the regression parameters if the assumptions are met. However, if the data do not satisfy the underlying assumptions, the results will be misleading. Violation of the constant variance assumption in least squares regression is caused by the presence of outliers and heteroscedasticity in the data. This assumption of constant variance (homoscedasticity) is very important in linear regression, where it allows the least squares estimators to enjoy the property of minimum variance. A robust regression method is therefore required to handle the problem of outliers in the data. This research uses weighted least squares (WLS) techniques to estimate the regression coefficients when the error variance assumption is violated; WLS estimation amounts to carrying out OLS on transformed variables. However, WLS can easily be affected by outliers. To remedy this, we suggest a robust technique for estimating the regression parameters in the presence of heteroscedasticity and outliers. We apply robust M-estimation regression using iteratively reweighted least squares (IRWLS) with the Huber and Tukey bisquare functions, together with the resistant least trimmed squares estimator, to estimate the model parameters for state-wide crime data of the United States in 1993. The outcomes of the study indicate that the estimators obtained from the M-estimation techniques and the least trimmed squares method are more effective than those obtained from OLS.
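The estimators named above are available in standard R packages; the sketch below is illustrative only and uses the built-in stackloss data as a stand-in for the 1993 state-wide crime data, which is not reproduced in the abstract.

# Illustrative sketch of the estimators named above, fitted to a built-in
# data set (stackloss) in place of the state-wide crime data.
library(MASS)  # rlm: M-estimation via IRWLS; lqs: least trimmed squares

fit_ols   <- lm(stack.loss ~ ., data = stackloss)                      # ordinary least squares
fit_huber <- rlm(stack.loss ~ ., data = stackloss, psi = psi.huber)    # Huber M-estimator (IRWLS)
fit_tukey <- rlm(stack.loss ~ ., data = stackloss, psi = psi.bisquare) # Tukey bisquare M-estimator
fit_lts   <- lqs(stack.loss ~ ., data = stackloss, method = "lts")     # least trimmed squares

# Compare coefficient estimates; the robust fits should be less sensitive to outliers.
rbind(OLS = coef(fit_ols), Huber = coef(fit_huber),
      Bisquare = coef(fit_tukey), LTS = coef(fit_lts))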


2002 ◽  
Vol 53 (3-4) ◽  
pp. 261-264 ◽  
Author(s):  
Anindya Roy ◽  
Thomas I. Seidman

We derive a property of real sequences which can be used to provide a natural sufficient condition for the consistency of the least squares estimators of slope and intercept in a simple linear regression model.
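For context, a standard sufficient condition of this type (not necessarily the one derived in the article) for the model $y_i=\alpha+\beta x_i+\varepsilon_i$ with i.i.d. zero-mean, finite-variance errors is

\sum_{i=1}^{n}\big(x_i-\bar{x}_n\big)^{2}\;\longrightarrow\;\infty
\qquad\text{and}\qquad
\frac{\bar{x}_n^{\,2}}{\sum_{i=1}^{n}\big(x_i-\bar{x}_n\big)^{2}}\;\longrightarrow\;0,

which makes the variances of both least squares estimators tend to zero, so that the slope and intercept estimates converge in probability to $\beta$ and $\alpha$.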

