Comparison of Some Estimators under the Pitman's Closeness Criterion in Linear Regression Model

2014 ◽ Vol 2014 ◽ pp. 1-6 ◽ Author(s): Jibo Wu

Batah et al. (2009) combined the unbiased ridge estimator and the principal components regression estimator to introduce the modified r-k class estimator. They also showed that the modified r-k class estimator is superior to the ordinary least squares estimator and the principal components regression estimator under the mean squared error matrix criterion. In this paper, we first give a new method to obtain the modified r-k class estimator; we then discuss its properties in some detail, comparing the modified r-k class estimator with the ordinary least squares estimator and the principal components regression estimator under the Pitman closeness criterion. A numerical example and a simulation study are given to illustrate our findings.
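The abstract states the comparison abstractly; as a concrete reading of the Pitman closeness criterion, the following sketch estimates P(‖β̂₁ − β‖ < ‖β̂₂ − β‖) by Monte Carlo for a principal components regression estimator against OLS. The design matrix, true coefficients, and the helper names pcr_estimator and pitman_closeness are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pcr_estimator(X, y, r):
    """Principal components regression keeping the r leading eigen-directions of X'X."""
    eigval, eigvec = np.linalg.eigh(X.T @ X)
    T_r = eigvec[:, np.argsort(eigval)[::-1][:r]]   # r largest eigenvalues first
    # beta_PCR = T_r (T_r' X'X T_r)^{-1} T_r' X'y
    return T_r @ np.linalg.solve(T_r.T @ X.T @ X @ T_r, T_r.T @ X.T @ y)

def pitman_closeness(est1, est2, X, beta, sigma, n_rep=5000):
    """Monte Carlo estimate of P(||est1 - beta|| < ||est2 - beta||) under Euclidean loss."""
    wins = 0
    for _ in range(n_rep):
        y = X @ beta + sigma * rng.standard_normal(X.shape[0])
        d1 = np.linalg.norm(est1(X, y) - beta)
        d2 = np.linalg.norm(est2(X, y) - beta)
        wins += d1 < d2
    return wins / n_rep

# Hypothetical ill-conditioned design: two nearly collinear columns.
n, p = 50, 3
Z = rng.standard_normal((n, p))
Z[:, 1] = Z[:, 0] + 0.01 * rng.standard_normal(n)   # near collinearity
beta_true = np.array([1.0, 2.0, -1.0])

ols = lambda X, y: np.linalg.solve(X.T @ X, X.T @ y)
pcr = lambda X, y: pcr_estimator(X, y, r=2)

print("Estimated P(PCR closer than OLS):", pitman_closeness(pcr, ols, Z, beta_true, sigma=1.0))
```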

2019 ◽ Vol 2019 ◽ pp. 1-10 ◽ Author(s): Adewale F. Lukman, Kayode Ayinde, Sek Siok Kun, Emmanuel T. Adewuyi

The literature has shown that the ordinary least squares estimator (OLSE) is not best when the explanatory variables are related, that is, when multicollinearity is present. The estimator then becomes unstable and can give misleading conclusions. In this study, a modified new two-parameter estimator based on prior information for the vector of parameters is proposed to circumvent the problem of multicollinearity. The new estimator includes as special cases the ordinary least squares estimator (OLSE), the ridge regression estimator (RRE), the Liu estimator (LE), the modified ridge estimator (MRE), and the modified Liu estimator (MLE). Furthermore, the superiority of the new estimator over the OLSE, RRE, LE, MRE, MLE, and the two-parameter estimator proposed by Ozkale and Kaciranlar (2007) is established using the mean squared error matrix criterion. Finally, a numerical example and a simulation study are conducted to illustrate the theoretical results.
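As a point of reference for the two-parameter family discussed above, the sketch below implements the form commonly attributed to Ozkale and Kaciranlar (2007), β̂(k, d) = (X′X + kI)⁻¹(X′y + kd·β̂_OLS); the modified estimator of this paper itself is not reproduced here. Treat the parameterization, the stated special cases, and all numbers as a hedged reconstruction rather than the paper's exact construction.

```python
import numpy as np

def two_parameter_estimator(X, y, k, d):
    """Two-parameter estimator beta(k,d) = (X'X + kI)^{-1}(X'y + k*d*beta_OLS).

    Under this parameterization (an assumption, not verified against the paper):
    d = 1 recovers OLSE, d = 0 recovers the ordinary ridge estimator,
    and k = 1 recovers the Liu estimator.
    """
    p = X.shape[1]
    XtX, Xty = X.T @ X, X.T @ y
    beta_ols = np.linalg.solve(XtX, Xty)
    return np.linalg.solve(XtX + k * np.eye(p), Xty + k * d * beta_ols)

# Example on a hypothetical collinear design.
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 3))
X[:, 2] = X[:, 1] + 0.02 * rng.standard_normal(40)   # near collinearity
y = X @ np.array([1.0, 0.5, 0.5]) + rng.standard_normal(40)
print(two_parameter_estimator(X, y, k=0.5, d=0.3))
```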


2014 ◽ Vol 2014 ◽ pp. 1-6 ◽ Author(s): Daojiang He, Yan Wu

We propose a new estimator to combat multicollinearity in the linear model when there are stochastic linear restrictions on the regression coefficients. The new estimator, called the stochastic restricted principal components (SRPC) regression estimator, is constructed by combining the ordinary mixed estimator (OME) and the principal components regression (PCR) estimator. Necessary and sufficient conditions for the superiority of the SRPC estimator over the OME and the PCR estimator are derived in the sense of the mean squared error matrix criterion. Finally, we give a numerical example and a Monte Carlo study to illustrate the performance of the proposed estimator.
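For readers unfamiliar with the first ingredient, this sketch implements the ordinary mixed estimator in its usual Theil-Goldberger form under stochastic restrictions r = Rβ + e with Cov(e) = σ²W; the SRPC combination itself follows the paper and is not reconstructed here. The restriction and all numbers are hypothetical.

```python
import numpy as np

def mixed_estimator(X, y, R, r, W):
    """Theil-Goldberger ordinary mixed estimator (OME) under stochastic
    restrictions r = R beta + e, Cov(e) = sigma^2 * W:
        beta_OME = (X'X + R'W^{-1}R)^{-1} (X'y + R'W^{-1}r).
    """
    Wi = np.linalg.inv(W)
    A = X.T @ X + R.T @ Wi @ R
    b = X.T @ y + R.T @ Wi @ r
    return np.linalg.solve(A, b)

# Hypothetical example: one stochastic restriction, beta_1 + beta_2 ~ 1.
rng = np.random.default_rng(2)
X = rng.standard_normal((30, 3))
y = X @ np.array([0.4, 0.6, 1.0]) + rng.standard_normal(30)
R = np.array([[1.0, 1.0, 0.0]])
r = np.array([1.0])
W = np.array([[0.1]])
print(mixed_estimator(X, y, R, r, W))
```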


Author(s): Jibo Wu

Schaffrin and Toutenburg [1] proposed a weighted mixed estimator based on the sample information and stochastic prior information, and they showed that the weighted mixed estimator is superior to the ordinary least squares estimator under the mean squared error criterion. However, no paper has discussed the performance of the two estimators under the Pitman closeness criterion. This paper compares the weighted mixed estimator and the ordinary least squares estimator under the Pitman closeness criterion. A simulation study is performed to illustrate the performance of the two estimators under this criterion.
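One way to make the comparison concrete: assuming a weighted mixed estimator of the form β̂(ω) = (X′X + ωR′W⁻¹R)⁻¹(X′y + ωR′W⁻¹r), with ω = 0 giving the OLSE and ω = 1 the ordinary mixed estimator (a plausible reading of the Schaffrin-Toutenburg construction, not verified against [1]), the Pitman closeness probability can be estimated by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(3)

def weighted_mixed(X, y, R, r, W, omega):
    """Weighted mixed estimator with weight omega on the stochastic prior
    r = R beta + e, Cov(e) = sigma^2 * W (assumed form, see lead-in):
        beta(omega) = (X'X + omega R'W^{-1}R)^{-1}(X'y + omega R'W^{-1}r).
    """
    Wi = np.linalg.inv(W)
    return np.linalg.solve(X.T @ X + omega * R.T @ Wi @ R,
                           X.T @ y + omega * R.T @ Wi @ r)

# Pitman closeness of the weighted mixed estimator vs OLS, by Monte Carlo.
n, p = 25, 2
X = rng.standard_normal((n, p))
beta = np.array([1.0, -0.5])
R, W = np.eye(p), 0.5 * np.eye(p)

wins, n_rep = 0, 5000
for _ in range(n_rep):
    y = X @ beta + rng.standard_normal(n)
    r = beta + rng.multivariate_normal(np.zeros(p), W)   # stochastic prior draw
    b_wme = weighted_mixed(X, y, R, r, W, omega=0.7)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    wins += np.linalg.norm(b_wme - beta) < np.linalg.norm(b_ols - beta)
print("Estimated P(WME closer than OLS):", wins / n_rep)
```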


2002 ◽ Vol 18 (5) ◽ pp. 1121-1138 ◽ Author(s): Dong Wan Shin, Man Suk Oh

For regression models with general unstable regressors having characteristic roots on the unit circle and general stationary errors independent of the regressors, sufficient conditions are investigated under which the ordinary least squares estimator (OLSE) is asymptotically efficient in the sense that it has the same limiting distribution as the generalized least squares estimator (GLSE) under the same normalization. A key condition for the asymptotic efficiency of the OLSE is that the multiplicity of one characteristic root of the regressor process is strictly greater than the multiplicities of the other roots. Under this condition, the covariance matrix Γ of the errors and the regressor matrix X are shown to satisfy the relationship ΓX = XC + V for some matrix C, where V is asymptotically dominated by X; this is analogous to the condition ΓX = XC (for some matrix C) for numerical equivalence of the OLSE and the GLSE.
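The exact-equivalence condition ΓX = XC is easy to check numerically: if the columns of X are eigenvectors of Γ, the condition holds with C diagonal, and the OLSE and GLSE coincide (a Kruskal-type result). The toy construction below, which is stationary rather than the paper's unit-root setting, illustrates this:

```python
import numpy as np

rng = np.random.default_rng(4)

# Build Gamma with a known orthonormal eigenbasis, then take X from that basis,
# so that Gamma X = X C holds exactly with C = diag of the first p eigenvalues.
n, p = 20, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthonormal eigenvectors
lam = np.linspace(1.0, 3.0, n)                     # positive eigenvalues
Gamma = Q @ np.diag(lam) @ Q.T                     # SPD error covariance

X = Q[:, :p]                                       # columns = eigenvectors of Gamma
y = X @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(n)

b_ols = np.linalg.solve(X.T @ X, X.T @ y)
Gi = np.linalg.inv(Gamma)
b_gls = np.linalg.solve(X.T @ Gi @ X, X.T @ Gi @ y)
print(np.allclose(b_ols, b_gls))                   # True: OLSE equals GLSE here
```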

