THE PREDICTIVE PERFORMANCE EVALUATION AND NUMERICAL EXAMPLE STUDY FOR THE PRINCIPAL COMPONENT TWO-PARAMETERS ESTIMATOR

2019 · Vol. 48(3) · pp. 181–186
Author(s): R. Li, F. Li, J. W. Huang

In this paper, detailed comparisons are made between the principal component two-parameter estimator and the estimators that can be derived from it, namely the ordinary least squares (OLS) estimator, the principal components regression estimator, the ridge regression estimator, the Liu estimator, the r-k estimator, and the r-d estimator, under the prediction mean squared error (PMSE) criterion. In addition, conditions for the superiority of the principal component two-parameter estimator over the others are obtained. Furthermore, a numerical example study is conducted to compare these estimators under the same criterion.
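All of the compared estimators have closed forms, so a small simulation makes the PMSE comparison concrete. The sketch below is illustrative only: the collinear design, true coefficients, and the values of k, d, and the number of retained components r are arbitrary choices, not taken from the paper, and the r-k and r-d formulas follow their commonly cited statements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative collinear design; not the paper's data.
n, p, r, k, d = 100, 4, 2, 0.5, 0.7          # r PCs kept; k, d tuning values
X = rng.normal(size=(n, p)) + 5.0 * rng.normal(size=(n, 1))  # shared factor
beta = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta + rng.normal(size=n)

S = X.T @ X
lam, T = np.linalg.eigh(S)                    # eigendecomposition of X'X
idx = np.argsort(lam)[::-1]                   # sort eigenvalues descending
lam, T = lam[idx], T[:, idx]
Tr, lam_r = T[:, :r], lam[:r]
a_r = (Tr.T @ X.T @ y) / lam_r                # OLS coefficients in PC space

b_ols   = np.linalg.solve(S, X.T @ y)
b_ridge = np.linalg.solve(S + k * np.eye(p), X.T @ y)
b_liu   = np.linalg.solve(S + np.eye(p), X.T @ y + d * b_ols)
b_pcr   = Tr @ a_r
b_rk    = Tr @ ((Tr.T @ X.T @ y) / (lam_r + k))
b_rd    = Tr @ (((lam_r + d) / (lam_r + 1.0)) * a_r)

# Prediction MSE on a fresh draw from the same design
Xt = rng.normal(size=(2000, p)) + 5.0 * rng.normal(size=(2000, 1))
yt = Xt @ beta + rng.normal(size=2000)
for name, b in [("OLS", b_ols), ("ridge", b_ridge), ("Liu", b_liu),
                ("PCR", b_pcr), ("r-k", b_rk), ("r-d", b_rd)]:
    print(f"{name:5s} PMSE = {np.mean((yt - Xt @ b) ** 2):.3f}")
```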

2013 · Vol. 2013 · pp. 1–7
Author(s): Jibo Wu

Wu (2013) proposed the principal component Liu-type estimator to overcome multicollinearity. It is a general estimator that includes the ordinary least squares estimator, the principal component regression estimator, the ridge estimator, the Liu estimator, the Liu-type estimator, the r-k class estimator, and the r-d class estimator as special cases. In this paper, we first derive the principal component Liu-type estimator by a new method; we then study the superiority of the new estimator under the scalar mean squared error criterion. Finally, we give a numerical example to illustrate the theoretical results.
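The abstract does not spell out the estimator's closed form. A minimal numpy sketch, assuming the Liu-type rule of Liu (2003), b(k, d) = (X'X + kI)^{-1}(X'y − d·b_OLS), applied within the span of the first r principal components, shows how the listed special cases drop out; the parameter conventions are assumptions, not taken from the paper.

```python
import numpy as np

def pc_liu_type(X, y, r, k, d):
    """Principal component Liu-type estimator (one plausible closed form).

    Applies the Liu-type rule (S + kI)^{-1}(X'y - d * b_OLS) within the span
    of the first r principal components of S = X'X. Special cases, up to the
    usual sign/parameter conventions:
      r = p, k -> 0, d = 0 : OLS            r < p, k -> 0, d = 0 : PCR
      r = p, d = 0         : ridge          r < p, d = 0         : r-k class
      r = p, k = 1, d = -d0: Liu            r < p, k = 1, d = -d0: r-d class
    """
    S = X.T @ X
    lam, T = np.linalg.eigh(S)
    idx = np.argsort(lam)[::-1]            # eigenvalues in descending order
    lam, T = lam[idx], T[:, idx]
    Tr, lam_r = T[:, :r], lam[:r]
    a_ols = (Tr.T @ X.T @ y) / lam_r       # OLS coefficients in PC space
    return Tr @ ((Tr.T @ X.T @ y - d * a_ols) / (lam_r + k))
```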


Author(s): Sacha Varin

Robust regression techniques are relevant tools for investigating data contaminated with influential observations. The article briefly reviews and describes 7 robust estimators for linear regression, including popular ones (Huber M, Tukey's bisquare M, and least absolute deviation, also called L1 or median regression), some that combine high breakdown and high efficiency [fast MM (Modified M-estimator), fast τ-estimator, and HBR (high breakdown rank-based)], and one designed for small samples [distance-constrained maximum likelihood (DCML)]. We include the fast MM and fast τ-estimators because we use the fast-robust bootstrap (FRB) for MM and τ-estimators. Our objective is to compare predictive performance on a real data application using OLS (ordinary least squares) and to propose alternatives based on the 7 robust estimators. We also run simulations under various combinations of 4 factors: sample size, percentage of outliers, percentage of leverage points, and number of covariates. Predictive performance is evaluated by cross-validation, minimizing the mean squared error (MSE). We use the R language for data analysis. On the real dataset, OLS provides the best prediction; DCML and the popular robust estimators, especially the Huber M-estimator, also give good predictive results. In simulations involving 3 predictors and n=50, the results clearly favor fast MM, the fast τ-estimator, and HBR, whatever the proportion of outliers. DCML and Tukey M are also good estimators when n=50, especially when the percentage of outliers is small (5% and 10%). With 10 predictors, however, HBR, fast MM, fast τ, and especially DCML give better results for n=50, while HBR, fast MM, and DCML provide better results for n=500. For n=5000, all the robust estimators give the same results independently of the percentage of outliers. If we vary the percentages of outliers and leverage points simultaneously, DCML, fast MM, and HBR are good estimators for n=50 and p=3. For n=500, fast MM, fast τ, and HBR provide…
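The article's evaluation protocol (K-fold cross-validated MSE on contaminated data) is easy to reproduce in miniature. Fast MM, fast τ, HBR, and DCML have no off-the-shelf statsmodels implementations, so the sketch below, a rough illustration rather than the article's R workflow, uses OLS, Huber M, and Tukey bisquare M as representatives; the data, contamination scheme, and fold count are all assumptions.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)

# Toy contaminated data: 10% vertical outliers (illustrative, not the
# article's real dataset or its full simulation design).
n, p = 200, 3
X = rng.normal(size=(n, p))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(size=n)
bad = rng.choice(n, size=n // 10, replace=False)
y[bad] += rng.normal(loc=15.0, scale=5.0, size=bad.size)

def cv_mse(fit, k=5):
    """K-fold cross-validated mean squared prediction error."""
    errs = []
    for tr, te in KFold(k, shuffle=True, random_state=0).split(X):
        b = fit(X[tr], y[tr])
        errs.append(np.mean((y[te] - X[te] @ b) ** 2))
    return float(np.mean(errs))

fits = {
    "OLS":     lambda X, y: sm.OLS(y, X).fit().params,
    "Huber M": lambda X, y: sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params,
    "Tukey M": lambda X, y: sm.RLM(y, X, M=sm.robust.norms.TukeyBiweight()).fit().params,
}
for name, fit in fits.items():
    print(f"{name:8s} CV-MSE = {cv_mse(fit):.3f}")
```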


2014 · Vol. 2014 · pp. 1–8
Author(s): Jibo Wu

We introduce an unbiased two-parameter estimator based on prior information and the two-parameter estimator proposed by Özkale and Kaçıranlar (2007). We then discuss its properties; our results show that the new estimator is better than the two-parameter estimator, the ordinary least squares estimator, and the almost unbiased two-parameter estimator proposed by Wu and Yang (2013). Finally, we give a simulation study to illustrate the theoretical results.
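For orientation, the estimator this abstract builds on has a simple closed form. A minimal numpy sketch, assuming the commonly cited statement of the Özkale–Kaçıranlar (2007) two-parameter estimator (the abstract's unbiased, prior-information variant is not fully specified here, so it is not reproduced):

```python
import numpy as np

def two_parameter(X, y, k, d):
    """Ozkale-Kaciranlar (2007) two-parameter estimator, in its commonly
    cited form  b(k, d) = (X'X + kI)^{-1} (X'y + k*d*b_OLS),  k > 0,
    0 < d < 1.  Setting d = 0 recovers the ridge estimator, k = 1 the Liu
    estimator, and k -> 0 the OLS estimator.
    """
    p = X.shape[1]
    S = X.T @ X
    b_ols = np.linalg.solve(S, X.T @ y)          # ordinary least squares
    return np.linalg.solve(S + k * np.eye(p), X.T @ y + k * d * b_ols)
```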


2019 · Vol. 2019 · pp. 1–10
Author(s): Adewale F. Lukman, Kayode Ayinde, Sek Siok Kun, Emmanuel T. Adewuyi

The literature has shown that the ordinary least squares estimator (OLSE) is not best when the explanatory variables are related, that is, when multicollinearity is present: the estimator becomes unstable and can give misleading conclusions. In this study, a modified new two-parameter estimator based on prior information for the vector of parameters is proposed to circumvent the problem of multicollinearity. This new estimator includes as special cases the ordinary least squares estimator (OLSE), the ridge regression estimator (RRE), the Liu estimator (LE), the modified ridge estimator (MRE), and the modified Liu estimator (MLE). Furthermore, conditions for the superiority of the new estimator over OLSE, RRE, LE, MRE, MLE, and the two-parameter estimator proposed by Özkale and Kaçıranlar (2007) were obtained using the mean squared error matrix criterion. Finally, a numerical example and a simulation study were conducted to illustrate the theoretical results.
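The mean squared error matrix criterion used here has a mechanical check: estimator 1 dominates estimator 2 exactly when the difference of their MSE matrices is positive semidefinite. The generic sketch below illustrates that check for linear estimators of the form b = C X'y (the paper's new estimator itself is not reproduced; the C matrices, beta, sigma2, and k are hypothetical values chosen for illustration).

```python
import numpy as np

def mse_matrix(C, S, beta, sigma2):
    """MSE matrix of a linear estimator b = C X'y under y = X beta + e,
    Cov(e) = sigma^2 I:  MSE(b) = sigma^2 C S C' + bias bias',
    where bias = (C S - I) beta and S = X'X."""
    bias = (C @ S - np.eye(len(beta))) @ beta
    return sigma2 * (C @ S @ C.T) + np.outer(bias, bias)

def dominates(M1, M2, tol=1e-10):
    """Estimator 1 beats estimator 2 under the MSE-matrix criterion
    iff M2 - M1 is positive semidefinite."""
    return bool(np.all(np.linalg.eigvalsh(M2 - M1) >= -tol))

# Example: ridge vs. OLS at a hypothetical beta, sigma^2, and k.
S = np.diag([10.0, 1.0, 0.01])                   # ill-conditioned X'X
beta, sigma2, k = np.array([1.0, 1.0, 1.0]), 1.0, 0.5
C_ols = np.linalg.inv(S)
C_ridge = np.linalg.inv(S + k * np.eye(3))
print(dominates(mse_matrix(C_ridge, S, beta, sigma2),
                mse_matrix(C_ols, S, beta, sigma2)))   # True here
```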

