Smoothly adaptively centered ridge estimator

2021, pp. 104882
Author(s): Edoardo Belli
2018, Vol 51 (2), pp. 165-191
Author(s): A. K. Md. Ehsanes Saleh, M. Arashi, M. Norouzirad, B. M. Golam Kibria

This paper considers the estimation of the parameters of an ANOVA model when sparsity is suspected. Accordingly, we consider the least squares estimator (LSE), the restricted LSE, the preliminary test and Stein-type estimators, together with three penalty estimators, namely the ridge estimator, subset selection rules (the hard threshold estimator), and the LASSO (the soft threshold estimator). We compare and contrast the L2-risk of all the estimators with the lower bound of the L2-risk of LASSO within the family of diagonal projection schemes, which is also the lower bound of the exact L2-risk of LASSO. The comparison shows that LASSO and the LSE, preliminary test, and Stein-type estimators do not uniformly outperform one another. However, when the model is sparse, LASSO outperforms all estimators except the ridge estimator, since LASSO and ridge are L2-risk equivalent under sparsity. We also find that LASSO and the restricted LSE are L2-risk equivalent, and both outperform all estimators (except ridge) depending on the dimension of sparsity. Finally, the ridge estimator outperforms all estimators uniformly. Our findings are based on the L2-risk of the estimators and the lower bound of the risk of LASSO, together with efficiency tables and graphical displays of efficiency, and not on simulation.
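The sketch below is a rough Monte Carlo illustration of the setting the abstract describes, not the paper's analytical derivation: under an orthonormal (diagonal-projection) design each estimator acts coordinate-wise on the LSE, so the empirical L2-risk of the LSE, ridge, hard-threshold (subset selection) and LASSO (soft-threshold) estimators can be compared directly. The sparse mean vector, the ridge constant, the threshold and the replication count are all illustrative assumptions.

```python
# Minimal Monte Carlo sketch (illustrative assumptions, not the paper's derivation):
# empirical L2-risk of coordinate-wise estimators under an orthonormal design.
import numpy as np

rng = np.random.default_rng(0)
p, sigma, n_rep = 20, 1.0, 2000
theta = np.zeros(p)
theta[:4] = 3.0                        # sparse mean vector: 4 active cells (assumed)

k_ridge = 0.5                          # ridge tuning constant (illustrative)
lam = sigma * np.sqrt(2 * np.log(p))   # threshold for hard/soft rules (illustrative)

def l2_risk(estimate_fn):
    """Average squared error over Monte Carlo replications."""
    err = 0.0
    for _ in range(n_rep):
        z = theta + sigma * rng.standard_normal(p)   # LSE under the orthonormal design
        err += np.sum((estimate_fn(z) - theta) ** 2)
    return err / n_rep

risks = {
    "LSE":   l2_risk(lambda z: z),
    "ridge": l2_risk(lambda z: z / (1.0 + k_ridge)),
    "hard":  l2_risk(lambda z: np.where(np.abs(z) > lam, z, 0.0)),
    "LASSO": l2_risk(lambda z: np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)),
}
for name, r in risks.items():
    print(f"{name:6s} empirical L2-risk: {r:.3f}")
```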


1989, Vol 18 (10), pp. 3571-3585
Author(s): S.D. Peddada, A.K. Nigam, A.K. Saxena

Author(s): Qamar Abdulkareem Abdulazeez, Zakariya Yahya Algamal

It is well known that, in the presence of multicollinearity, the Liu estimator is an alternative to the ordinary least squares (OLS) estimator and the ridge estimator. The generalized Liu estimator (GLE) is a generalization of the Liu estimator. However, the efficiency of the GLE depends on appropriately choosing the shrinkage parameter matrix involved in it. In this paper, a particle swarm optimization method, which is a metaheuristic continuous optimization algorithm, is proposed to estimate the shrinkage parameter matrix. The simulation study and real-application results show the superior performance of the proposed method in terms of prediction error.
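A minimal sketch of the idea, assuming a generalized Liu estimator of the form (X'X + I)^(-1)(X'y + D beta_OLS) with D = diag(d) and a bare-bones particle swarm optimizer; the simulated data, the validation split, the swarm hyperparameters and the search box [0, 1]^p are all assumptions made for illustration, not the authors' implementation.

```python
# Rough sketch (assumptions, not the authors' code): tune the diagonal shrinkage
# vector d of a generalized Liu estimator with a bare-bones particle swarm
# optimizer, scoring each candidate by prediction error on a held-out split.
import numpy as np

rng = np.random.default_rng(1)

# Simulated collinear data (illustrative only).
n, p = 100, 5
X = rng.standard_normal((n, p)) + 0.95 * rng.standard_normal((n, 1))
beta_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

X_tr, X_va, y_tr, y_va = X[:70], X[70:], y[:70], y[70:]
XtX, Xty = X_tr.T @ X_tr, X_tr.T @ y_tr
beta_ols = np.linalg.solve(XtX, Xty)

def gle(d):
    """Generalized Liu estimator for a diagonal shrinkage vector d."""
    return np.linalg.solve(XtX + np.eye(p), Xty + d * beta_ols)

def pred_error(d):
    return np.mean((y_va - X_va @ gle(d)) ** 2)

# Minimal particle swarm: positions are candidate d vectors in [0, 1]^p.
n_part, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform(0, 1, (n_part, p))
vel = np.zeros((n_part, p))
pbest, pbest_val = pos.copy(), np.array([pred_error(d) for d in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_part, p))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    vals = np.array([pred_error(d) for d in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("PSO-selected d:", np.round(gbest, 3))
print("validation MSE (GLE):", round(pred_error(gbest), 3))
print("validation MSE (OLS):", round(np.mean((y_va - X_va @ beta_ols) ** 2), 3))
```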


2014, Vol 3 (4), pp. 146
Author(s): Hany Devita, I Komang Gde Sukarsa, I Putu Eka N. Kencana

Ordinary least squares is a parameter estimation method that minimizes the residual sum of squares. If multicollinearity is present in the data, an unbiased estimator with minimum variance cannot be attained. Multicollinearity is a linear correlation between the independent variables in the model. Jackknife Ridge Regression (JRR) is an extension of Generalized Ridge Regression (GRR) for handling multicollinearity. Generalized Ridge Regression overcomes the bias of estimators caused by the presence of multicollinearity by adding a different bias parameter for each independent variable in the least squares equation after transforming the data into an orthogonal form. In addition, JRR can reduce the bias of the ridge estimator. The results showed that the JRR model outperforms the GRR model.
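As a hedged sketch of one common formulation of GRR and its jackknifed version (not necessarily the exact variant used here), the code below works in the canonical form Z = XT with Z'Z diagonal, shrinks each canonical coefficient by the factor 1 - k_i/(lambda_i + k_i) for GRR and by 1 - (k_i/(lambda_i + k_i))^2 for JRR, and maps back with T; the componentwise choice k_i = sigma^2 / gamma_i^2 is an illustrative assumption.

```python
# Illustrative sketch (one common formulation, not necessarily the authors'
# variant) of generalized ridge regression (GRR) and jackknifed ridge (JRR)
# in the canonical (orthogonal) form of the linear model.
import numpy as np

rng = np.random.default_rng(2)

# Collinear design for illustration.
n, p = 80, 4
X = rng.standard_normal((n, 1)) + 0.05 * rng.standard_normal((n, p))
X -= X.mean(axis=0)
beta_true = np.array([2.0, -1.0, 0.5, 1.5])
y = X @ beta_true + rng.standard_normal(n)
y -= y.mean()

# Canonical form: Z = X T with Z'Z = diag(lam).
lam, T = np.linalg.eigh(X.T @ X)       # eigenvalues/eigenvectors of X'X
Z = X @ T
gamma_ols = (Z.T @ y) / lam            # OLS in canonical coordinates
resid = y - Z @ gamma_ols
sigma2 = resid @ resid / (n - p)

# One bias parameter per component (a common, illustrative choice).
k = sigma2 / gamma_ols**2
shrink = k / (lam + k)                 # componentwise (Lambda + K)^(-1) K

gamma_grr = (1.0 - shrink) * gamma_ols       # generalized ridge
gamma_jrr = (1.0 - shrink**2) * gamma_ols    # jackknifed ridge (bias-reduced)

# Back-transform to the original coordinates: beta = T gamma.
print("OLS :", np.round(T @ gamma_ols, 3))
print("GRR :", np.round(T @ gamma_grr, 3))
print("JRR :", np.round(T @ gamma_jrr, 3))
```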

