On consistency and optimality of Bayesian variable selection based on g-prior in normal linear regression models
2014, Vol 67 (5), pp. 963-997
Author(s): Minerva Mukhopadhyay, Tapas Samanta, Arijit Chakrabarti
Entropy, 2020, Vol 22 (6), pp. 661
Author(s): Shintaro Hashimoto, Shonosuke Sugasawa

Although linear regression models are fundamental tools in statistical science, their estimation results can be sensitive to outliers. While several robust methods have been proposed in frequentist frameworks, statistical inference with them is not necessarily straightforward. Here we propose a Bayesian approach to robust inference on linear regression models using synthetic posterior distributions based on the γ-divergence, which enables us to assess the uncertainty of the estimation naturally through the posterior distribution. We also consider the use of shrinkage priors on the regression coefficients to carry out robust Bayesian variable selection and estimation simultaneously. We develop an efficient posterior computation algorithm by adopting the Bayesian bootstrap within Gibbs sampling. The performance of the proposed method is illustrated through simulation studies and applications to well-known datasets.
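The γ-divergence idea above can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it drops the Bayesian bootstrap and Gibbs sampling, fixes the scale σ, and simply minimizes a negative γ-likelihood for a normal regression model (for fixed σ, the normalizing term of the γ-loss is constant in β and can be dropped). All data and tuning values (γ = 0.5, σ = 0.5) are assumptions for the demo.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 100
X = rng.normal(size=(n, 2))
beta_true = np.array([2.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)
y[:5] += 10.0  # contaminate a few responses with gross outliers

def neg_gamma_likelihood(beta, gamma=0.5, sigma=0.5):
    # Negative gamma-likelihood for N(x'beta, sigma^2) with sigma held fixed;
    # outliers get density ~0, so they contribute almost nothing to the fit.
    r = y - X @ beta
    dens = np.exp(-0.5 * (r / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)
    return -np.log(np.mean(dens ** gamma)) / gamma

ols = np.linalg.lstsq(X, y, rcond=None)[0]           # non-robust baseline
robust = minimize(neg_gamma_likelihood, x0=ols, method="Nelder-Mead").x
print("OLS:", ols, "gamma-robust:", robust)
```

Unlike the OLS baseline, the γ-based estimate effectively downweights the contaminated observations, which is the robustness property the synthetic posterior inherits.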


2020, Vol 2020, pp. 1-7
Author(s): Manickavasagar Kayanan, Pushpakanthie Wijekoon

Among variable selection methods, the LASSO is the most widely used procedure for handling regularization and variable selection simultaneously in high-dimensional linear regression models. However, the LASSO is unstable when high multicollinearity exists among the predictor variables, and the elastic-net (Enet) estimator has been used to overcome this issue. According to the literature, the estimation of regression parameters can be improved by adding prior information about the regression coefficients to the model, which is available in the form of exact or stochastic linear restrictions. In this article, we propose a stochastic restricted LASSO-type estimator (SRLASSO) that incorporates stochastic linear restrictions. Furthermore, we compare the performance of SRLASSO with the LASSO and Enet in terms of the root mean square error (RMSE) and mean absolute prediction error (MAPE) criteria in a Monte Carlo simulation study. Finally, a real-world example is used to demonstrate the performance of SRLASSO.
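The abstract does not spell out how SRLASSO folds the stochastic restriction r = Rβ + φ into the LASSO, so the sketch below uses a common stand-in, mixed estimation: the weighted restriction is appended to the data as pseudo-observations and an ordinary coordinate-descent LASSO is run on the augmented design. The restriction (β₁ + β₃ ≈ 1), the weight τ, and all simulated data are hypothetical choices for illustration, not values from the article.

```python
import numpy as np

def lasso_cd(X, y, alpha, n_iter=300):
    """Coordinate-descent LASSO: min (1/2n)||y - Xb||^2 + alpha * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with feature j removed, then soft-threshold
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            b[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / col_ss[j]
    return b

rng = np.random.default_rng(1)
n, p = 80, 5
z = rng.normal(size=(n, 1))
X = z + 0.1 * rng.normal(size=(n, p))        # strongly collinear predictors
beta = np.array([3.0, 0.0, -2.0, 0.0, 0.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Stochastic linear restriction r = R beta + phi (illustrative values):
R = np.array([[1.0, 0.0, 1.0, 0.0, 0.0]])    # encodes beta_1 + beta_3 ~ 1
r = np.array([1.0])
tau = 3.0                                    # confidence weight (hypothetical)

# Mixed estimation: append the weighted restriction as pseudo-observations.
X_aug = np.vstack([X, tau * R])
y_aug = np.concatenate([y, tau * r])

b_lasso = lasso_cd(X, y, alpha=0.05)
b_restricted = lasso_cd(X_aug, y_aug, alpha=0.05)
print("LASSO:      ", b_lasso)
print("restricted: ", b_restricted)
```

The restricted fit uses the same LASSO solver; only the augmented rows carry the prior information, which is what lets the restriction stabilize the estimate when the design is nearly collinear.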

