Fully Gibbs Sampling Algorithms for Bayesian Variable Selection in Latent Regression Models

2021 ◽  
Author(s):  
Kazuhiro Yamaguchi ◽  
Jihong Zhang

This study proposed efficient Gibbs sampling algorithms for variable selection in a latent regression model under a unidimensional two-parameter logistic item response theory model. Three types of shrinkage priors were employed to obtain shrinkage estimates: double-exponential (i.e., Laplace), horseshoe, and horseshoe+ priors. These shrinkage priors were compared to a uniform prior in both a simulation study and a real data analysis. The simulation study revealed that the two types of horseshoe priors had smaller root mean square errors and shorter 95% credible interval lengths than the double-exponential or uniform priors. In addition, the horseshoe+ prior was slightly more stable than the horseshoe prior. The real data example demonstrated the utility of the horseshoe and horseshoe+ priors in selecting effective predictive covariates for math achievement. In the final section, we discuss the benefits and limitations of the three types of Bayesian variable selection methods.
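For readers unfamiliar with the model class, the setup described above can be sketched as follows (the notation here is generic and not taken from the paper): the 2PL IRT model links binary item responses to a latent ability, and the latent regression expresses that ability through covariates, whose coefficients are the targets of the shrinkage priors.

```latex
% 2PL IRT model for person i and item j, with discrimination a_j and difficulty b_j:
P(y_{ij} = 1 \mid \theta_i) = \frac{1}{1 + \exp\{-a_j(\theta_i - b_j)\}}
% Latent regression of ability on covariates; variable selection acts on \beta:
\theta_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \varepsilon_i,
\qquad \varepsilon_i \sim N(0, \sigma^2)
% Example shrinkage prior (horseshoe):
\beta_k \mid \lambda_k, \tau \sim N(0, \lambda_k^2 \tau^2),
\qquad \lambda_k \sim C^{+}(0, 1), \quad \tau \sim C^{+}(0, 1)
```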

2021 ◽  
Vol 26 (5) ◽  
pp. 44-57
Author(s):  
Zainab Sami ◽  
Taha Alshaybawee

Lasso variable selection is an attractive approach to improve prediction accuracy. A Bayesian lasso approach is suggested to estimate and select the important variables in a single-index logistic regression model. A Laplace distribution is set as the prior for the coefficient vector, and a Gaussian process is set as the prior for the unknown link function. A hierarchical Bayesian lasso semiparametric logistic regression model is constructed, and an MCMC algorithm is adopted for posterior inference. The performance of the proposed method (BSLLR) is evaluated by comparing it to three existing methods: BLR, BPR, and BBQR. Simulation examples and real data are considered. The results indicate that the proposed method attains the smallest bias, SD, MSE, and MAE in both simulation and real data, so BSLLR performs better than the other methods.
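The Bayesian lasso machinery referenced above can be illustrated in the much simpler Gaussian linear model (the paper's actual model is single-index logistic with a Gaussian-process link, which is considerably more involved). The following is a minimal Park-and-Casella-style Gibbs sampler; the function name and all tuning values are mine, not the authors':

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, burn=500, seed=0):
    """Minimal Bayesian lasso Gibbs sampler for a Gaussian linear model
    (a simplification of the paper's semiparametric logistic setting).
    Returns the posterior means of the regression coefficients."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta, sigma2, inv_tau2 = np.zeros(p), 1.0, np.ones(p)
    XtX, Xty = X.T @ X, X.T @ y
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), A = X'X + D_tau^{-1}
        A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        # sigma2 | rest ~ Inverse-Gamma((n-1)/2 + p/2, ...)
        resid = y - X @ beta
        rate = resid @ resid / 2 + np.sum(beta**2 * inv_tau2) / 2
        sigma2 = rate / rng.gamma((n - 1) / 2 + p / 2)
        if it >= burn:
            draws.append(beta)
    return np.mean(draws, axis=0)
```

The Laplace prior enters only through the inverse-Gaussian update of the local variances `tau_j^2`, which is what makes every conditional conjugate.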


Sankhya A ◽  
2017 ◽  
Vol 80 (2) ◽  
pp. 215-246 ◽  
Author(s):  
Xueying Tang ◽  
Xiaofan Xu ◽  
Malay Ghosh ◽  
Prasenjit Ghosh

Author(s):  
Kaito Shimamura ◽  
Shuichi Kawano

Sparse convex clustering groups observations and performs variable selection simultaneously within the framework of convex clustering. Although a weighted L1 norm is usually employed for the regularization term in sparse convex clustering, its use increases the dependence on the data and reduces the estimation accuracy when the sample size is insufficient. To tackle these problems, this paper proposes a Bayesian sparse convex clustering method based on the ideas of the Bayesian lasso and global-local shrinkage priors. We introduce Gibbs sampling algorithms for our method using scale mixtures of normal distributions. The effectiveness of the proposed methods is shown in simulation studies and a real data analysis.
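The scale-mixture device mentioned above can be checked numerically: drawing a variance from an exponential distribution and then a normal with that variance yields a Laplace draw, which is what makes conditionally conjugate Gibbs updates possible. A short demonstration (my own, not from the paper):

```python
import numpy as np

# Scale-mixture representation behind such Gibbs samplers:
# if tau2 ~ Exponential(rate = lam^2 / 2) and beta | tau2 ~ N(0, tau2),
# then marginally beta ~ Laplace with rate lam (so Var = 2 / lam^2).
rng = np.random.default_rng(0)
lam = 2.0
tau2 = rng.exponential(scale=2.0 / lam**2, size=200_000)  # numpy's scale = 1/rate
beta = rng.normal(0.0, np.sqrt(tau2))

print(beta.var())  # close to 2 / lam**2 = 0.5
print(np.mean(beta**4) / beta.var() ** 2)  # kurtosis near 6 (Laplace), not 3 (normal)
```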


2021 ◽  
Vol 12 ◽  
Author(s):  
Xi Lu ◽  
Kun Fan ◽  
Jie Ren ◽  
Cen Wu

In high-throughput genetics studies, an important aim is to identify gene–environment interactions associated with clinical outcomes. Recently, multiple marginal penalization methods have been developed and shown to be effective in G×E studies. However, within the Bayesian framework, marginal variable selection has not received much attention. In this study, we propose a novel marginal Bayesian variable selection method for G×E studies. In particular, our marginal Bayesian method is robust to data contamination and outliers in the outcome variables. With the incorporation of spike-and-slab priors, we have implemented a Gibbs sampler based on Markov chain Monte Carlo (MCMC). The proposed method outperforms a number of alternatives in extensive simulation studies. The utility of the marginal robust Bayesian variable selection method has been further demonstrated in case studies using data from the Nurses' Health Study (NHS). Some of the identified main and interaction effects from the real data analysis have important biological implications.
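The marginal flavor of spike-and-slab selection can be illustrated with a toy version: screen each covariate separately by comparing a point-mass spike at zero against a normal slab in a Gaussian outcome model. This is a generic conjugate calculation (my own illustration, not the paper's robust G×E method, which also handles outliers):

```python
import numpy as np

def marginal_inclusion_prob(x, y, v=1.0, sigma2=1.0, prior_p=0.5):
    """Posterior inclusion probability for a single covariate x under a
    point-mass spike at 0 and a N(0, v) slab, with Gaussian outcome y
    and known noise variance sigma2. Closed form via the marginal
    likelihood ratio (Sherman-Morrison on sigma2*I + v*x*x')."""
    xtx, xty = x @ x, x @ y
    # log Bayes factor of slab vs. spike
    log_bf = (-0.5 * np.log(1.0 + v * xtx / sigma2)
              + 0.5 * v * xty**2 / (sigma2 * (sigma2 + v * xtx)))
    odds = prior_p / (1.0 - prior_p) * np.exp(log_bf)
    return odds / (1.0 + odds)
```

Applied to each covariate in turn, this ranks predictors by how strongly the data favor the slab, which is the basic idea a marginal Bayesian selection method builds on.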


2019 ◽  
Author(s):  
Sierra Bainter ◽  
Thomas Granville McCauley ◽  
Tor D Wager ◽  
Elizabeth Reynolds Losin

In this paper we address the problem of selecting important predictors from some larger set of candidate predictors. Standard techniques are limited by lack of power and high false positive rates. A Bayesian variable selection approach used widely in biostatistics, stochastic search variable selection, can be used instead to combat these issues by accounting for uncertainty in the other predictors of the model. In this paper we present Bayesian variable selection to aid researchers facing this common scenario, along with an online application (https://ssvsforpsych.shinyapps.io/ssvsforpsych/) to perform the analysis and visualize the results. Using an application to predict pain ratings, we demonstrate how this approach quickly identifies reliable predictors, even when the set of possible predictors is larger than the sample size. This technique is widely applicable to research questions that may be relatively data-rich, but with limited information or theory to guide variable selection.
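A minimal sketch of stochastic search variable selection in the George-and-McCulloch style may help make the idea concrete: each coefficient gets a narrow "spike" or wide "slab" normal prior, and a Gibbs sampler alternates between the coefficients and the binary inclusion indicators. This is my own simplified sketch with a known noise variance, not the SSVSforPsych implementation:

```python
import numpy as np

def ssvs(X, y, tau=0.05, c=10.0, prior_p=0.5, sigma2=1.0,
         n_iter=3000, burn=1000, seed=0):
    """Stochastic search variable selection for a Gaussian linear model
    with known noise variance sigma2. Spike: beta_j ~ N(0, tau^2);
    slab: beta_j ~ N(0, (c*tau)^2). Returns posterior inclusion
    probabilities for each predictor."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    gamma = np.ones(p, dtype=bool)  # start with all predictors included
    XtX, Xty = X.T @ X, X.T @ y
    incl = np.zeros(p)
    for it in range(n_iter):
        # beta | gamma, y: conjugate normal, prior variances set by gamma
        prior_var = np.where(gamma, (c * tau) ** 2, tau**2)
        A_inv = np.linalg.inv(XtX / sigma2 + np.diag(1.0 / prior_var))
        beta = rng.multivariate_normal(A_inv @ Xty / sigma2, A_inv)
        # gamma_j | beta_j: Bernoulli from the slab/spike density ratio
        log_slab = -0.5 * beta**2 / (c * tau) ** 2 - np.log(c * tau)
        log_spike = -0.5 * beta**2 / tau**2 - np.log(tau)
        log_odds = np.log(prior_p / (1 - prior_p)) + log_slab - log_spike
        prob = 1.0 / (1.0 + np.exp(-np.clip(log_odds, -30, 30)))
        gamma = rng.random(p) < prob
        if it >= burn:
            incl += gamma
    return incl / (n_iter - burn)
```

The inclusion probabilities account for uncertainty in the other predictors because every indicator is resampled against coefficients drawn under the current configuration, which is the property the abstract highlights.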


2020 ◽  
Vol 21 (1) ◽  
Author(s):  
Matthew D. Koslovsky ◽  
Marina Vannucci

An amendment to this paper has been published and can be accessed via the original article.


Author(s):  
Yinsen Miao ◽  
Jeong Hwan Kook ◽  
Yadong Lu ◽  
Michele Guindani ◽  
Marina Vannucci
