A stepwise discrete variable selection procedure

1977 ◽  
Vol 6 (14) ◽  
pp. 1423-1436 ◽  
Author(s):  
Matthew Goldstein ◽  
William R. Dillon

2018 ◽  
Vol 8 (2) ◽  
pp. 313-341
Author(s):  
Jiajie Chen ◽  
Anthony Hou ◽  
Thomas Y Hou

Abstract In Barber & Candès (2015, Ann. Statist., 43, 2055–2085), the authors introduced a new variable selection procedure, called the knockoff filter, to control the false discovery rate (FDR) and proved that this method achieves exact FDR control. Inspired by that work, we propose a pseudo knockoff filter that inherits some advantages of the original knockoff filter and offers more flexibility in constructing its knockoff matrix. Moreover, we perform a number of numerical experiments that suggest that the pseudo knockoff filter with the half Lasso statistic has FDR control and offers more power than the original knockoff filter with the Lasso path or the half Lasso statistic for the numerical examples considered in this paper. Although we cannot establish rigorous FDR control for the pseudo knockoff filter, we provide some partial analysis of the pseudo knockoff filter with the half Lasso statistic and establish a uniform false discovery proportion bound and an expectation inequality.
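The FDR control in knockoff-type methods comes from a data-dependent threshold on per-feature statistics W_j, where negative statistics act as an empirical estimate of the number of false discoveries. As a minimal sketch of that selection step only (not the paper's pseudo knockoff construction, which also requires building a knockoff design matrix; the variable names here are illustrative):

```python
import numpy as np

def knockoff_threshold(W, q=0.2, offset=0):
    """Smallest threshold t with estimated FDP <= q.

    offset=0 gives the knockoff filter, offset=1 the knockoff+ variant.
    """
    # candidate thresholds are the magnitudes of the nonzero statistics
    for t in np.sort(np.abs(W[W != 0])):
        # negatives at or below -t estimate the false discoveries among W >= t
        fdp_hat = (offset + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:
            return t
    return np.inf  # no feasible threshold: select nothing

def knockoff_select(W, q=0.2, offset=0):
    """Indices of the selected variables."""
    return np.where(W >= knockoff_threshold(W, q, offset))[0]
```

Large positive W_j indicates the original variable beat its knockoff copy; the threshold rises until the sign-based FDP estimate drops below the target level q.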


2007 ◽  
Vol 61 (12) ◽  
pp. 1398-1403 ◽  
Author(s):  
Daewon Lee ◽  
Hyeseon Lee ◽  
Chi-Hyuck Jun ◽  
Chang Hwan Chang

2020 ◽  
Vol 3 (1) ◽  
pp. 66-80 ◽  
Author(s):  
Sierra A. Bainter ◽  
Thomas G. McCauley ◽  
Tor Wager ◽  
Elizabeth A. Reynolds Losin

Frequently, researchers in psychology are faced with the challenge of narrowing down a large set of predictors to a smaller subset. There are a variety of ways to do this, but commonly it is done by choosing predictors with the strongest bivariate correlations with the outcome. However, when predictors are correlated, bivariate relationships may not translate into multivariate relationships. Further, any attempts to control for multiple testing are likely to result in extremely low power. Here we introduce a Bayesian variable-selection procedure frequently used in other disciplines, stochastic search variable selection (SSVS). We apply this technique to choosing the best set of predictors of the perceived unpleasantness of an experimental pain stimulus from among a large group of sociocultural, psychological, and neurobiological (functional MRI) individual-difference measures. Using SSVS provides information about which variables predict the outcome while accounting for uncertainty in the other variables in the model. This approach yields new, useful information to guide the choice of relevant predictors. We have provided Web-based open-source software for performing SSVS and visualizing the results.
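A minimal sketch of how an SSVS Gibbs sampler works for linear regression, assuming a spike-and-slab normal mixture prior on each coefficient, error variance fixed at 1 for simplicity, and illustrative hyperparameters (tau0, tau1, pi). This is not the authors' software, just the core mechanism: alternate between sampling coefficients given the inclusion indicators and sampling each indicator from the spike-vs-slab density ratio, then report posterior inclusion probabilities.

```python
import numpy as np

def ssvs(X, y, n_iter=2000, tau0=0.01, tau1=10.0, pi=0.5, seed=0):
    """Posterior inclusion probabilities via a basic SSVS Gibbs sampler.

    Spike: beta_j ~ N(0, tau0^2) when gamma_j = 0 (effectively zero).
    Slab:  beta_j ~ N(0, tau1^2) when gamma_j = 1 (unconstrained).
    Error variance is fixed at 1 to keep the sketch short.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    gamma = np.ones(p, dtype=int)     # start with all predictors included
    incl = np.zeros(p)
    XtX, Xty = X.T @ X, X.T @ y
    for it in range(n_iter):
        # sample beta | gamma, y : conjugate Gaussian update
        D_inv = np.diag(1.0 / np.where(gamma == 1, tau1**2, tau0**2))
        A_inv = np.linalg.inv(XtX + D_inv)
        A_inv = (A_inv + A_inv.T) / 2  # symmetrize against round-off
        beta = rng.multivariate_normal(A_inv @ Xty, A_inv)
        # sample gamma_j | beta_j : Bernoulli from slab/spike density ratio
        for j in range(p):
            d1 = np.exp(-beta[j]**2 / (2 * tau1**2)) / tau1
            d0 = np.exp(-beta[j]**2 / (2 * tau0**2)) / tau0
            gamma[j] = rng.random() < pi * d1 / (pi * d1 + (1 - pi) * d0)
        if it >= n_iter // 2:          # discard first half as burn-in
            incl += gamma
    return incl / (n_iter - n_iter // 2)
```

The returned inclusion probabilities are the quantity researchers inspect: predictors whose probability stays high across the chain are retained, which is how SSVS accounts for uncertainty in the other variables rather than testing each predictor in isolation.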


Author(s):  
RUNZE LI

In this paper, a new variable selection procedure is introduced for the analysis of uniform designs and computer experiments. The new procedure differs from traditional ones in that it deletes insignificant variables and estimates the coefficients of significant variables simultaneously. The new procedure has an oracle property (Fan and Li [8]). It is better than best-subset variable selection in terms of computational cost and model stability, and it is superior to stepwise regression because it does not ignore stochastic errors during the course of selecting variables. The proposed procedure is illustrated by two examples: one a typical example of a uniform design, the other a classical example of a computer experiment.
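The simultaneous delete-and-estimate behavior described above comes from penalized least squares: the penalty shrinks insignificant coefficients exactly to zero while the remaining coefficients are estimated in the same optimization. The paper's oracle-property procedure uses a nonconvex penalty in the spirit of Fan and Li; as a simpler stand-in illustrating the same mechanism, here is coordinate descent for the Lasso (L1) penalty:

```python
import numpy as np

def soft_threshold(z, lam):
    """Closed-form minimizer of (1/2)(b - z)^2 + lam*|b|: shrink toward 0."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_sweeps):
        for j in range(p):
            # partial residual: remove every variable's fit except x_j's
            r_j = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r_j / n
            # soft-thresholding sets weak coefficients exactly to zero,
            # so selection and estimation happen in one step
            beta[j] = soft_threshold(z, lam) / (X[:, j] @ X[:, j] / n)
    return beta
```

Unlike stepwise regression, nothing is tested and discarded in sequence; every coefficient is updated jointly, and the zeros fall out of the optimization itself. (The Lasso lacks the oracle property that the paper's nonconvex penalty provides; it only illustrates the shared selection mechanism.)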

