Dantzig Selector
Recently Published Documents


TOTAL DOCUMENTS

66
(FIVE YEARS 10)

H-INDEX

15
(FIVE YEARS 1)

2021 ◽  
Vol 38 (1) ◽  
pp. 015006
Author(s):  
Huanmin Ge ◽  
Peng Li

Abstract In this paper, we propose a Dantzig selector based on ℓ1 − αℓ2 (0 < α ⩽ 1) minimization for signal recovery. In the Dantzig selector, the constraint ‖A⊤(b − Ax)‖∞ ⩽ η for some small constant η > 0 means that the columns of A are only weakly correlated with the error vector e = Ax − b. First, recovery guarantees based on the restricted isometry property are established. Next, we propose an effective algorithm to solve the proposed Dantzig selector. Finally, we illustrate the proposed model and algorithm with extensive numerical experiments on the recovery of signals under Gaussian, impulsive and uniform noise, showing that the performance of the proposed Dantzig selector is better than that of existing methods.
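The nonconvex ℓ1 − αℓ2 model above requires a specialized solver, but the classical (α = 0) Dantzig selector it generalizes is a plain linear program, which makes the constraint easy to see in code. Below is a minimal sketch (not the authors' algorithm) that solves min ‖x‖1 subject to ‖A⊤(b − Ax)‖∞ ⩽ η with scipy.optimize.linprog; the dimensions, noise level and η are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p, eta = 30, 10, 0.5
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[:2] = [1.5, -2.0]                  # a 2-sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(n)

# LP variables z = [x, u] with u >= |x|; minimizing sum(u) gives ||x||_1.
c = np.concatenate([np.zeros(p), np.ones(p)])
G = A.T @ A
Atb = A.T @ b
I = np.eye(p)
A_ub = np.block([
    [G, np.zeros((p, p))],    #  A^T A x <= eta + A^T b
    [-G, np.zeros((p, p))],   # -A^T A x <= eta - A^T b
    [I, -I],                  #  x - u <= 0
    [-I, -I],                 # -x - u <= 0
])
b_ub = np.concatenate([eta + Atb, eta - Atb, np.zeros(p), np.zeros(p)])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * p + [(0, None)] * p)
x_hat = res.x[:p]             # recovered signal satisfies the DS constraint
```

The two correlation rows encode the box −η ⩽ A⊤b − A⊤Ax ⩽ η; the last two rows are the standard ℓ1 linearization.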


2021 ◽  
Vol 43 (6) ◽  
pp. A4147-A4171
Author(s):  
Sheng Fang ◽  
Yong-Jin Liu ◽  
Xianzhu Xiong

2019 ◽  
Vol 30 (3) ◽  
pp. 697-719 ◽  
Author(s):  
Fan Wang ◽  
Sach Mukherjee ◽  
Sylvia Richardson ◽  
Steven M. Hill

Abstract Penalized likelihood approaches are widely used for high-dimensional regression. Although many methods have been proposed and the associated theory is now well developed, the relative efficacy of different approaches in finite-sample settings, as encountered in practice, remains incompletely understood. There is therefore a need for empirical investigations in this area that can offer practical insight and guidance to users. In this paper, we present a large-scale comparison of penalized regression methods. We distinguish between three related goals: prediction, variable selection and variable ranking. Our results span more than 2300 data-generating scenarios, including both synthetic and semisynthetic data (real covariates and simulated responses), allowing us to systematically consider the influence of various factors (sample size, dimensionality, sparsity, signal strength and multicollinearity). We consider several widely used approaches (Lasso, Adaptive Lasso, Elastic Net, Ridge Regression, SCAD, the Dantzig Selector and Stability Selection). We find considerable variation in performance between methods. Our results support a “no panacea” view, with no unambiguous winner across all scenarios or goals, even in this restricted setting where all data align well with the assumptions underlying the methods. The study allows us to make some recommendations as to which approaches may be most (or least) suitable given the goal and some data characteristics. Our empirical results complement existing theory and provide a resource to compare methods across a range of scenarios and metrics.
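As a concrete illustration of the selection-versus-shrinkage distinction this study investigates, here is a small self-contained sketch (not the paper's benchmark code) contrasting the Lasso, fit by coordinate descent, with ridge regression on synthetic sparse data; the data sizes and penalty values are arbitrary assumptions. The Lasso sets most irrelevant coefficients exactly to zero, while ridge only shrinks them.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(A, b, lam, n_iter=300):
    """Coordinate descent for 0.5*||b - A x||^2 + lam*||x||_1."""
    p = A.shape[1]
    x = np.zeros(p)
    col_sq = (A ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = b - A @ x + A[:, j] * x[j]   # residual excluding feature j
            x[j] = soft_threshold(A[:, j] @ r, lam) / col_sq[j]
    return x

def ridge(A, b, lam):
    """Closed-form ridge solution (A^T A + lam I)^{-1} A^T b."""
    p = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ b)

rng = np.random.default_rng(1)
n, p = 100, 20
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[:3] = [2.0, -1.5, 1.0]             # only 3 of 20 features matter
b = A @ x_true + 0.1 * rng.standard_normal(n)

x_lasso = lasso_cd(A, b, lam=10.0)        # sparse: irrelevant coefficients are 0
x_ridge = ridge(A, b, lam=10.0)           # dense: everything merely shrunk
```

Variable selection can then be read off directly from the Lasso's exact zeros, whereas ridge supports only ranking by magnitude.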


2019 ◽  
Author(s):  
Anna Pidnebesna ◽  
Iveta Fajnerová ◽  
Jiří Horáček ◽  
Jaroslav Hlinka

Abstract The approximate knowledge of the hemodynamic response to neuronal activity is widely used in statistical testing of the effects of external stimulation, but it has also been applied to estimate the neuronal activity directly from functional magnetic resonance data without knowing the stimulus timing. To this end, sparse linear regression methods have previously been used, including the well-known LASSO and the Dantzig selector. These methods generate a parametric family of solutions with different sparsity, among which a final choice is made using some information criterion. As an alternative, we propose a novel approach that instead utilizes the whole family of sparse regression solutions. Their ensemble provides a first approximation of the probability of activation at each timepoint, and together with the conditional neuronal activity distributions estimated with the theory of mixtures with varying concentrations, they serve as the inputs to a Bayes classifier that ultimately decides between true and false activations. As we show in extensive numerical simulations, the new method performs favourably in comparison with standard approaches in a range of realistic scenarios. This is mainly due to the avoidance of the overfitting and underfitting that commonly plague solutions based on sparse regression combined with model selection methods, including the corrected Akaike information criterion. This advantage is finally documented on an fMRI task dataset.
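The ensemble idea, pooling the whole sparsity path rather than picking one solution by an information criterion, can be approximated in a few lines: fit sparse regressions over a grid of penalties and use the frequency with which each coefficient is active as a rough activation probability. This is only a schematic sketch of that first stage (the mixture modelling and the Bayes classifier are omitted), with arbitrary synthetic data standing in for fMRI time series.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(A, b, lam, n_iter=300):
    """Coordinate descent for 0.5*||b - A x||^2 + lam*||x||_1."""
    p = A.shape[1]
    x = np.zeros(p)
    col_sq = (A ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = b - A @ x + A[:, j] * x[j]   # residual excluding feature j
            x[j] = soft_threshold(A[:, j] @ r, lam) / col_sq[j]
    return x

# Synthetic "activity": two truly active predictors among fifteen.
rng = np.random.default_rng(2)
n, p = 80, 15
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[[2, 7]] = [3.0, -2.5]
b = A @ x_true + 0.2 * rng.standard_normal(n)

# Fit the whole path and count how often each coefficient is active.
lams = np.geomspace(1.0, 40.0, 12)
support_counts = np.zeros(p)
for lam in lams:
    support_counts += (lasso_cd(A, b, lam) != 0)
activation_prob = support_counts / len(lams)  # crude per-coefficient probability
```

True activations stay in the support across the whole grid, while spurious ones appear only at the weakest penalties, which is the separation the subsequent classifier exploits.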


2019 ◽  
Vol 2019 (1) ◽  
Author(s):  
Jianfeng Wang ◽  
Zhiyong Zhou ◽  
Jun Yu

Abstract In this paper, we introduce the q-ratio block constrained minimal singular values (BCMSV) as a new measure of the measurement matrix in compressive sensing of block sparse/compressible signals and present an algorithm for computing this new measure. Both the mixed ℓ2/ℓq and the mixed ℓ2/ℓ1 norms of the reconstruction errors for stable and robust recovery using block basis pursuit (BBP), the block Dantzig selector (BDS), and the group lasso in terms of the q-ratio BCMSV are investigated. We establish a sufficient condition based on the q-ratio block sparsity for exact recovery from the noise-free BBP and develop a convex-concave procedure to solve the corresponding non-convex problem in the condition. Furthermore, we prove that for sub-Gaussian random matrices the q-ratio BCMSV is bounded away from zero with high probability when the number of measurements is reasonably large. Numerical experiments are implemented to illustrate the theoretical results. In addition, we demonstrate that the q-ratio BCMSV-based error bounds are tighter than the block restricted isometry constant-based bounds.
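The q-ratio quantities the abstract builds on can be computed directly. Here is a sketch assuming the standard definitions: the mixed ℓ2/ℓq norm is the ℓq norm of the per-block ℓ2 norms, and the q-ratio block sparsity (for q > 1) is (‖x‖2,1/‖x‖2,q) raised to the power q/(q − 1); the block size and test vector are arbitrary.

```python
import numpy as np

def mixed_norm(x, block_size, q):
    """Mixed l2/lq norm: the lq norm of the per-block l2 norms."""
    block_l2 = np.linalg.norm(x.reshape(-1, block_size), axis=1)
    return np.linalg.norm(block_l2, ord=q)

def q_ratio_block_sparsity(x, block_size, q):
    """q-ratio block sparsity (q > 1): (||x||_{2,1} / ||x||_{2,q})**(q/(q-1))."""
    num = mixed_norm(x, block_size, 1)
    den = mixed_norm(x, block_size, q)
    return (num / den) ** (q / (q - 1))

# Four blocks of size two; exactly two blocks are nonzero, with equal energy.
x = np.array([3.0, 4.0, 5.0, 0.0, 0.0, 0.0, 0.0, 0.0])
k = q_ratio_block_sparsity(x, block_size=2, q=2)  # -> 2.0, the true block sparsity
```

When the nonzero blocks have equal ℓ2 norms, the q-ratio block sparsity coincides with the count of nonzero blocks, and it degrades gracefully toward smaller values as the block energies become unequal.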


2019 ◽  
Vol 12 (07) ◽  
pp. 2050143
Author(s):  
Chol-Guk Choe ◽  
Myong-Gil Rim ◽  
Ji-Song Ryang

This paper considers the recovery of signals that are sparse or approximately sparse in terms of a general frame from undersampled data corrupted with additive noise. We show that the properly constrained ℓ1-analysis, called the general-dual-based analysis Dantzig selector, stably recovers a signal that is nearly sparse in terms of a general dual frame, provided that the measurement matrix satisfies a restricted isometry property adapted to the general frame. As a special case, we consider Gaussian noise.
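Like its synthesis counterpart, an analysis-form Dantzig selector is a linear program. The sketch below is an assumption-laden illustration, not the paper's exact estimator: it minimizes ‖D⊤x‖1 subject to ‖D⊤A⊤(b − Ax)‖∞ ⩽ η for a simple Parseval frame D built by concatenating the identity with a random orthonormal basis; the frame construction, problem sizes and η are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
n, p, eta = 40, 12, 0.5
# Parseval frame D (p x d): identity stacked with a random orthonormal basis,
# scaled so that D @ D.T == I.
H, _ = np.linalg.qr(rng.standard_normal((p, p)))
D = np.hstack([np.eye(p), H]) / np.sqrt(2.0)
d = D.shape[1]

A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[[1, 5]] = [2.0, -1.0]
b = A @ x_true + 0.01 * rng.standard_normal(n)

# LP variables z = [x, u] with u >= |D^T x|; minimizing sum(u) gives ||D^T x||_1.
c = np.concatenate([np.zeros(p), np.ones(d)])
M = D.T @ A.T @ A                       # correlation-constraint matrix
rhs = D.T @ A.T @ b
A_ub = np.block([
    [M, np.zeros((d, d))],              #  M x <= eta + rhs
    [-M, np.zeros((d, d))],             # -M x <= eta - rhs
    [D.T, -np.eye(d)],                  #  D^T x - u <= 0
    [-D.T, -np.eye(d)],                 # -D^T x - u <= 0
])
b_ub = np.concatenate([eta + rhs, eta - rhs, np.zeros(d), np.zeros(d)])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * p + [(0, None)] * d)
x_hat = res.x[:p]
```

Because the true signal is feasible, the optimal frame-domain ℓ1 value can be no larger than ‖D⊤x_true‖1, which gives a cheap sanity check on the solver output.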


Author(s):  
Pengbo Geng ◽  
Peng Li ◽  
Wengu Chen

This paper considers the recovery condition for signals from undersampled data corrupted with additive noise, in the framework of cumulative coherence. We establish sufficient conditions that guarantee stable recovery via quadratically constrained basis pursuit (QCBP), the Dantzig selector (DS) and the Lasso estimator.
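Cumulative coherence (the Babel function) μ1(s) is straightforward to compute for small matrices: for each column, sum the s largest absolute inner products with the other normalized columns, then take the maximum over columns. The sketch below assumes this standard definition; the test matrix is arbitrary.

```python
import numpy as np

def cumulative_coherence(A, s):
    """Babel function mu_1(s): worst-case sum of the s largest absolute
    inner products between one normalized column and the other columns."""
    A = A / np.linalg.norm(A, axis=0)   # normalize the columns
    G = np.abs(A.T @ A)
    np.fill_diagonal(G, 0.0)            # drop the self inner products
    top_s = np.sort(G, axis=1)[:, ::-1][:, :s]
    return top_s.sum(axis=1).max()

# Three unit columns in R^2: the third is equally correlated with the first two.
A = np.array([[1.0, 0.0, 1.0 / np.sqrt(2)],
              [0.0, 1.0, 1.0 / np.sqrt(2)]])
mu_1 = cumulative_coherence(A, 1)   # ordinary mutual coherence, 1/sqrt(2)
mu_2 = cumulative_coherence(A, 2)   # sqrt(2)
```

μ1(1) reduces to the ordinary mutual coherence, and μ1 is nondecreasing in s, which is why coherence-based recovery conditions tighten as the sparsity level grows.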

