generalized lasso
Recently Published Documents

TOTAL DOCUMENTS: 29 (FIVE YEARS: 2)
H-INDEX: 8 (FIVE YEARS: 0)

Author(s): Wencan Zhu, Céline Lévy-Leduc, Nils Ternès

Abstract — Motivation: In genomic studies, identifying biomarkers associated with a variable of interest is a major concern in biomedical research. Regularized approaches are classically used to perform variable selection in high-dimensional linear models, but these methods can fail in highly correlated settings. Results: We propose a novel variable selection approach called WLasso that takes these correlations into account. It consists of rewriting the initial high-dimensional linear model to remove the correlation between the biomarkers (predictors) and then applying the generalized Lasso criterion. The performance of WLasso is assessed on synthetic data in several scenarios and compared with recent alternative approaches. The results show that when the biomarkers are highly correlated, WLasso outperforms the other approaches in sparse high-dimensional frameworks. The method is also illustrated on publicly available gene expression data in breast cancer. Availability and implementation: Our method is implemented in the WLasso R package, which is available from the Comprehensive R Archive Network (CRAN). Supplementary information: Supplementary data are available at Bioinformatics online.
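The generalized Lasso criterion mentioned in the abstract penalizes a linear transform of the coefficients: minimize ½‖y − Xβ‖² + λ‖Dβ‖₁. The paper's specific decorrelating rewrite of the model is not reproduced here; purely as a hedged sketch of the criterion itself (all names below are our own, not the WLasso package API), the problem can be solved with a standard ADMM splitting:

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def generalized_lasso_admm(X, y, D, lam, rho=1.0, n_iter=500):
    # Solve min_beta 0.5 * ||y - X beta||^2 + lam * ||D beta||_1
    # via ADMM on the split z = D beta (scaled dual variable u).
    m = D.shape[0]
    z = np.zeros(m)
    u = np.zeros(m)
    # The same linear system is solved at every beta-update.
    A = X.T @ X + rho * D.T @ D
    Xty = X.T @ y
    for _ in range(n_iter):
        beta = np.linalg.solve(A, Xty + rho * D.T @ (z - u))
        z = soft_threshold(D @ beta + u, lam / rho)
        u = u + D @ beta - z
    return beta
```

With D equal to the identity this reduces to the ordinary Lasso; other choices of D (for example a first-difference matrix) give the fused lasso and related variants of the generalized Lasso family.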



IEEE Access, 2021, pp. 1-1
Author(s): Wenyu Liu, Hao Yan, Qing Wang, Nan Hu, Ni Liu, ...


Author(s): Aaron Berk, Yaniv Plan, Özgür Yilmaz

Abstract — The use of the generalized Lasso is a common technique for the recovery of structured high-dimensional signals. There are three common formulations of the generalized Lasso; each program has a governing parameter whose optimal value depends on properties of the data. At this optimal value, compressed sensing theory explains why Lasso programs recover structured high-dimensional signals with minimax order-optimal error. Unfortunately, in practice the optimal choice is generally unknown and must be estimated. Thus, we investigate the stability of each of the three Lasso programs with respect to its governing parameter. Our goal is to aid the practitioner in answering the following question: given real data, which Lasso program should be used? We take a step towards answering this by analysing the case where the measurement matrix is the identity (the so-called proximal denoising setup) and we use $\ell _{1}$ regularization. For each Lasso program, we specify settings in which that program is provably unstable with respect to its governing parameter. We support our analysis with detailed numerical simulations. For example, there are settings where a 0.1% underestimate of a Lasso parameter can increase the error significantly and a 50% underestimate can cause the error to increase by a factor of $10^{9}$.
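In the proximal denoising setup described above (identity measurement matrix with ℓ1 regularization), the unconstrained Lasso program has a closed-form solution: elementwise soft-thresholding of the observation at the regularization parameter λ. A minimal sketch of this special case (the function and variable names are ours, not the paper's):

```python
import numpy as np

def prox_l1(y, lam):
    # argmin_x 0.5 * ||y - x||^2 + lam * ||x||_1,
    # solved coordinate-wise by soft-thresholding.
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Entries larger than lam in magnitude are shrunk toward zero by lam;
# the remaining entries are set exactly to zero.
y = np.array([3.0, -1.2, 0.4, 0.0])
denoised = prox_l1(y, 0.5)  # -> values [2.5, -0.7, 0.0, 0.0]
```

The stability question in the abstract concerns how the error of this estimator (and of the two constrained formulations) moves as the governing parameter is perturbed away from its optimal value.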



2020, Vol 49 (1), pp. 30-55
Author(s): Markus Grasmair, Timo Klock, Valeriya Naumova




2020, Vol 66 (4), pp. 2487-2500
Author(s): Christos Thrampoulidis, Ankit Singh Rawat


2020, Vol 27, pp. 356-360
Author(s): Xianghui Wang, Jacob Benesty, Jingdong Chen, Israel Cohen


2019, Vol 13 (2), pp. 2307-2347
Author(s): Alnur Ali, Ryan J. Tibshirani


2018, Vol 40 (12), pp. 2992-3006
Author(s): Shaogang Ren, Shuai Huang, Jieping Ye, Xiaoning Qian

