Nonconvex Sparse Logistic Regression via Proximal Gradient Descent

Author(s):  
Xinyue Shen ◽  
Yuantao Gu
2018 ◽  
Vol 8 (9) ◽  
pp. 1569

Author(s):  
Shengbing Wu ◽  
Hongkun Jiang ◽  
Haiwei Shen ◽  
Ziyi Yang

In recent years, gene selection for cancer classification based on the expression of a small number of gene biomarkers has been the subject of much research in genetics and molecular biology. The successful identification of gene biomarkers will help in the classification of different types of cancer and improve prediction accuracy. Recently, regularized logistic regression using the L1 penalty has been successfully applied in high-dimensional cancer classification to tackle both the estimation of gene coefficients and the simultaneous performance of gene selection. However, the L1 penalty leads to biased gene selection and does not have the oracle property. To address these problems, we investigate L1/2 regularized logistic regression for gene selection in cancer classification. Experimental results on three DNA microarray datasets demonstrate that our proposed method outperforms other commonly used sparse methods (L1 and LEN) in terms of classification performance.
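The sparsity-inducing penalty enters such models through its proximal operator; for L1 this is soft-thresholding, which zeroes out small coefficients and thereby performs gene selection (the nonconvex L1/2 penalty studied above uses a different, half-thresholding operator). A minimal pure-Python sketch of L1-regularized logistic regression fitted by proximal gradient descent, on toy synthetic data with one informative feature, might look as follows; all names, data, and parameter values here are illustrative, not taken from the paper.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def soft_threshold(x, t):
    # Proximal operator of the L1 penalty: shrinks coefficients toward zero
    # and sets those with magnitude below t exactly to zero.
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def l1_logistic_regression(X, y, lam=0.1, step=0.1, iters=500):
    """Proximal gradient descent on the average logistic loss + lam * ||w||_1."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        # Gradient of the average logistic loss.
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            margin = sum(wj * xij for wj, xij in zip(w, xi))
            err = sigmoid(margin) - yi
            for j in range(p):
                grad[j] += err * xi[j] / n
        # Gradient step followed by the L1 proximal (soft-thresholding) step.
        w = [soft_threshold(wj - step * gj, step * lam)
             for wj, gj in zip(w, grad)]
    return w

# Toy data: only feature 0 is informative, so the L1 penalty should keep
# most of the remaining coefficients at exactly zero.
random.seed(0)
X = [[random.gauss(0, 1) for _ in range(5)] for _ in range(200)]
y = [1 if xi[0] > 0 else 0 for xi in X]
w = l1_logistic_regression(X, y)
selected = [j for j, wj in enumerate(w) if wj != 0.0]
```

The exact zeros produced by soft-thresholding are what make the fitted model double as a feature (gene) selector, in contrast to ridge-style penalties that only shrink coefficients.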


2021 ◽  
Vol 5 (1) ◽  
pp. 22
Author(s):  
Heena Tyagi ◽  
Emma Daulton ◽  
Ayman S. Bannaga ◽  
Ramesh P. Arasaradnam ◽  
James A. Covington

This study outlines the use of an electronic nose as a method for the detection of VOCs as biomarkers of bladder cancer. Here, an AlphaMOS FOX 4000 electronic nose was used for the analysis of urine samples from 15 bladder cancer and 41 non-cancerous patients. The FOX 4000 consists of 18 MOS sensors that were used to differentiate the two groups. The results obtained were analysed using the MultiSens Analyzer and RStudio. The results showed a high separation, with sensitivity and specificity of 0.93 and 0.88, respectively, using sparse logistic regression, and 0.93 and 0.76 using a random forest classifier. We conclude that the electronic nose shows potential for discriminating bladder cancer from non-cancer subjects using urine samples.
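Sensitivity and specificity, the two metrics reported above, come straight from the confusion matrix of a binary classifier. A short sketch of their computation follows; the prediction vector below is hypothetical, chosen only so the resulting numbers land near the reported values, and is not the study's actual confusion matrix.

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical labels: 1 = bladder cancer (15 patients), 0 = non-cancer (41).
y_true = [1] * 15 + [0] * 41
# Hypothetical predictions: 14/15 cancers detected, 36/41 controls correct.
y_pred = [1] * 14 + [0] * 1 + [0] * 36 + [1] * 5
sens, spec = sensitivity_specificity(y_true, y_pred)
```

With these hypothetical counts, sensitivity is 14/15 ≈ 0.93 and specificity is 36/41 ≈ 0.88, matching the scale of the sparse-logistic-regression result quoted above.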


2018 ◽  
Vol 45 (9) ◽  
pp. 4112-4124 ◽  
Author(s):  
Hoda Nemat ◽  
Hamid Fehri ◽  
Nasrin Ahmadinejad ◽  
Alejandro F. Frangi ◽  
Ali Gooya

2020 ◽  
Vol 21 (1) ◽  
Author(s):  
Jan Klosa ◽  
Noah Simon ◽  
Pål Olof Westermark ◽  
Volkmar Liebscher ◽  
Dörte Wittenburg

Abstract
Background: Statistical analyses of biological problems in the life sciences often lead to high-dimensional linear models. To solve the corresponding systems of equations, penalization approaches are often the methods of choice. They are especially useful in the case of multicollinearity, which appears if the number of explanatory variables exceeds the number of observations or for some biological reason. Then, the model's goodness of fit is penalized by some suitable function of interest. Prominent examples are the lasso, group lasso and sparse-group lasso. Here, we offer a fast and numerically cheap implementation of these operators via proximal gradient descent. The grid search for the penalty parameter is realized by warm starts. The step size between consecutive iterations is determined with backtracking line search. Finally, seagull, the R package presented here, produces complete regularization paths.
Results: Publicly available high-dimensional methylation data are used to compare seagull to the established R package SGL. The results of both packages enabled a precise prediction of biological age from DNA methylation status. But even though the results of seagull and SGL were very similar (R2 > 0.99), seagull computed the solution in a fraction of the time needed by SGL. Additionally, seagull enables the incorporation of weights for each penalized feature.
Conclusions: The following operators for linear regression models are available in seagull: lasso, group lasso, sparse-group lasso and Integrative LASSO with Penalty Factors (IPF-lasso). Thus, seagull is a convenient envelope of lasso variants.
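The abstract's core recipe — proximal gradient descent with a backtracking line search for the step size — can be sketched for the plain lasso as follows. This is a minimal pure-Python illustration of the scheme, not the seagull package's actual implementation, and all parameter names and values are assumptions for the example.

```python
import random

def lasso_proximal_gradient(X, y, lam=0.1, t0=1.0, beta=0.5, iters=200):
    """Lasso via proximal gradient descent with backtracking line search."""
    n, p = len(X), len(X[0])

    def residual(w):
        return [sum(wj * xij for wj, xij in zip(w, xi)) - yi
                for xi, yi in zip(X, y)]

    def f(w):  # smooth part: (1/2n) * ||Xw - y||^2
        return 0.5 * sum(r * r for r in residual(w)) / n

    def grad(w):
        r = residual(w)
        return [sum(r[i] * X[i][j] for i in range(n)) / n for j in range(p)]

    def prox(v, thr):  # soft-thresholding: proximal operator of lam*||w||_1
        return [max(vj - thr, 0.0) if vj > 0 else min(vj + thr, 0.0)
                for vj in v]

    w = [0.0] * p
    for _ in range(iters):
        g = grad(w)
        t = t0
        while True:
            # Candidate point: gradient step, then proximal step.
            z = prox([wj - t * gj for wj, gj in zip(w, g)], t * lam)
            diff = [zj - wj for zj, wj in zip(z, w)]
            # Accept once f(z) is below its quadratic upper bound at w.
            quad = (f(w) + sum(gj * dj for gj, dj in zip(g, diff))
                    + sum(d * d for d in diff) / (2 * t))
            if f(z) <= quad + 1e-12:
                break
            t *= beta  # backtracking: shrink the step until the bound holds
        w = z
    return w

# Toy problem: y depends only on feature 0, so features 1 and 2 should
# get small (mostly zero) coefficients.
random.seed(1)
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(60)]
y = [2.0 * xi[0] + random.gauss(0, 0.1) for xi in X]
w = lasso_proximal_gradient(X, y, lam=0.1)
```

Warm starts, as described in the abstract, would simply reuse the `w` obtained for one value of `lam` as the initial point for the next (slightly smaller) value along the regularization path.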


2020 ◽  
Vol 14 ◽  
Author(s):  
Dong-Wei Chen ◽  
Rui Miao ◽  
Zhao-Yong Deng ◽  
Yue-Yue Lu ◽  
Yong Liang ◽  
...  
