Asymptotic Statistical Analysis of Sparse Group LASSO via Approximate Message Passing
2021, pp. 510-526
Authors: Kan Chen, Zhiqi Bu, Shiyun Xu

2020, Vol. 17(8), pp. 187-198
Authors: Chao Li, Ting Jiang, Sheng Wu

2017, Vol. 19(8), pp. 1798-1810
Authors: Yun Zhou, Jianghong Han, Xiaohui Yuan, Zhenchun Wei, Richang Hong

2021
Authors: Changkun Han, Wei Lu, Pengxin Wang, Liuyang Song, Huaqing Wang

2017, Vol. 16(06), pp. 1707-1727
Authors: Morteza Mashayekhi, Robin Gras

Decision trees are examples of easily interpretable models, but their predictive accuracy is normally low. In comparison, decision tree ensembles (DTEs) such as random forest (RF) exhibit high predictive accuracy while being regarded as black-box models. We propose three new algorithms for extracting rules from DTEs. The RF+DHC method, a hill-climbing method with downhill moves (DHC), searches for a rule set that dramatically decreases the number of rules. In the RF+SGL and RF+MSGL methods, the sparse group lasso (SGL) method and the multiclass SGL (MSGL) method, respectively, are employed to find a sparse weight vector over the rules generated by RF. Experimental results on 24 data sets show that the proposed methods outperform similar state-of-the-art methods in terms of human comprehensibility, greatly reducing the number of rules and limiting the number of antecedents in the retained rules while preserving the same level of accuracy.
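To make the RF+SGL idea concrete, below is a minimal sketch of how a sparse group lasso penalty can select a small weighted subset of rules. Everything here is illustrative rather than the paper's implementation: the binary rule-activation matrix X, the grouping of rules by the tree that produced them, the squared loss, and the regularization weights lam1/lam2 are all assumptions. The objective solved by proximal gradient descent is the standard SGL one, (1/2n)||Xw - y||^2 + lam1 * ||w||_1 + lam2 * sum_g sqrt(p_g) * ||w_g||_2.

import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sgl_prox(w, groups, lam1, lam2, step):
    # Proximal operator of the SGL penalty: first shrink every coordinate
    # (lasso part), then shrink each group's norm (group lasso part).
    # Zeroed coordinates drop single rules; zeroed groups drop whole trees.
    w = soft_threshold(w, step * lam1)
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > 0.0:
            w[g] *= max(0.0, 1.0 - step * lam2 * np.sqrt(len(g)) / norm)
    return w

def sgl_rule_weights(X, y, groups, lam1=0.05, lam2=0.05, n_iter=1000):
    # Proximal gradient descent on (1/2n)||Xw - y||^2 + SGL penalty.
    # X: n x p rule-activation matrix (X[i, j] = 1 iff rule j fires on
    # example i); y: labels in {-1, +1}; groups: one index array per
    # source tree (a hypothetical grouping, not taken from the paper).
    n, p = X.shape
    w = np.zeros(p)
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1/L for the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = sgl_prox(w - step * grad, groups, lam1, lam2, step)
    return w

# Toy usage: 6 rules from 2 trees, 8 examples.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(8, 6)).astype(float)
y = np.where(X[:, 0] + X[:, 3] > 0, 1.0, -1.0)
groups = [np.arange(0, 3), np.arange(3, 6)]  # rules grouped by source tree
w = sgl_rule_weights(X, y, groups)
print("kept rules:", np.flatnonzero(np.abs(w) > 1e-8))

Rules whose weight survives both shrinkage stages form the extracted rule set: lam1 controls how many individual rules are kept within a group, while lam2 controls how many groups (trees) contribute any rules at all.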


IEEE Access, 2019, Vol. 7, pp. 4807-4815
Authors: Xiangming Meng, Jiang Zhu
