High dimensional robust M-estimation: asymptotic variance via approximate message passing

2015 ◽  
Vol 166 (3-4) ◽  
pp. 935-969 ◽  
Author(s):  
David Donoho ◽  
Andrea Montanari


State evolution for approximate message passing with non-separable functions

2019 ◽  
Vol 9 (1) ◽  
pp. 33-79 ◽  
Author(s):  
Raphaël Berthier ◽  
Andrea Montanari ◽  
Phan-Minh Nguyen

Abstract Given a high-dimensional data matrix $\boldsymbol{A}\in\mathbb{R}^{m\times n}$, approximate message passing (AMP) algorithms construct sequences of vectors $\boldsymbol{u}^{t}\in\mathbb{R}^{n}$, $\boldsymbol{v}^{t}\in\mathbb{R}^{m}$, indexed by $t\in\{0,1,2,\dots\}$, by iteratively applying $\boldsymbol{A}$ or $\boldsymbol{A}^{\mathsf{T}}$ together with suitable nonlinear functions that depend on the specific application. Special instances of this approach have been developed, among other applications, for compressed sensing reconstruction, robust regression, Bayesian estimation, low-rank matrix recovery, phase retrieval and community detection in graphs. For certain classes of random matrices $\boldsymbol{A}$, AMP admits an asymptotically exact description in the high-dimensional limit $m,n\to\infty$, known as state evolution. Earlier work established state evolution for separable nonlinearities (under certain regularity conditions), but empirical work has demonstrated several important applications that require non-separable functions. In this paper we generalize state evolution to Lipschitz continuous non-separable nonlinearities, for Gaussian matrices $\boldsymbol{A}$. Our proof makes use of Bolthausen's conditioning technique along with several approximation arguments. In particular, we introduce a modified algorithm (called LoAMP, for Long AMP), which is of independent interest.
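
For concreteness, here is a minimal sketch of one classical separable instance of the scheme described in this abstract: AMP for sparse recovery from $y \approx A x_0$ with a soft-thresholding denoiser, in the Donoho–Maleki–Montanari form. This is an illustration written for this listing, not code from the cited papers; the fixed threshold `theta` is a placeholder for the per-iteration threshold schedule used in practice.

```python
import numpy as np

def soft_threshold(x, theta):
    """Separable denoiser eta(x) = sign(x) * max(|x| - theta, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def amp_sparse(A, y, theta=1.0, n_iter=30):
    """Sketch of the AMP iteration u^{t+1} = eta(u^t + A^T v^t), where v^t is
    the Onsager-corrected residual. In practice theta is tuned per iteration,
    e.g. from the state-evolution noise level tau_t."""
    m, n = A.shape
    delta = m / n                        # aspect ratio m/n
    u = np.zeros(n)                      # estimate u^t in R^n
    v = y.copy()                         # corrected residual v^t in R^m
    for _ in range(n_iter):
        pseudo = u + A.T @ v             # behaves like x_0 plus Gaussian noise
        u_new = soft_threshold(pseudo, theta)
        # Onsager term: (1/delta) * v^{t-1} * <eta'(pseudo)>, where <eta'>
        # is the fraction of coordinates above threshold
        onsager = (v / delta) * np.mean(np.abs(pseudo) > theta)
        v = y - A @ u_new + onsager
        u = u_new
    return u
```

For this separable instance, state evolution reduces to the scalar recursion $\tau_{t+1}^{2} = \sigma^{2} + \delta^{-1}\,\mathbb{E}\big[\big(\eta_{t}(X_{0} + \tau_{t} Z) - X_{0}\big)^{2}\big]$ with $Z \sim \mathsf{N}(0,1)$; the paper's contribution is extending this type of guarantee to non-separable Lipschitz nonlinearities.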


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Xue Yu ◽  
Yifan Sun ◽  
Hai-Jun Zhou

Abstract The high-dimensional linear regression model is the most popular statistical model for high-dimensional data, but it remains challenging to obtain a sparse set of regression coefficients. In this paper we propose a simple heuristic algorithm for constructing sparse high-dimensional linear regression models, adapted from the shortest-solution guided decimation algorithm and referred to as ASSD. The algorithm builds the support of the regression coefficients under the guidance of the shortest least-squares solution of the recursively decimated linear models, and it applies an early-stopping criterion and a second-stage thresholding procedure to refine this support. Our extensive numerical results demonstrate that ASSD outperforms LASSO, adaptive LASSO, vector approximate message passing, and two other representative greedy algorithms in solution accuracy and robustness. ASSD is especially suitable for linear regression problems with the highly correlated measurement matrices encountered in real-world applications.
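
As a rough illustration of the decimation idea described above, the sketch below is reconstructed from the abstract alone (it is not the authors' reference implementation, and the early-stopping and thresholding rules are placeholder choices): repeatedly solve the current least-squares problem, move the largest-magnitude coefficient into the support, refit, and stop when the residual stalls.

```python
import numpy as np

def assd_sketch(X, y, k_max=50, eps=1e-8):
    """Illustrative shortest-solution guided decimation loop (ASSD-style)."""
    n_samples, n_features = X.shape
    support = []
    active = np.ones(n_features, dtype=bool)
    residual = y.copy()
    prev_norm = np.inf
    for _ in range(min(k_max, n_samples)):
        # shortest (minimum-norm) least-squares solution on remaining columns
        beta = np.linalg.lstsq(X[:, active], residual, rcond=None)[0]
        idx = np.flatnonzero(active)[np.argmax(np.abs(beta))]
        support.append(idx)              # decimate: commit this variable
        active[idx] = False
        # refit on the selected support and update the residual
        beta_s = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
        norm = np.linalg.norm(y - X[:, support] @ beta_s)
        if norm < eps or norm > 0.99 * prev_norm:   # crude early stopping
            break
        prev_norm, residual = norm, y - X[:, support] @ beta_s
    coef = np.zeros(n_features)
    coef[support] = beta_s
    # second-stage thresholding: prune negligible coefficients from the support
    coef[np.abs(coef) < 1e-3 * max(np.abs(coef).max(), eps)] = 0.0
    return coef
```

The minimum-norm least-squares solution plays the role of the "shortest solution" that guides which variable to decimate next; LASSO-style methods instead shrink all coefficients simultaneously.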


2020 ◽  
Vol 17 (8) ◽  
pp. 187-198
Author(s):  
Chao Li ◽  
Ting Jiang ◽  
Sheng Wu

IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 4807-4815 ◽  
Author(s):  
Xiangming Meng ◽  
Jiang Zhu
