Greedy Algorithms for High-Dimensional Eigenvalue Problems

2014 ◽  
Vol 40 (3) ◽  
pp. 387-423 ◽  
Author(s):  
Eric Cancès ◽  
Virginie Ehrlacher ◽  
Tony Lelièvre

2019 ◽  
Vol 19 (1) ◽  
pp. 5-22 ◽  
Author(s):  
Peter Benner ◽  
Akwum Onwunta ◽  
Martin Stoll

Abstract: This paper aims at the efficient numerical solution of stochastic eigenvalue problems. Such problems often lead to prohibitively high-dimensional systems with tensor product structure when discretized with the stochastic Galerkin method. Here, we exploit this inherent tensor product structure to develop a globalized low-rank inexact Newton method with which we tackle the stochastic eigenproblem. We illustrate the effectiveness of our solver with numerical experiments.
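The abstract does not spell out the Newton iteration, so the following is only a rough dense toy sketch of a standard Newton method for an eigenpair with a normalization constraint; the function name, the bordered-Jacobian formulation, and the direct solve (standing in for the inexact iterative solve and the low-rank tensor machinery of the paper) are all assumptions, not the authors' implementation.

```python
import numpy as np

def inexact_newton_eig(A, x0, tol=1e-10, max_iter=100):
    """Toy Newton iteration for A x = lam x with x^T x = 1.

    Sketch only: a real "inexact" variant would solve the Newton system
    approximately with a Krylov method and exploit low-rank structure.
    """
    n = A.shape[0]
    x = x0 / np.linalg.norm(x0)
    lam = x @ A @ x                      # Rayleigh quotient as initial guess
    for _ in range(max_iter):
        r = A @ x - lam * x
        if np.linalg.norm(r) < tol:
            break
        # Newton system for F(x, lam) = (A x - lam x, (x.x - 1)/2):
        # the Jacobian is the bordered matrix [[A - lam I, -x], [x^T, 0]].
        J = np.block([[A - lam * np.eye(n), -x[:, None]],
                      [x[None, :], np.zeros((1, 1))]])
        rhs = -np.concatenate([r, [0.0]])
        step = np.linalg.solve(J, rhs)   # direct solve stands in for CG/GMRES
        x = x + step[:n]
        lam = lam + step[n]
        x /= np.linalg.norm(x)           # keep the iterate on the unit sphere
    return lam, x
```

For a simple eigenvalue the bordered Jacobian is nonsingular even though `A - lam*I` becomes singular at convergence, which is why the constraint row is appended rather than solving with `A - lam*I` alone.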


2021 ◽  
Author(s):  
Xue Yu ◽  
Yifan Sun ◽  
Hai-Jun Zhou

Abstract: The high-dimensional linear regression model is the most popular statistical model for high-dimensional data, but achieving a sparse set of regression coefficients remains quite challenging. In this paper, we propose a simple heuristic algorithm to construct sparse high-dimensional linear regression models, which is adapted from the shortest-solution guided decimation algorithm and is referred to as ASSD. This algorithm constructs the support of the regression coefficients under the guidance of the least-squares solution of the recursively decimated linear equations, and it applies an early-stopping criterion and a second-stage thresholding procedure to refine this support. Our extensive numerical results demonstrate that ASSD outperforms LASSO, vector approximate message passing, and two other representative greedy algorithms in solution accuracy and robustness. ASSD is especially suitable for linear regression problems with highly correlated measurement matrices encountered in real-world applications.
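The support-construction loop described in the abstract can be approximated as follows. This is a hedged sketch assembled from the abstract alone, not the authors' ASSD code: the function name, the specific decimation rule (pick the largest-magnitude entry of the minimum-norm least-squares solution), and the stopping and thresholding parameters are all assumptions.

```python
import numpy as np

def assd_sketch(X, y, max_support=None, tol=1e-6, threshold=1e-3):
    """Sketch of a shortest-solution guided decimation loop for sparse
    regression (assumed reading of the abstract, not the ASSD paper's code)."""
    n, p = X.shape
    if max_support is None:
        max_support = min(n, p)
    support = []
    residual = y.copy()
    for _ in range(max_support):
        # Minimum-norm least-squares solution of the current residual system
        # guides which variable to add next.
        beta = np.linalg.lstsq(X, residual, rcond=None)[0]
        beta[support] = 0.0              # ignore already-selected variables
        j = int(np.argmax(np.abs(beta)))
        support.append(j)
        # Refit on the selected support and decimate the equations.
        Xs = X[:, support]
        coef_s = np.linalg.lstsq(Xs, y, rcond=None)[0]
        residual = y - Xs @ coef_s
        if np.linalg.norm(residual) < tol:   # early-stopping criterion
            break
    # Second-stage thresholding: drop coefficients with tiny magnitude.
    keep = np.abs(coef_s) > threshold
    coef = np.zeros(p)
    coef[np.array(support)[keep]] = coef_s[keep]
    return coef
```

In the noiseless, well-conditioned toy setting below the sketch recovers an exact sparse signal; on the highly correlated measurement matrices the paper targets, the actual algorithm's refinements would matter.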


2013 ◽  
Vol 41 ◽  
pp. 95-131 ◽  
Author(s):  
E. Cancès ◽  
V. Ehrlacher ◽  
T. Lelièvre
