Anderson Acceleration of the Arnoldi-Inout Method for Computing PageRank

Symmetry ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 636
Author(s):  
Xia Tang ◽  
Chun Wen ◽  
Xian-Ming Gu ◽  
Zhao-Li Shen

Anderson(m0) extrapolation is an accelerator for fixed-point iterations: it stores m0+1 prior evaluations of the fixed-point map and computes a new iterate as a linear combination of those evaluations. The computational cost of Anderson(m0) acceleration grows as the parameter m0 increases, so a small value of m0 is the common choice in practice. In this paper, with the aim of improving the computation of PageRank problems, a new method is developed by applying Anderson(1) extrapolation at periodic intervals within the Arnoldi-Inout method. The new method is called the AIOA method. Convergence analysis of the AIOA method is discussed in detail. Numerical results on several PageRank problems are presented to illustrate the effectiveness of the proposed method.
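
As a minimal sketch of the extrapolation described above (not of the AIOA method itself, which embeds the extrapolation in an Arnoldi-Inout process), the following Python code applies Anderson(1) at periodic intervals to the plain PageRank fixed-point map g(x) = alpha*P*x + (1 - alpha)*v; the function name, the period parameter, and the toy link matrix are illustrative assumptions.

```python
import numpy as np

def pagerank_anderson1(P, v, alpha=0.85, period=3, tol=1e-10, max_iter=1000):
    """Fixed-point iteration for the PageRank equation x = alpha*P x + (1-alpha)*v
    (P column-stochastic), with an Anderson(1) extrapolation step applied
    every `period` iterations and plain updates in between."""
    x = v.copy()
    g_prev = f_prev = None
    for k in range(max_iter):
        g = alpha * (P @ x) + (1.0 - alpha) * v          # one evaluation of the fixed-point map
        f = g - x                                        # fixed-point residual
        if np.linalg.norm(f, 1) < tol:
            return g, k
        if f_prev is not None and k % period == 0:
            df = f - f_prev
            gamma = (f @ df) / (df @ df) if df @ df > 0 else 0.0   # 1-D least-squares coefficient
            x = g - gamma * (g - g_prev)                 # Anderson(1) combination of the two latest evaluations
        else:
            x = g                                        # plain fixed-point step
        g_prev, f_prev = g, f
    return x, max_iter

# Tiny 4-page example with a column-stochastic link matrix.
P = np.array([[0.0, 0.0, 1.0, 0.5],
              [1/3, 0.0, 0.0, 0.0],
              [1/3, 0.5, 0.0, 0.5],
              [1/3, 0.5, 0.0, 0.0]])
v = np.full(4, 0.25)
scores, iters = pagerank_anderson1(P, v)
```

Each Anderson(1) step adds only a few vector operations on top of one fixed-point evaluation, which is why keeping m0 small is attractive.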

Geophysics ◽  
2021 ◽  
Vol 86 (1) ◽  
pp. R99-R108
Author(s):  
Yunan Yang

State-of-the-art seismic imaging techniques treat inversion tasks such as full-waveform inversion (FWI) and least-squares reverse time migration (LSRTM) as partial differential equation-constrained optimization problems. Because of the large scale of these problems, gradient-based optimization algorithms are preferred in practice to update the model iteratively. Higher-order methods converge in fewer iterations but often require a higher computational cost per iteration, more line-search steps, and larger memory storage, so a balance among these aspects has to be struck. We evaluate Anderson acceleration (AA), a popular strategy for speeding up the convergence of fixed-point iterations, as a means of accelerating the steepest-descent algorithm, which we treat as a fixed-point iteration. Independent of the dimensionality of the unknown parameters, the extra computational cost of the method reduces to an extremely low-dimensional least-squares problem, and this cost can be reduced further by a low-rank update. We establish the theoretical connections and differences between AA and other well-known optimization methods such as L-BFGS and the restarted generalized minimal residual method, and compare their computational cost and memory requirements. Numerical examples of FWI and LSRTM applied to the Marmousi benchmark demonstrate the acceleration effect of AA. Compared with the steepest-descent method, AA achieves faster convergence and provides results competitive with some quasi-Newton methods, making it an attractive optimization strategy for seismic inversion.
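
To make the fixed-point view of steepest descent concrete, here is a hedged Python sketch (not the authors' seismic implementation; the quadratic test problem, step size, and window length m are assumptions) of windowed Anderson acceleration applied to the map g(x) = x - step*grad(x). The extrapolation coefficients come from a least-squares problem whose size is at most the window length, independent of the number of unknowns.

```python
import numpy as np

def anderson_gd(grad, x0, step, m=5, tol=1e-8, max_iter=500):
    """Windowed Anderson acceleration of steepest descent, viewed as the
    fixed-point iteration x <- g(x) = x - step * grad(x).

    The extrapolation coefficients solve a least-squares problem with at
    most m unknowns, regardless of the dimension of x.
    """
    x = np.asarray(x0, dtype=float)
    G, F = [], []                               # histories of g(x_k) and residuals f_k = g(x_k) - x_k
    for k in range(max_iter):
        g = x - step * grad(x)
        f = g - x
        if np.linalg.norm(f) < tol:
            return g, k
        G.append(g); F.append(f)
        if len(F) > m + 1:                      # keep at most m + 1 past evaluations
            G.pop(0); F.pop(0)
        if len(F) > 1:
            dF = np.column_stack([F[i + 1] - F[i] for i in range(len(F) - 1)])
            dG = np.column_stack([G[i + 1] - G[i] for i in range(len(G) - 1)])
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)   # tiny least-squares problem
            x = g - dG @ gamma                  # Anderson-extrapolated iterate
        else:
            x = g                               # first step: plain steepest descent
    return x, max_iter

# Assumed test problem: an ill-conditioned quadratic with grad(x) = A x - b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 2.0, 3.0])
sol, iters = anderson_gd(lambda x: A @ x - b, np.zeros(3), step=1.0 / 100.0)
```

On this ill-conditioned quadratic, plain gradient descent with the same step size needs on the order of a thousand iterations, while the accelerated iteration typically converges in far fewer.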


2016 ◽  
Vol 32 (3) ◽  
pp. 277-284
Author(s):  
GHEORGHE ARDELEAN ◽  
OVIDIU COSMA ◽  
LASZLO BALOG ◽  
...

Several iterative processes have been defined by researchers to approximate the fixed points of various classes of operators. In this paper we present an empirical comparison of several iteration procedures for approximating the fixed points of Newton's iteration operator, using the basins of attraction of the roots of some complex polynomials. Some numerical results are presented. The Matlab m-files for generating the basins of attraction are also provided.
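
The paper's Matlab m-files are not reproduced here; as a Python/Matplotlib stand-in, the sketch below draws the basins of attraction of the classical Newton iteration for p(z) = z^3 - 1 (the polynomial, grid, and iteration count are assumed for illustration).

```python
import numpy as np
import matplotlib.pyplot as plt

def newton_basins(n=400, max_iter=40):
    """Basins of attraction of Newton's iteration z <- z - p(z)/p'(z)
    for p(z) = z**3 - 1, colored by the root each starting point reaches."""
    roots = np.array([1.0, -0.5 + 0.5j * np.sqrt(3), -0.5 - 0.5j * np.sqrt(3)])
    re, im = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n))
    z = re + 1j * im
    for _ in range(max_iter):
        z = z - (z**3 - 1) / (3 * z**2 + 1e-20)   # tiny shift avoids dividing by zero at z = 0
    basin = np.argmin(np.abs(z[..., None] - roots), axis=-1)   # index of the nearest root
    plt.imshow(basin, extent=(-2, 2, -2, 2), origin="lower")
    plt.title("Basins of attraction of Newton's iteration for z^3 - 1")
    plt.show()

newton_basins()
```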


2014 ◽  
Vol 2014 ◽  
pp. 1-10 ◽  
Author(s):  
Na Huang ◽  
Changfeng Ma

We present a fixed-point iterative method for solving systems of nonlinear equations. The convergence theorem of the proposed method is proved under suitable conditions. In addition, some numerical results are also reported in the paper, which confirm the good theoretical properties of our approach.
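
Since the abstract does not specify the iteration function, the following is only a generic Python sketch of fixed-point iteration on a small contractive nonlinear system; the map g, the starting point, and the tolerance are illustrative assumptions rather than the authors' method.

```python
import numpy as np

def fixed_point(g, x0, tol=1e-10, max_iter=200):
    """Plain fixed-point iteration x <- g(x); stops when successive
    iterates are closer than tol."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        x_new = g(x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

# Contractive toy system: x = 0.5*cos(y), y = 0.5*sin(x) + 0.5.
g = lambda v: np.array([0.5 * np.cos(v[1]), 0.5 * np.sin(v[0]) + 0.5])
sol, iters = fixed_point(g, np.zeros(2))
```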


2015 ◽  
Vol 18 (5) ◽  
pp. 1313-1335 ◽  
Author(s):  
Xiaoqiang Yue ◽  
Shi Shu ◽  
Xiaowen Xu ◽  
Zhiyang Zhou

The paper aims to develop an effective preconditioner and to analyze the convergence of the corresponding preconditioned GMRES for the solution of discrete problems originating from multi-group radiation diffusion equations. We first investigate the performance of the most widely used preconditioners, ILU(k) and AMG, and their combinations, and point out the drawbacks that limit their feasibility. Secondly, we reveal the underlying complementarity of ILU(k) and AMG by analyzing which features make a matrix suitable for AMG, using more detailed measurements of the multiscale nature of the matrices and of the effect of ILU(k) on that multiscale nature. Moreover, we present an adaptive combined preconditioner Bcoα involving an improved ILU(0), along with its convergence constraints. Numerical results demonstrate that Bcoα-GMRES offers the best robustness and efficiency. Finally, we analyze the convergence of GMRES with combined preconditioning, which not only provides persuasive support for the proposed algorithms but also updates the existing estimation theory on condition numbers of combined preconditioned systems.
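
As a small illustration of one ingredient discussed above, the SciPy sketch below preconditions GMRES with SuperLU's incomplete LU factorization on an assumed 2-D Poisson test matrix; the adaptive combined preconditioner Bcoα and the actual radiation-diffusion matrices are not reproduced.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Assumed test matrix: 2-D Poisson stencil standing in for a discrete diffusion operator.
n = 50
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
A = (sp.kron(sp.eye(n), T) + sp.kron(T, sp.eye(n))).tocsc()
b = np.ones(A.shape[0])

# Incomplete LU factorization (SuperLU's ILUTP), used here as a stand-in
# for the ILU(k) preconditioners discussed in the abstract.
ilu = spla.spilu(A)
M = spla.LinearOperator(A.shape, ilu.solve)

# Restarted, preconditioned GMRES; info == 0 signals convergence.
x, info = spla.gmres(A, b, M=M, restart=30)
print("converged:", info == 0, "residual:", np.linalg.norm(b - A @ x))
```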


IEEE Access ◽  
2021 ◽  
Vol 9 ◽  
pp. 18383-18392
Author(s):  
Younghan Jeon ◽  
Minsik Lee ◽  
Jin Young Choi
