Superlinear Convergence
Recently Published Documents


TOTAL DOCUMENTS: 174 (last five years: 16)
H-INDEX: 24 (last five years: 2)

Author(s): Owe Axelsson, János Karátson

Abstract: The paper is devoted to Krylov-type modifications of the Uzawa method at the operator level for the Stokes problem, in order to accelerate convergence. First, block preconditioners and their effect on convergence are studied. It is then shown that a Krylov–Uzawa iteration produces superlinear convergence on smooth domains, and an estimate of its speed is given.
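As a rough sketch of the idea (a toy illustration, not the paper's operator-level analysis), the Python snippet below contrasts classical Uzawa with a Krylov-accelerated variant that applies conjugate gradients to the Schur complement of a small random saddle-point system; the matrices, sizes, and step size omega are all assumptions made for the demonstration.

```python
# Toy saddle-point system  [A  B^T][u]   [f]
#                          [B  0  ][p] = [g]
# standing in for a discretized Stokes problem (an assumption for
# illustration; the paper works at the operator level).
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, m = 40, 10
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)              # SPD "velocity" block
B = rng.standard_normal((m, n))          # full-rank "divergence" block
f, g = rng.standard_normal(n), rng.standard_normal(m)
A_inv = np.linalg.inv(A)                 # fine at toy size; a solver in practice

def uzawa(omega=0.5, iters=300):
    # Classical Uzawa: a fixed-step Richardson iteration on the pressure,
    # which converges only linearly.
    p = np.zeros(m)
    for _ in range(iters):
        u = A_inv @ (f - B.T @ p)        # inner velocity solve
        p = p + omega * (B @ u - g)      # pressure update
    return A_inv @ (f - B.T @ p), p

def krylov_uzawa():
    # Krylov acceleration: run CG on the Schur complement S = B A^{-1} B^T,
    # a finite-dimensional analogue of a Krylov-Uzawa iteration.
    S = LinearOperator((m, m), matvec=lambda q: B @ (A_inv @ (B.T @ q)))
    p, _ = cg(S, B @ (A_inv @ f) - g)
    return A_inv @ (f - B.T @ p), p

u1, p1 = uzawa()
u2, p2 = krylov_uzawa()
print(np.linalg.norm(B @ u1 - g), np.linalg.norm(B @ u2 - g))
```

On this toy problem the Krylov variant reaches a far smaller divergence residual in a handful of Schur-complement applications; the abstract quantifies this kind of acceleration, showing it is superlinear on smooth domains.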


Author(s): Anton Rodomanov, Yurii Nesterov

Abstract: We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was established long ago that these methods converge superlinearly in the asymptotic regime, the corresponding rates of convergence have remained unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods based on the updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain rates of the form $\left(\frac{nL^2}{\mu^2 k}\right)^{k/2}$ and $\left(\frac{nL}{\mu k}\right)^{k/2}$, respectively, where $k$ is the iteration counter, $n$ is the dimension of the problem, $\mu$ is the strong convexity parameter, and $L$ is the Lipschitz constant of the gradient.
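To get a feel for these bounds numerically, the short snippet below evaluates both rate expressions in log space for made-up values of $n$, $L$, and $\mu$ (not taken from the paper): each bound starts to decay only once $k$ exceeds $nL^2/\mu^2$ (DFP) or $nL/\mu$ (BFGS), and from then on it shrinks superlinearly.

```python
import math

# Hypothetical constants for illustration (not from the paper):
# dimension n, gradient Lipschitz constant L, strong convexity mu.
n, L, mu = 100, 10.0, 1.0

def log10_dfp_bound(k):
    # log10 of (n L^2 / (mu^2 k))^(k/2), the DFP rate from the abstract
    return (k / 2) * math.log10(n * L**2 / (mu**2 * k))

def log10_bfgs_bound(k):
    # log10 of (n L / (mu k))^(k/2), the BFGS rate from the abstract
    return (k / 2) * math.log10(n * L / (mu * k))

for k in (500, 2_000, 20_000, 40_000):
    print(f"k={k:>6}   DFP bound ~ 1e{log10_dfp_bound(k):+.0f}"
          f"   BFGS bound ~ 1e{log10_bfgs_bound(k):+.0f}")
```

With these sample values the BFGS bound drops below 1 for $k > 1000$ while the DFP bound requires $k > 10{,}000$, reflecting the fact that the BFGS rate expression dominates the DFP one whenever $L/\mu \ge 1$.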


2021, Vol. 31(1), pp. 785–811
Author(s): Anton Rodomanov, Yurii Nesterov

2020, Vol. 186(3), pp. 731–758
Author(s): Ashkan Mohammadi, Boris S. Mordukhovich, M. Ebrahim Sarabi

2020, Vol. 37(4), 2040001
Author(s): Xin-Yuan Zhao, Liang Chen

In this paper, we conduct a convergence rate analysis of the augmented Lagrangian method with the practical relative error criterion designed by Eckstein and Silva [Mathematical Programming, 141, 319–348 (2013)] for convex nonlinear programming problems. We show that, under a mild local error bound condition, this method admits a locally Q-linear rate of convergence. More importantly, we show that the modulus of the convergence rate is inversely proportional to the penalty parameter. That is, asymptotically superlinear convergence is obtained if the penalty parameter increases to infinity, and an arbitrarily fast Q-linear rate of convergence can be guaranteed if the penalty parameter is fixed but sufficiently large. In addition, as a byproduct, we obtain the convergence, as well as the convergence rate, of the distance from the primal sequence to the solution set of the problem.
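A minimal sketch of the mechanism (a plain augmented Lagrangian method on an equality-constrained convex quadratic, assumed here for illustration rather than the Eckstein–Silva relative-error variant the paper analyzes): doubling the penalty parameter at every outer iteration makes the contraction factor of the constraint residual shrink as well, which is the asymptotically superlinear behavior described above.

```python
import numpy as np

# Toy convex program (all data hypothetical):
#   minimize 0.5 x'Qx - q'x  subject to  Ax = b,  with Q positive definite.
rng = np.random.default_rng(1)
n, m = 20, 5
M = rng.standard_normal((n, n))
Q = M @ M.T + np.eye(n)
q = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

lam = np.zeros(m)   # multiplier estimate
c = 1.0             # penalty parameter, increased every outer iteration
prev = None
for k in range(15):
    # Inner step: minimize the augmented Lagrangian
    #   f(x) + lam'(Ax - b) + (c/2)||Ax - b||^2
    # exactly (closed form for a quadratic; an inexact solve with a
    # relative error criterion in the paper's setting).
    x = np.linalg.solve(Q + c * A.T @ A, q - A.T @ lam + c * A.T @ b)
    r = A @ x - b                       # primal residual
    ratio = np.linalg.norm(r) / prev if prev else float("nan")
    print(f"k={k:2d}  c={c:8.0f}  ||Ax-b||={np.linalg.norm(r):.3e}  "
          f"ratio={ratio:.3f}")
    prev = np.linalg.norm(r)
    lam = lam + c * r                   # multiplier update
    c *= 2.0                            # growing penalty -> shrinking ratio
```

The printed contraction ratios decrease from one outer iteration to the next, the numerical signature of the inverse proportionality between the rate modulus and the penalty parameter.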

