Rates of superlinear convergence for classical quasi-Newton methods

Author(s):  
Anton Rodomanov ◽  
Yurii Nesterov

Abstract

We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that asymptotically these methods converge superlinearly, the corresponding rates of convergence have remained unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods, which are based on the updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain rates of the form $\left(\frac{n L^2}{\mu^2 k}\right)^{k/2}$ and $\left(\frac{n L}{\mu k}\right)^{k/2}$, respectively, where $k$ is the iteration counter, $n$ is the dimension of the problem, $\mu$ is the strong convexity parameter, and $L$ is the Lipschitz constant of the gradient.
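To make the quantities in the abstract concrete, the following is a minimal numerical sketch, not taken from the paper: a classical BFGS iteration with unit step size applied to a strongly convex quadratic $f(x) = \tfrac12 x^\top A x$. The dimension, test matrix, and starting point are hypothetical choices made only for illustration; the printed ratio of consecutive gradient norms tending to zero is the superlinear behaviour that the paper's rates quantify.

```python
import numpy as np

# Minimal sketch (not from the paper): classical BFGS with unit step size on a
# strongly convex quadratic f(x) = 0.5 * x^T A x.  The dimension, spectrum and
# starting point below are hypothetical choices for illustration only.
n = 10
rng = np.random.default_rng(0)
Q = np.linalg.qr(rng.standard_normal((n, n)))[0]   # random orthogonal basis
eigs = np.linspace(1.0, 10.0, n)                   # so mu = 1 and L = 10
A = Q @ np.diag(eigs) @ Q.T
mu, L = eigs[0], eigs[-1]

grad = lambda x: A @ x                             # gradient of the quadratic
x = rng.standard_normal(n)
H = np.eye(n) / L                                  # initial inverse-Hessian estimate

prev = np.linalg.norm(grad(x))
for k in range(1, 21):
    g = grad(x)
    if np.linalg.norm(g) < 1e-12:                  # solved to high accuracy
        break
    x_new = x - H @ g                              # quasi-Newton step with unit step size
    s, y = x_new - x, grad(x_new) - g
    rho = 1.0 / (y @ s)                            # y^T s > 0 for a convex quadratic
    V = np.eye(n) - rho * np.outer(s, y)
    H = V @ H @ V.T + rho * np.outer(s, s)         # standard BFGS inverse update
    x = x_new
    cur = np.linalg.norm(grad(x))
    print(f"k={k:2d}  ||grad|| = {cur:.3e}   ratio to previous = {cur / prev:.3e}")
    prev = cur

# The worst-case BFGS rate (nL/(mu*k))^(k/2) from the abstract drops below 1
# only once k exceeds nL/mu (= 100 here); this easy instance converges much faster.
print("rate expression at k = 200:", (n * L / (mu * 200)) ** (200 / 2))
```

The unit step size and the $H_0 = L^{-1} I$ initialization mirror the classical local-convergence setting; on a harder, less well-conditioned problem the observed decrease would sit closer to the worst-case rate expressions quoted above.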
