A modified limited memory steepest descent method motivated by an inexact super-linear convergence rate analysis
Abstract
How to choose the step size of the gradient descent method has been a popular subject of research. In this paper we propose a modified limited memory steepest descent method (MLMSD). In each iteration, a selection rule picks a unique step size from a candidate set computed by Fletcher's limited memory steepest descent method (LMSD), instead of sweeping through all the step sizes as in Fletcher's original LMSD algorithm. MLMSD is motivated by an inexact super-linear convergence rate analysis. We prove the R-linear convergence of MLMSD for a strictly convex quadratic minimization problem, and present numerical tests showing that the algorithm is efficient and robust.
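To illustrate the candidate-set idea the abstract refers to, below is a minimal sketch of an LMSD-style sweep for a strictly convex quadratic f(x) = ½xᵀAx − bᵀx. The candidate step sizes are the reciprocals of the Ritz values of A on the span of the last m gradients; this toy version sweeps through all of them (as in Fletcher's original LMSD), whereas the paper's MLMSD would instead select a single step from the set by its selection rule. All names (`lmsd_quadratic`, `m`, `tol`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lmsd_quadratic(A, b, x0, m=3, tol=1e-8, max_iter=500):
    """Toy LMSD-style sweep for f(x) = 0.5 x^T A x - b^T x (A SPD).

    Candidate steps per sweep: 1/theta for each Ritz value theta of A
    restricted to the span of the last m gradients. Illustrative only.
    """
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    grads = [g]
    n_iter = 0
    while np.linalg.norm(g) > tol and n_iter < max_iter:
        G = np.column_stack(grads[-m:])          # last m gradients
        Q, _ = np.linalg.qr(G)                   # orthonormal basis
        theta = np.linalg.eigvalsh(Q.T @ A @ Q)  # Ritz values of A
        for t in sorted(theta, reverse=True):    # sweep: shortest step first
            if np.linalg.norm(g) <= tol:
                break
            x = x - (1.0 / t) * g                # step size 1/theta_i
            g = A @ x - b
            grads.append(g)
            n_iter += 1
    return x, n_iter

# Small SPD example (hypothetical test problem)
A = np.diag([1.0, 5.0, 25.0])
b = np.array([1.0, 1.0, 1.0])
x_star, it = lmsd_quadratic(A, b, np.zeros(3))
```

Once the gradient span fills the whole space, the Ritz values coincide with the eigenvalues of A and the sweep eliminates the corresponding error components exactly, which is the intuition behind the super-linear behavior the analysis exploits.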
2021 ◽ Vol 9 (VI) ◽ pp. 1433-1442