Handling nonpositive curvature in a limited memory steepest descent method

2015 ◽ Vol 36 (2) ◽ pp. 717-742
Author(s): Frank E. Curtis, Wei Guo

Author(s): Ran Gu, Qiang Du

Abstract: How to choose the step size in the gradient descent method has been a popular subject of research. In this paper we propose a modified limited memory steepest descent method (MLMSD). At each iteration, a selection rule picks a unique step size from a candidate set computed by Fletcher's limited memory steepest descent method (LMSD), instead of sweeping through all the step sizes as in Fletcher's original LMSD algorithm. MLMSD is motivated by an inexact super-linear convergence rate analysis. We prove that MLMSD converges R-linearly for strictly convex quadratic minimization problems. Numerical tests show that the algorithm is efficient and robust.
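The abstract leaves the exact candidate set and selection rule to the paper itself. As a rough illustration of the ingredients in the strictly convex quadratic setting of the convergence analysis, the sketch below computes candidate step sizes as reciprocals of Ritz values of the Hessian, obtained from the m most recent gradients. As a stand-in for the paper's selection rule (not specified in the abstract), it simply takes the most conservative candidate, 1/max θ, which keeps the iteration monotone; the function name and parameter choices are illustrative, not from the paper.

```python
import numpy as np

def lmsd_quadratic(A, b, x0, m=5, tol=1e-8, max_iter=500):
    """Sketch of a limited-memory steepest descent iteration on
    f(x) = 0.5 x^T A x - b^T x with A symmetric positive definite.

    Ritz values of A, computed from the m most recent gradients,
    supply a candidate set of step sizes 1/theta. As a stand-in for
    the MLMSD selection rule, we take the most conservative
    candidate, 1/max(theta); the actual rule is in the paper."""
    x = x0.copy()
    grads = []
    for k in range(max_iter):
        g = A @ x - b                                # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        grads.append(g)
        grads = grads[-m:]                           # keep at most m back gradients
        Q, _ = np.linalg.qr(np.column_stack(grads))  # orthonormal basis of their span
        theta = np.linalg.eigvalsh(Q.T @ A @ Q)      # Ritz values of A on that span
        alpha = 1.0 / theta.max()                    # one candidate picked per iteration
        x -= alpha * g
    return x, k

# Small usage example on a random well-conditioned SPD system.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x, iters = lmsd_quadratic(A, b, np.zeros(50))
print(iters, np.linalg.norm(A @ x - b))
```

With m = 1 the single Ritz value is the Rayleigh quotient of the current gradient, so this stand-in rule reduces to the exact line-search (Cauchy) step; for m > 1 it takes shorter, safer steps, whereas the paper's rule is designed for the inexact super-linear behavior described above.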


1996 ◽ Vol 3 (3) ◽ pp. 201-209
Author(s): Chinmoy Pal, Ichiro Hagiwara, Naoki Kayaba, Shin Morishita

A theoretical formulation of a fast learning method based on a pseudoinverse technique is presented. The efficiency and robustness of the method are verified on an Exclusive OR problem and on the dynamic system identification of a linear single-degree-of-freedom mass–spring system. Compared with the conventional backpropagation method, the proposed method converges faster and attains higher learning accuracy with a lower equivalent learning coefficient. Moreover, unlike the steepest descent method, whose learning capability depends on the value of the learning coefficient ν, the proposed pseudoinverse-based backpropagation algorithm is comparatively robust with respect to its equivalent variable learning coefficient. A combination of the pseudoinverse method and the steepest descent method is proposed to achieve faster and more accurate learning.
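The abstract does not spell out the pseudoinverse formulation. As a rough sketch of the general idea on the Exclusive OR problem mentioned above: fix a random hidden layer and solve for the output weights in one shot via the Moore-Penrose pseudoinverse, replacing the iterative gradient steps of conventional backpropagation. This is an illustrative stand-in, not the authors' exact algorithm; all names and layer sizes below are assumptions.

```python
import numpy as np

# Sketch: pseudoinverse-based learning on XOR. Hidden activations H are
# computed once with fixed random weights; the output weights then solve
# the linear least-squares problem H W ≈ T via the pseudoinverse, so no
# gradient descent on the output layer is needed. Illustrative only; the
# paper's formulation (and its combination with steepest descent) differs.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W_in = rng.standard_normal((2, 8))     # fixed random input weights (assumed size)
b_in = rng.standard_normal(8)          # fixed random biases
H = np.tanh(X @ W_in + b_in)           # hidden-layer activations, shape (4, 8)

W_out = np.linalg.pinv(H) @ T          # one-shot least-squares solve

pred = H @ W_out
print(np.round(pred, 2).ravel())       # approximately [0, 1, 1, 0]
```

Because the linear solve is exact in one step, there is no learning coefficient to tune for the output layer, which is consistent with the robustness the abstract reports relative to the coefficient-sensitive steepest descent method.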

