Globally Convergent Three-Term Conjugate Gradient Methods that Use Secant Conditions and Generate Descent Search Directions for Unconstrained Optimization

2011, Vol. 153 (3), pp. 733-757
Author(s): Kaori Sugiki, Yasushi Narushima, Hiroshi Yabe
Complexity, 2020, Vol. 2020, pp. 1-13
Author(s): Meixing Liu, Guodong Ma, Jianghua Yin

The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, based on the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search, two new conjugate parameters are devised. Using the strong Wolfe line search to compute the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, both methods are shown to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
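The abstract does not state the two new conjugate parameters, but both are built from the classical CD choice β_k = ‖g_k‖² / (−d_{k−1}ᵀg_{k−1}). As a sketch of the underlying scheme only, here is the classical CD method paired with a simple strong Wolfe line search; the bisection line search and the quadratic test problem below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=60):
    """Illustrative bisection-style strong Wolfe line search (not robust)."""
    phi0, dphi0 = f(x), grad(x) @ d
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        phi, dphi = f(x + alpha * d), grad(x + alpha * d) @ d
        if phi > phi0 + c1 * alpha * dphi0:   # Armijo (sufficient decrease) fails
            hi = alpha
        elif abs(dphi) > c2 * abs(dphi0):     # strong curvature condition fails
            if dphi > 0:
                hi = alpha                    # overshot the minimizer along d
            else:
                lo = alpha                    # step still too short
        else:
            return alpha                      # both strong Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha

def cg_cd(f, grad, x0, tol=1e-6, max_iter=500):
    """Conjugate gradient with the conjugate descent (CD) parameter."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (-(d @ g))   # CD parameter: ||g_k||^2 / (-d^T g)
        d = -g_new + beta * d
        g = g_new
    return x
```

On a strictly convex quadratic f(x) = ½xᵀAx − bᵀx the iterates converge to the unique minimizer A⁻¹b, which gives a quick sanity check of the scheme.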


2020, Vol. 151, pp. 354-366
Author(s): Shengwei Yao, Qinliang Feng, Lue Li, Jieqiong Xu

2014, Vol. 2014, pp. 1-14
Author(s): San-Yang Liu, Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition g_k^T d_k ≤ -(1 - 1/(4θ_k)) ‖g_k‖², with θ_k > 1/4, and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
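The descent condition above is the bound satisfied by the Hager–Zhang family of conjugate parameters, where β_k = (y_k − θ_k (‖y_k‖²/d_{k−1}ᵀy_k) d_{k−1})ᵀ g_k / (d_{k−1}ᵀy_k) and θ_k = 2 recovers CG_DESCENT's −7/8 bound. Assuming the paper's general form resembles this family, the following sketch checks the bound numerically on random vectors (the condition holds for arbitrary vectors with d_{k−1}ᵀy_k ≠ 0):

```python
import numpy as np

def direction_theta(g_new, g_old, d_old, theta):
    """Hager-Zhang-type direction d = -g_new + beta * d_old, with
    beta = (y - theta * (y.y / d.y) * d_old)^T g_new / (d.y), y = g_new - g_old."""
    y = g_new - g_old
    dy = d_old @ y
    beta = (y - theta * (y @ y) / dy * d_old) @ g_new / dy
    return -g_new + beta * d_old

# Verify g^T d <= -(1 - 1/(4*theta)) * ||g||^2 for several theta > 1/4.
rng = np.random.default_rng(0)
for theta in (0.3, 1.0, 2.0):             # theta = 2 gives the -7/8 bound
    bound = -(1.0 - 1.0 / (4.0 * theta))
    for _ in range(1000):
        g_new, g_old, d_old = rng.standard_normal((3, 5))
        d = direction_theta(g_new, g_old, d_old, theta)
        gd, gg = g_new @ d, g_new @ g_new
        # relative tolerance absorbs rounding when d.y is small
        assert gd <= bound * gg + 1e-8 * (1.0 + abs(gd))
print("descent condition holds for all trials")
```

The inequality follows from Young's inequality aᵀb ≤ θ‖a‖² + ‖b‖²/(4θ) applied to the cross term, which is why it needs no assumption beyond θ_k > 1/4.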

