A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization

2013 · Vol 239 · pp. 396-405
Author(s): Ioannis E. Livieris, Panagiotis Pintelas

Author(s): Yutao Zheng

In this paper, a new family of Dai-Liao-type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl., 28: 203-225, 2004] is incorporated into Dai and Liao's conjugacy condition. Under certain assumptions, we show that the methods are globally convergent for general functions with the strong Wolfe line search. Numerical results illustrate that the proposed methods can outperform some existing ones.
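To make the construction concrete, the following is a minimal sketch of the classical Dai-Liao update that the abstract's methods build on, not the paper's modified-secant variant: beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k). The function names, the Armijo backtracking (standing in for a full strong Wolfe search), and the quadratic test problem are all illustrative assumptions.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Sketch of a Dai-Liao conjugate gradient iteration (illustrative only).

    Direction update: d_{k+1} = -g_{k+1} + beta_k d_k, with
    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    A simple Armijo backtracking replaces the strong Wolfe line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: halve alpha until sufficient decrease holds
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        beta = (g_new @ y - t * (g_new @ s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:        # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Convex quadratic test problem with known minimizer b = (1, 2)
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * (x - b) @ A @ (x - b)
grad = lambda x: A @ (x - b)
x_star = dai_liao_cg(f, grad, np.zeros(2))
```

The parameter t is the Dai-Liao weight on the g^T s term; the modified secant equation of Yabe and Takano replaces y_k with a corrected vector that also uses function values, which the sketch omits.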


2014 · Vol 2014 · pp. 1-14
Author(s): San-Yang Liu, Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfy the descent condition g_k^T d_k ≤ -(1 - 1/(4θ_k)) ‖g_k‖², with θ_k > 1/4, and which are strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
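A well-known concrete instance of such a guaranteed-descent family is the Hager-Zhang-type formula with free parameter θ; it satisfies the bound above for any vectors with d^T y ≠ 0, independently of the line search. The sketch below numerically checks that bound on random data (an illustrative instance assumed here, not necessarily the paper's general form).

```python
import numpy as np

def hz_beta(g_new, d, y, theta):
    """Hager-Zhang-type beta with parameter theta (theta > 1/4):
    beta = (y - theta * (||y||^2 / d^T y) * d)^T g_new / (d^T y)."""
    dty = d @ y
    return (y - theta * (y @ y) / dty * d) @ g_new / dty

theta = 2.0            # any theta > 1/4 should satisfy the descent bound
bound_ok = True
rng = np.random.default_rng(0)
for _ in range(200):
    g_new, d, y = rng.normal(size=(3, 5))
    if abs(d @ y) < 0.1:
        continue       # formula is ill-conditioned when d^T y ~ 0
    d_new = -g_new + hz_beta(g_new, d, y, theta) * d
    # descent condition: g^T d <= -(1 - 1/(4*theta)) * ||g||^2
    bound_ok &= bool(g_new @ d_new
                     <= -(1 - 1/(4*theta)) * (g_new @ g_new) + 1e-7)
```

With θ = 2 the bound gives g^T d ≤ -(7/8)‖g‖², the constant used in Hager and Zhang's CG_DESCENT; the inequality follows algebraically from u^T v ≤ (‖u‖² + ‖v‖²)/2 and so holds for arbitrary d and y.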

