Two Modified Hybrid Conjugate Gradient Methods Based on a Hybrid Secant Equation

2013, Vol 18 (1), pp. 32-52
Author(s): Saman Babaie-Kafaki, Nezam Mahdavi-Amiri

Taking advantage of the attractive features of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods, we suggest two globally convergent hybridizations of these methods, following Andrei's approach of convexly combining the conjugate gradient parameters and Powell's approach of restricting the conjugate gradient parameters to be nonnegative. In our methods, the hybridization parameter is obtained from a recently proposed hybrid secant equation. Numerical results demonstrating the efficiency of the proposed methods are reported.
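As a rough illustration of the hybridization described above, the sketch below (in Python, with hypothetical function names, not the authors' code) combines the Hestenes–Stiefel and Dai–Yuan parameters convexly and applies Powell's nonnegativity restriction; the hybridization parameter $\theta_k$, which the paper derives from a hybrid secant equation, is treated here simply as an input in $[0, 1]$.

```python
import numpy as np

def hybrid_hs_dy_beta(g_new, g_old, d_old, theta):
    """Convex combination of the Hestenes-Stiefel and Dai-Yuan parameters.

    g_new, g_old : gradients at the new and previous iterates
    d_old        : previous search direction
    theta        : hybridization parameter in [0, 1]; in the paper it is
                   computed from a hybrid secant equation, here it is an input
    """
    y = g_new - g_old                      # gradient difference y_k
    denom = d_old @ y                      # common HS/DY denominator d_k^T y_k
    if abs(denom) < 1e-12:                 # guard against division by zero
        return 0.0
    beta_hs = (g_new @ y) / denom          # Hestenes-Stiefel parameter
    beta_dy = (g_new @ g_new) / denom      # Dai-Yuan parameter
    beta = (1.0 - theta) * beta_hs + theta * beta_dy
    return max(beta, 0.0)                  # Powell's nonnegativity restriction

def next_direction(g_new, g_old, d_old, theta):
    # d_{k+1} = -g_{k+1} + beta_k * d_k
    return -g_new + hybrid_hs_dy_beta(g_new, g_old, d_old, theta) * d_old
```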

2014, Vol 11 (04), pp. 1350092
Author(s): Saman Babaie-Kafaki

In an attempt to find a reasonable solution to an open problem propounded by Andrei in nonlinear conjugate gradient methods, an adaptive conjugacy condition is proposed. The suggested condition is designed based on an implicit switch from a conjugacy condition to the standard secant equation, using an extended conjugacy condition proposed by Dai and Liao. Following the approach of Dai and Liao, two adaptive nonlinear conjugate gradient methods are proposed based on the suggested adaptive conjugacy condition. An interesting feature of one of the proposed methods is the adaptive switch between the nonlinear conjugate gradient methods proposed by Hestenes and Stiefel and by Perry. Under proper conditions, it is shown that one of the proposed methods is globally convergent for uniformly convex functions and the other is globally convergent for general functions. Numerical results demonstrating the effectiveness of the proposed adaptive approach, in the sense of the performance profile introduced by Dolan and Moré, are reported.
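For context, Dai and Liao's extended conjugacy condition leads to the parameter $\beta_k^{DL} = (g_{k+1}^T y_k - t\, g_{k+1}^T s_k)/(d_k^T y_k)$. The sketch below (hypothetical names, not the paper's code) implements this formula with the scalar $t$ taken as an input, whereas the adaptive methods of the paper choose it from the proposed adaptive conjugacy condition; setting $t = 0$ recovers the Hestenes–Stiefel parameter and $t = 1$ recovers Perry's.

```python
import numpy as np

def dai_liao_beta(g_new, s, y, d_old, t):
    """Dai-Liao conjugate gradient parameter from the extended
    conjugacy condition d_{k+1}^T y_k = -t * g_{k+1}^T s_k.

    s : step x_{k+1} - x_k
    y : gradient difference g_{k+1} - g_k
    t : nonnegative scalar; chosen adaptively in the paper, an input here
    """
    denom = d_old @ y
    if abs(denom) < 1e-12:                 # guard against division by zero
        return 0.0
    return (g_new @ y - t * (g_new @ s)) / denom

def dai_liao_direction(g_new, s, y, d_old, t):
    # d_{k+1} = -g_{k+1} + beta_k^{DL} * d_k
    return -g_new + dai_liao_beta(g_new, s, y, d_old, t) * d_old
```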


2020, Vol 151, pp. 354-366
Author(s): Shengwei Yao, Qinliang Feng, Lue Li, Jieqiong Xu

2014, Vol 2014, pp. 1-14
Author(s): San-Yang Liu, Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition $g_k^T d_k \le -\left(1 - \frac{1}{4\theta_k}\right)\|g_k\|^2$ (with $\theta_k > 1/4$) and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
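A minimal check of the stated descent condition, written in Python with hypothetical names, is sketched below; it simply verifies $g_k^T d_k \le -(1 - \frac{1}{4\theta_k})\|g_k\|^2$ for a given $\theta_k > 1/4$.

```python
import numpy as np

def satisfies_guaranteed_descent(g, d, theta, tol=1e-12):
    """Check the sufficient descent condition
        g_k^T d_k <= -(1 - 1/(4*theta_k)) * ||g_k||^2,   theta_k > 1/4.
    """
    if theta <= 0.25:
        raise ValueError("theta must exceed 1/4")
    bound = -(1.0 - 1.0 / (4.0 * theta)) * (g @ g)   # right-hand side
    return (g @ d) <= bound + tol                     # small tolerance for rounding
```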


Author(s): Yutao Zheng

In this paper, a new family of Dai-Liao-type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl., 28: 203--225, 2004] is considered in Dai and Liao's conjugacy condition. Under certain assumptions, we show that our methods are globally convergent for general functions with the strong Wolfe line search. Numerical results illustrate that our proposed methods can outperform some existing ones.
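The sketch below (hypothetical names, assuming $u_k = s_k$, not the authors' code) illustrates how the modified secant equation used by Yabe and Takano, $z_k = y_k + \frac{\rho\,\vartheta_k}{s_k^T u_k} u_k$ with $\vartheta_k = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s_k$, can replace $y_k$ in the Dai-Liao parameter; the precise family studied in the paper may differ in details.

```python
import numpy as np

def yabe_takano_z(s, y, f_old, f_new, g_old, g_new, rho=1.0, u=None):
    """Modified gradient difference z_k from the modified secant equation
    B_{k+1} s_k = z_k, where
        z_k = y_k + (rho * vartheta_k / (s_k^T u_k)) * u_k,
        vartheta_k = 6 (f_k - f_{k+1}) + 3 (g_k + g_{k+1})^T s_k.
    By default u_k = s_k; rho >= 0 is a parameter (rho = 0 gives z_k = y_k).
    """
    if u is None:
        u = s
    su = s @ u
    if abs(su) < 1e-12:                    # guard against division by zero
        return y.copy()
    vartheta = 6.0 * (f_old - f_new) + 3.0 * ((g_old + g_new) @ s)
    return y + (rho * vartheta / su) * u

def modified_dai_liao_beta(g_new, s, z, d_old, t):
    # Dai-Liao parameter with y_k replaced by the modified vector z_k
    denom = d_old @ z
    if abs(denom) < 1e-12:
        return 0.0
    return (g_new @ z - t * (g_new @ s)) / denom
```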

