A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family

2015, Vol. 30 (4), pp. 843–863
Author(s): M. Reza Peyghami, H. Ahmadzadeh, A. Fazli

2020, Vol. 151, pp. 354–366
Author(s): Shengwei Yao, Qinliang Feng, Lue Li, Jieqiong Xu

CALCOLO, 2018, Vol. 56 (1)
Author(s): Farzad Rahpeymaii, Keyvan Amini, Tofigh Allahviranloo, Mohsen Rostamy Malkhalifeh

2014, Vol. 11 (04), pp. 1350092
Author(s): Saman Babaie-Kafaki

To address an open problem posed by Andrei on nonlinear conjugate gradient methods, an adaptive conjugacy condition is proposed. The condition is designed around an implicit switch from a conjugacy condition to the standard secant equation, using the extended conjugacy condition of Dai and Liao. Following the Dai–Liao approach, two adaptive nonlinear conjugate gradient methods are derived from this condition. An interesting feature of one of them is an adaptive switch between the conjugate gradient methods of Hestenes–Stiefel and of Perry. Under proper conditions, one of the proposed methods is shown to be globally convergent for uniformly convex functions and the other for general functions. Numerical results, assessed with the performance profiles of Dolan and Moré, demonstrate the effectiveness of the proposed adaptive approach.
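For reference, the Dai–Liao extended conjugacy condition underlying this family is standard in the literature; the following is a minimal sketch of it and of the parameter it induces, not a formula quoted from the article (which chooses t adaptively):

\[
  d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k, \qquad
  \beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
  - t\, \frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \qquad t \ge 0,
\]

where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. Setting $t = 0$ recovers the Hestenes–Stiefel parameter and $t = 1$ the Perry parameter, which is the switch the abstract alludes to.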


2013, Vol. 18 (1), pp. 32–52
Author(s): Saman Babaie-Kafaki, Nezam Mahdavi-Amiri

Taking advantage of the attractive features of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods, we suggest two globally convergent hybridizations of these methods, following Andrei's approach of combining the conjugate gradient parameters convexly and Powell's approach of restricting them to nonnegative values. In our methods, the hybridization parameter is obtained from a recently proposed hybrid secant equation. Numerical results demonstrating the efficiency of the proposed methods are reported.
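The convex hybridization mentioned here follows a standard pattern; a sketch under the assumption that $\theta_k \in [0,1]$ denotes the hybridization parameter (the article derives $\theta_k$ from its hybrid secant equation, which is not reproduced here):

\[
  \beta_k^{\mathrm{hyb}} = (1 - \theta_k)\, \beta_k^{\mathrm{HS}}
  + \theta_k\, \beta_k^{\mathrm{DY}}, \qquad
  \beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
  \beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^{2}}{d_k^{\top} y_k},
\]

with Powell's restriction then applied as $\beta_k^{+} = \max\{\beta_k^{\mathrm{hyb}}, 0\}$.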


Complexity, 2020, Vol. 2020, pp. 1–13
Author(s): Meixing Liu, Guodong Ma, Jianghua Yin

The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, two new conjugate parameters are devised on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, the two presented methods are proved to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
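To make the overall scheme concrete, the following is a minimal Python sketch of a conjugate gradient iteration with the classical CD parameter of Fletcher and a strong Wolfe line search (scipy.optimize.line_search enforces the strong Wolfe conditions); the article's two modified parameters are not reproduced here, and cg_cd is an illustrative name, not code from the paper:

import numpy as np
from scipy.optimize import line_search

def cg_cd(f, grad, x0, tol=1e-6, max_iter=1000, c1=1e-4, c2=0.1):
    """Sketch: CG with the classical CD parameter and a strong Wolfe
    line search; not the modified parameters of the article."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step length satisfying the strong Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:               # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Conjugate descent parameter: beta = ||g_{k+1}||^2 / (-d_k^T g_k).
        beta = g_new.dot(g_new) / (-d.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard start point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_cd(f, grad, [-1.2, 1.0]))

Here c2 = 0.1 tightens the second (curvature) inequality of the strong Wolfe conditions, which is the inequality on which the abstract builds its new parameters.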

