MATRIX ANALYSES ON THE DAI–LIAO CONJUGATE GRADIENT METHOD

2019 ◽  
Vol 61 (02) ◽  
pp. 195-203
Author(s):  
Z. AMINIFARD ◽  
S. BABAIE-KAFAKI

Some optimal choices for a parameter of the Dai–Liao conjugate gradient method are proposed by conducting matrix analyses of the method. More precisely, first the $\ell _{1}$ and $\ell _{\infty }$ norm condition numbers of the search direction matrix are minimized, yielding two adaptive choices for the Dai–Liao parameter. Then we show that a recent formula for computing this parameter which guarantees the descent property can be considered as a minimizer of the spectral condition number, as well as of the well-known measure function, for a symmetrized version of the search direction matrix. Brief convergence analyses are also carried out. Finally, numerical experiments are conducted on a set of test problems from the constrained and unconstrained testing environment (CUTE) collection, using a well-known performance profile.
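
For context, the Dai–Liao search direction has the well-known form $d_{k+1}=-g_{k+1}+\beta _{k}^{DL}d_{k}$ with $\beta _{k}^{DL}=(g_{k+1}^{T}y_{k}-t\,g_{k+1}^{T}s_{k})/(d_{k}^{T}y_{k})$, where $s_{k}=x_{k+1}-x_{k}$, $y_{k}=g_{k+1}-g_{k}$ and $t\geq 0$ is the Dai–Liao parameter analysed above. Below is a minimal NumPy sketch of one direction update; the adaptive choices of $t$ derived in the paper are not reproduced, so $t$ is left as an input.

```python
import numpy as np

def dai_liao_direction(g_new, g_old, d_old, s, t):
    """Dai-Liao direction d_{k+1} = -g_{k+1} + beta_DL * d_k, with
    beta_DL = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k).
    The paper's adaptive formulas for t are not reproduced here;
    t is supplied by the caller."""
    y = g_new - g_old                 # gradient difference y_k
    dy = d_old @ y
    if abs(dy) < 1e-12:               # guard: restart with steepest descent
        return -g_new
    beta = (g_new @ y - t * (g_new @ s)) / dy
    return -g_new + beta * d_old
```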

2011 ◽  
Vol 58-60 ◽  
pp. 943-949
Author(s):  
Wan You Cheng ◽  
Xue Jie Liu

In this paper, on the basis of the recently developed HZ (Hager-Zhang) method [SIAM J. Optim., 16 (2005), pp. 170-192], we propose a hybrid descent conjugate gradient method which preserves the sufficient descent property of the HZ method. Under suitable conditions, we prove the global convergence of the proposed method. Extensive numerical experiments show that the method is promising on the test problems from the CUTE library.
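
For reference, the HZ parameter from the cited paper is $\beta _{k}^{HZ}=\big(y_{k}-2d_{k}\|y_{k}\|^{2}/(d_{k}^{T}y_{k})\big)^{T}g_{k+1}/(d_{k}^{T}y_{k})$. A minimal sketch of this base formula follows; the hybridization proposed in the paper is not detailed in the abstract and is not reproduced here.

```python
import numpy as np

def hager_zhang_beta(g_new, g_old, d_old):
    """Original Hager-Zhang CG parameter (SIAM J. Optim., 16 (2005)):
    beta_HZ = (y - 2*d*||y||^2/(d^T y))^T g_new / (d^T y).
    Only the base HZ formula is sketched; the paper's hybrid
    modification is not specified in the abstract."""
    y = g_new - g_old                 # y_k = g_{k+1} - g_k
    dy = d_old @ y
    if abs(dy) < 1e-12:               # guard against division by zero
        return 0.0                    # fall back to steepest descent
    return ((y - 2.0 * d_old * (y @ y) / dy) @ g_new) / dy
```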


2020 ◽  
Vol 25 (1) ◽  
pp. 128
Author(s):  
SHAHER QAHTAN HUSSEIN ◽  
GHASSAN EZZULDDIN ARIF ◽  
YOKSAL ABDLL SATTAR

In this paper, we derive a new search direction for the conjugate gradient method associated with the Dai–Liao method. The new algorithm is shown to converge under certain hypotheses, and we also prove the descent property of the new method. Numerical results show that the proposed method is effective in comparison with the FR, HS and DY methods. http://dx.doi.org/10.25130/tjps.25.2020.019
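
The abstract does not state the new search direction itself, but the three baseline parameters it compares against are standard. The sketch below, illustrative only, computes them (it assumes $d_{k}^{T}y_{k}\neq 0$):

```python
import numpy as np

def classical_betas(g_new, g_old, d_old):
    """Baseline CG parameters named in the abstract:
      FR: ||g_{k+1}||^2 / ||g_k||^2          (Fletcher-Reeves)
      HS: g_{k+1}^T y_k / (d_k^T y_k)        (Hestenes-Stiefel)
      DY: ||g_{k+1}||^2 / (d_k^T y_k)        (Dai-Yuan)
    Assumes d_k^T y_k != 0."""
    y = g_new - g_old
    dy = d_old @ y
    return {
        "FR": (g_new @ g_new) / (g_old @ g_old),
        "HS": (g_new @ y) / dy,
        "DY": (g_new @ g_new) / dy,
    }
```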


2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence under an exact line search, this is not guaranteed under an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence under the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is checked using 75 standard test functions. It is known that three-term conjugate gradient methods are numerically more efficient than two-term conjugate gradient methods; importantly, this paper quantifies how much better three-term methods perform than two-term methods. Thus, in the numerical results, we compare our new modification with an efficient two-term conjugate gradient method, and also with a state-of-the-art three-term HS method. Finally, we conclude that our proposed modification is globally convergent and numerically efficient.
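
The abstract does not give the modified direction, but a classical three-term HS variant shows how sufficient descent can hold independently of the line search: with $d_{k+1}=-g_{k+1}+\beta _{k}^{HS}d_{k}-\theta _{k}y_{k}$ and $\theta _{k}=g_{k+1}^{T}d_{k}/(d_{k}^{T}y_{k})$, the last two terms cancel in $g_{k+1}^{T}d_{k+1}$, giving $g_{k+1}^{T}d_{k+1}=-\|g_{k+1}\|^{2}$ for any step size. The sketch below implements this classical variant, not the paper's modification.

```python
import numpy as np

def three_term_hs_direction(g_new, g_old, d_old):
    """A classical three-term Hestenes-Stiefel direction (not the
    paper's modification, whose details the abstract omits):
      d_{k+1} = -g_{k+1} + beta_HS * d_k - theta * y_k,
      beta_HS = g_{k+1}^T y_k / (d_k^T y_k),
      theta   = g_{k+1}^T d_k / (d_k^T y_k).
    The beta and theta terms cancel in g_{k+1}^T d_{k+1}, so
    g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds for any line search."""
    y = g_new - g_old
    dy = d_old @ y
    if abs(dy) < 1e-12:               # restart when curvature degenerates
        return -g_new
    beta = (g_new @ y) / dy
    theta = (g_new @ d_old) / dy
    return -g_new + beta * d_old - theta * y
```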

