An efficient Dai–Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation

2020 ◽  
Vol 371 ◽  
pp. 112708
Author(s):  
Mina Lotfi ◽  
S. Mohammad Hosseini


2019 ◽
Vol 61 (02) ◽  
pp. 195-203
Author(s):  
Z. AMINIFARD ◽  
S. BABAIE-KAFAKI

Some optimal choices for a parameter of the Dai–Liao conjugate gradient method are proposed by conducting matrix analyses of the method. More precisely, first the $\ell_1$ and $\ell_\infty$ norm condition numbers of the search direction matrix are minimized, yielding two adaptive choices for the Dai–Liao parameter. Then we show that a recent formula for computing this parameter, which guarantees the descent property, can be considered a minimizer of the spectral condition number as well as of the well-known measure function for a symmetrized version of the search direction matrix. Brief convergence analyses are also carried out. Finally, some numerical experiments on a set of test problems from the constrained and unconstrained testing environment are conducted, using a well-known performance profile.
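For orientation, the abstract does not reproduce the underlying formulas; the following is the standard Dai–Liao setup on which such matrix analyses operate (with $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and parameter $t > 0$):

```latex
% Dai-Liao direction (Dai & Liao, 2001)
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k, \qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}
                        - t\,\frac{g_{k+1}^{T} s_k}{d_k^{T} y_k}.
% Equivalently, d_{k+1} = -A_{k+1} g_{k+1} with the search direction matrix
A_{k+1} = I - \frac{s_k y_k^{T}}{s_k^{T} y_k}
            + t\,\frac{s_k s_k^{T}}{s_k^{T} y_k},
% whose condition numbers are minimized with respect to t.
```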


2020 ◽  
Vol 54 (4) ◽  
pp. 981-991
Author(s):  
Zohre Aminifard ◽  
Saman Babaie-Kafaki

As is known, finding an effective restart procedure for conjugate gradient methods is regarded as an open problem. Here, we study this problem for the Dai–Liao conjugate gradient method. In this context, based on a singular value analysis of the Dai–Liao search direction matrix, it is shown that when the gradient approximately lies along the direction of maximum magnification by the matrix, the method may suffer from computational errors and may converge slowly. In such a situation, discarding the Dai–Liao search direction and performing a restart may enhance numerical stability and accelerate convergence. Numerical results are reported; they demonstrate the effectiveness of the suggested restart procedure in the sense of the Dolan–Moré performance profile.
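As an illustration of this idea, here is a minimal sketch of an SVD-based restart safeguard; the function name, the alignment test via the leading right singular vector, and the threshold tau are assumptions for illustration, not the authors' exact criterion:

```python
import numpy as np

def dl_direction_or_restart(g_new, s, y, t, tau=0.99):
    """Dai-Liao direction with an SVD-based restart safeguard (illustrative sketch).

    g_new : gradient at the new iterate
    s, y  : s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k
    t     : Dai-Liao parameter (t > 0)
    tau   : alignment threshold (an assumption, not from the paper)
    """
    n = g_new.size
    sy = s @ y
    # Dai-Liao search direction matrix: A = I - s y^T/(s^T y) + t s s^T/(s^T y)
    A = np.eye(n) - np.outer(s, y) / sy + t * np.outer(s, s) / sy
    # Direction of maximum magnification = right singular vector
    # associated with the largest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    v_max = Vt[0]
    cos_angle = abs(g_new @ v_max) / np.linalg.norm(g_new)
    if cos_angle > tau:
        # Gradient nearly lies along the maximum-magnification direction:
        # ignore the Dai-Liao direction and restart with steepest descent.
        return -g_new
    return -A @ g_new
```

For large problems one would not form A and its full SVD at every iteration; since A is the identity plus a rank-two update, its extreme singular values can in principle be obtained far more cheaply, which is what makes a singular value analysis of this matrix attractive.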


2020 ◽  
Vol 25 (1) ◽  
pp. 128
Author(s):  
Shaher Qahtan Hussein ◽ 
Ghassan Ezzulddin Arif ◽ 
Yoksal Abdll Sattar

In this paper we derive a new search direction for the conjugate gradient method associated with the Dai–Liao method. The new algorithm is shown to converge under certain hypotheses. We also prove the descent property for the new method. Numerical results show that the proposed method is effective in comparison with the FR, HS, and DY methods. http://dx.doi.org/10.25130/tjps.25.2020.019
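For reference, the baselines mentioned are the classical conjugate gradient parameters, whose standard formulas (with $y_k = g_{k+1} - g_k$) are:

```latex
\beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^{2}}{\|g_k\|^{2}}, \qquad
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad
\beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^{2}}{d_k^{T} y_k}.
```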


Author(s):  
Azwar Riza Habibi ◽  
Vivi Aida Fitria ◽  
Lukman Hakim

This paper develops a neural network (NN) trained with the conjugate gradient (CG) method. The modification lies in how the linear search direction is defined. The conjugate gradient method offers several formulas for determining the step size, such as the Fletcher–Reeves, Dixon, Polak–Ribière, Hestenes–Stiefel, and Dai–Yuan methods, applied here to discrete electrocardiogram data. Conjugate gradients are used to update the learning rates of the neural network with different step sizes, while the gradient search direction is used to update the weights of the NN. The results show that Polak–Ribière attains an optimal error, but its weight-search direction on the NN widens, so NN training takes more epochs. The Hestenes–Stiefel and Dai–Yuan methods could not find the gradient search direction, so they could not update the weights, and the errors and epoch counts diverged to infinity.
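To make the weight-update scheme concrete, here is a minimal sketch of one conjugate gradient weight update with the Polak–Ribière parameter; grad_fn, the fixed learning rate lr, and the PR+ safeguard are assumptions for illustration rather than details from the paper:

```python
import numpy as np

def polak_ribiere_step(w, grad_fn, d_prev, g_prev, lr=0.01):
    """One CG weight update with the Polak-Ribiere parameter (illustrative sketch).

    w       : current weight vector
    grad_fn : callable returning the loss gradient at w (an assumption)
    d_prev  : previous search direction (None on the first iteration)
    g_prev  : previous gradient (None on the first iteration)
    lr      : fixed learning rate (an assumption; a line search is also common)
    """
    g = grad_fn(w)                 # current gradient of the loss
    if d_prev is None:             # first iteration: steepest descent
        d = -g
    else:
        # Polak-Ribiere formula: beta = g^T (g - g_prev) / ||g_prev||^2
        beta = g @ (g - g_prev) / (g_prev @ g_prev)
        beta = max(beta, 0.0)      # PR+ safeguard (an assumption)
        d = -g + beta * d_prev
    w_new = w + lr * d             # update the network weights
    return w_new, d, g
```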

