A Descent Four-Term Conjugate Gradient Method with Global Convergence Properties for Large-Scale Unconstrained Optimisation Problems

2021, Vol 2021, pp. 1-14
Author(s): Ahmad Alhawarat, Ghaliah Alhamzi, Ibitsam Masmali, Zabidin Salleh

The conjugate gradient method is a useful method for solving large-scale unconstrained optimisation problems and has applications in several fields, such as engineering, medical science, image restoration, and neural networks. Its main benefit is that, unlike Newton's method, it requires neither the second derivative nor an approximation of it. Moreover, the algorithm is simple and easy to apply. This study proposes a new modified conjugate gradient method whose search direction contains four terms, built from popular two- and three-term conjugate gradient methods. The new algorithm satisfies the descent condition and possesses the global convergence property. In the numerical results, we compare the new algorithm with well-known methods such as CG-Descent. The results show that the new algorithm is more efficient than popular CG methods such as CG-Descent 6.8 in terms of the number of function evaluations, number of gradient evaluations, number of iterations, and CPU time.
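The abstract does not reproduce the four-term direction formula itself. As a hedged sketch of the generic nonlinear CG skeleton such methods extend (the two-term update d_{k+1} = -g_{k+1} + beta_k d_k), the Python below uses the classical Fletcher-Reeves beta, a simple Armijo backtracking line search, and the Rosenbrock test function; all three are illustrative stand-ins, not the authors' algorithm:

```python
import numpy as np

def rosenbrock(x):
    """Classic test function; stands in for an arbitrary smooth objective."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Generic two-term nonlinear CG with the Fletcher-Reeves beta.

    The paper's four-term method adds further correction terms to the
    direction update; this sketch shows only the shared skeleton.
    """
    x = x0.copy()
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking -- an assumption; the paper uses
        # a Wolfe-type line search.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d                # two-term update
        if g_new.dot(d) >= 0:                # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x, k

x_star, iters = nonlinear_cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(f"minimiser ~ {x_star}, iterations = {iters}")
```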

2014, Vol 2014, pp. 1-14
Author(s): San-Yang Liu, Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition $g_k^T d_k \le -\left(1 - \frac{1}{4\theta_k}\right)\|g_k\|^2$ (with $\theta_k > 1/4$) and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
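Read concretely, the condition bounds $g_k^T d_k$ away from zero: with $\theta_k = 1$ it reduces to the familiar $g_k^T d_k \le -\frac{3}{4}\|g_k\|^2$. A minimal checker (the vectors below are made-up illustrative data):

```python
import numpy as np

def satisfies_guaranteed_descent(g, d, theta):
    """Check g^T d <= -(1 - 1/(4*theta)) * ||g||^2, defined for theta > 1/4."""
    assert theta > 0.25, "the condition requires theta > 1/4"
    return g.dot(d) <= -(1.0 - 1.0 / (4.0 * theta)) * g.dot(g)

g = np.array([1.0, -2.0])
print(satisfies_guaranteed_descent(g, -g, theta=1.0))                     # True: steepest descent
print(satisfies_guaranteed_descent(g, np.array([0.5, 1.0]), theta=1.0))   # False: too shallow
```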


Algorithms, 2021, Vol 14 (8), pp. 227
Author(s): Zabidin Salleh, Ghaliah Alhamzi, Ibitsam Masmali, Ahmad Alhawarat

The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems since, unlike Newton's method, it does not require the second derivative or an approximation of it. Moreover, the conjugate gradient method can be applied in many fields, such as neural networks, image restoration, etc. Many complicated methods with two- or three-term search directions have been proposed to solve these optimization problems. In this paper, we propose a simple, easy, efficient, and robust conjugate gradient method. The new method is constructed from the Liu and Storey method so as to overcome its convergence problem while retaining the descent property. The new modified method satisfies the convergence properties and the sufficient descent condition under some assumptions. The numerical results show that the new method outperforms well-known CG methods such as CG-Descent 5.3, Liu and Storey, and Dai and Liao. The numerical results include the number of iterations and CPU time.
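For orientation, the classical Liu and Storey coefficient that the paper starts from is $\beta_k^{LS} = g_{k+1}^T y_k / (-g_k^T d_k)$ with $y_k = g_{k+1} - g_k$; the authors' modification is not reproduced in the abstract. A minimal sketch with illustrative vectors:

```python
import numpy as np

def beta_liu_storey(g_new, g_old, d_old):
    """Classical Liu-Storey coefficient: beta = g_{k+1}^T y_k / (-g_k^T d_k),
    with y_k = g_{k+1} - g_k. The paper's modified version is not given
    in the abstract, so only the base formula is shown here."""
    y = g_new - g_old
    return g_new.dot(y) / (-g_old.dot(d_old))

g_old = np.array([1.0, -2.0])
d_old = -g_old                    # steepest-descent step, so -g^T d = ||g||^2 > 0
g_new = np.array([0.4, -0.5])
print(beta_liu_storey(g_new, g_old, d_old))
```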


2014, Vol 2014, pp. 1-7
Author(s): Junxia Hou, Quanyi Lv, Manyu Xiao

The computational effort of solving large-scale Sylvester equations $AX + XB + F = O$ frequently hinders the treatment of many complex control problems. In this work, a parallel preconditioned algorithm for solving such equations is proposed, based on the combination of a parameter-iterative preconditioned method and a modified form of the conjugate gradient (MCG) method. Furthermore, Schur's inequality and the modified conjugate gradient method are employed to overcome difficulties such as the determination of the parameter and the calculation of the inverse matrix. Several numerical results show that the proposed parallel algorithm achieves high performance both in convergence rate and in parallel efficiency.
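The abstract does not spell out the parallel MCG iteration. As a simplified, serial sketch of a matrix-form conjugate gradient for $AX + XB + F = O$, the code below assumes A and B are symmetric positive definite, so the operator L(X) = AX + XB is self-adjoint and positive definite under the trace inner product; the paper's preconditioning and parallelisation are omitted:

```python
import numpy as np

def sylvester_cg(A, B, F, tol=1e-10, max_iter=500):
    """Solve AX + XB + F = 0 by conjugate gradients in the trace inner product.

    Assumes A and B are symmetric positive definite. This is a plain
    matrix-form CG sketch, not the preconditioned parallel MCG of the paper.
    """
    L = lambda X: A @ X + X @ B
    X = np.zeros_like(F)
    R = -F - L(X)                      # residual of AX + XB = -F
    P = R.copy()
    rr = np.sum(R * R)                 # trace inner product <R, R>
    for _ in range(max_iter):
        if np.sqrt(rr) < tol:
            break
        LP = L(P)
        alpha = rr / np.sum(P * LP)
        X += alpha * P
        R -= alpha * LP
        rr_new = np.sum(R * R)
        P = R + (rr_new / rr) * P
        rr = rr_new
    return X

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)            # SPD by construction
N = rng.standard_normal((5, 5))
B = N @ N.T + 5 * np.eye(5)            # SPD by construction
F = rng.standard_normal((5, 5))
X = sylvester_cg(A, B, F)
print(np.linalg.norm(A @ X + X @ B + F))   # residual norm ~ 0
```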


2021, Vol 2021, pp. 1-10
Author(s): Hongbo Guan, Sheng Wang

In this paper, we propose a modified Polak–Ribière–Polyak (PRP) conjugate gradient method for solving large-scale nonlinear equations. Under weaker conditions, we show that the proposed method is globally convergent. We also carry out some numerical experiments to test the proposed method. The results show that the proposed method is efficient and stable.
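The modified formula is not given in the abstract; for reference, the classical PRP coefficient, shown here with the common nonnegative safeguard $\beta^+ = \max(\beta^{PRP}, 0)$, is:

```python
import numpy as np

def beta_prp_plus(g_new, g_old):
    """Classical Polak-Ribiere-Polyak coefficient with the common safeguard:
    beta+ = max(0, g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2).
    The paper's modification for nonlinear equations is not reproduced
    in the abstract, so only the base formula is shown."""
    beta = g_new.dot(g_new - g_old) / g_old.dot(g_old)
    return max(beta, 0.0)

print(beta_prp_plus(np.array([0.3, -0.1]), np.array([1.0, -2.0])))
```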


2019, Vol 7 (1), pp. 34-36
Author(s): Alaa L. Ibrahim, Muhammad A. Sadiq, Salah G. Shareef

This paper proposes a new conjugate gradient method for unconstrained optimization based on the Dai-Liao (DL) formula; the descent condition and the sufficient descent condition for the method are established. The numerical results and comparison show that the proposed algorithm is efficient compared with the Polak-Ribière (PR) method in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
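The Dai-Liao formula referred to is $\beta_k^{DL} = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)$, with $y_k = g_{k+1} - g_k$, $s_k = x_{k+1} - x_k$, and a parameter $t \ge 0$; the paper's specific modification is not reproduced here. A minimal sketch with an illustrative t = 0.1:

```python
import numpy as np

def beta_dai_liao(g_new, g_old, d_old, s, t=0.1):
    """Dai-Liao coefficient: beta = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k).
    The value t = 0.1 is illustrative only; the paper's own choice and
    modification are not given in the abstract."""
    y = g_new - g_old
    return g_new.dot(y - t * s) / d_old.dot(y)

g_old = np.array([1.0, -2.0])
d_old = -g_old
s = 0.5 * d_old                   # step s_k = alpha_k * d_k with alpha_k = 0.5
g_new = np.array([0.4, -0.5])
print(beta_dai_liao(g_new, g_old, d_old, s))
```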


2021, Vol 2021, pp. 1-9
Author(s): Minglei Fang, Min Wang, Min Sun, Rong Chen

Nonlinear conjugate gradient algorithms are a very effective way of solving large-scale unconstrained optimization problems. Based on some well-known previous conjugate gradient methods, a modified hybrid conjugate gradient method is proposed. The proposed method generates descent directions at every iteration, independently of any line search. Under the Wolfe line search, the proposed method possesses global convergence. Numerical results show that the modified method is efficient and robust.
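The abstract does not state which methods are hybridised. Purely for illustration, a classical PRP/FR hybrid coefficient, $\beta = \max(0, \min(\beta^{PRP}, \beta^{FR}))$, can be coded as follows; this is a stand-in, not the paper's formula:

```python
import numpy as np

def beta_hybrid(g_new, g_old):
    """A classical PRP/FR hybridisation, beta = max(0, min(beta_PRP, beta_FR)),
    shown only to illustrate the hybrid idea; the paper's own combination
    is not reproduced in the abstract."""
    gg = g_old.dot(g_old)
    beta_fr = g_new.dot(g_new) / gg
    beta_prp = g_new.dot(g_new - g_old) / gg
    return max(0.0, min(beta_prp, beta_fr))

print(beta_hybrid(np.array([0.4, -0.5]), np.array([1.0, -2.0])))
```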

