New Investigation for the Liu-Story Scaled Conjugate Gradient Method for Nonlinear Optimization

2020, Vol. 2020, pp. 1-12
Author(s): Eman T. Hamed, Rana Z. Al-Kawaz, Abbas Y. Al-Bayati

This article considers modified formulas for the standard conjugate gradient (CG) technique proposed by Li and Fukushima. A new scalar parameter θ_k^New for this CG technique for unconstrained optimization is proposed. The descent condition and the global convergence property are established under the strong Wolfe conditions. Our numerical experiments show that the new proposed algorithms are more stable and economical compared with some well-known standard CG methods.
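
The abstract does not reproduce the new scalar parameter, so the following is only a minimal sketch of a scaled CG direction of the form d_{k+1} = -θ g_{k+1} + β_k d_k, assuming the classical Liu and Storey (LS) coefficient β_k^LS = g_{k+1}^T y_k / (-g_k^T d_k) and a placeholder scaling θ; the function name is illustrative, not the paper's method.

```python
import numpy as np

def scaled_cg_direction(g_new, g_old, d_old, theta=1.0):
    """Sketch of a scaled CG direction d_{k+1} = -theta*g_{k+1} + beta*d_k.

    beta follows the classical Liu-Storey rule; `theta` stands in for a
    scalar parameter such as theta_k^New, which the abstract does not give.
    """
    y = g_new - g_old                                      # y_k = g_{k+1} - g_k
    beta_ls = np.dot(g_new, y) / (-np.dot(g_old, d_old))   # Liu-Storey beta_k
    return -theta * g_new + beta_ls * d_old
```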

2019, Vol. 24 (5), pp. 86
Author(s): Zeyad M. Abdullah, Hameed M. Sadeq, Hisham M. Azzam, Mundher A. Khaleel

The current paper presents a modified conjugate gradient method for solving unconstrained optimization problems. Convergence of the modified method is established under some hypotheses. The numerical results demonstrate that the modified method is efficient for solving unconstrained nonlinear optimization problems in comparison with the FR and HS methods. http://dx.doi.org/10.25130/tjps.24.2019.095
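
For reference, the two baselines named in the comparison are the classical Fletcher-Reeves (FR) and Hestenes-Stiefel (HS) coefficients; the sketch below gives their standard definitions only, not the paper's modification.

```python
import numpy as np

def beta_fr(g_new, g_old):
    # Fletcher-Reeves: beta_k = ||g_{k+1}||^2 / ||g_k||^2
    return np.dot(g_new, g_new) / np.dot(g_old, g_old)

def beta_hs(g_new, g_old, d_old):
    # Hestenes-Stiefel: beta_k = g_{k+1}^T y_k / (d_k^T y_k), y_k = g_{k+1} - g_k
    y = g_new - g_old
    return np.dot(g_new, y) / np.dot(d_old, y)
```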


2013, Vol. 2013, pp. 1-8
Author(s): Yuanying Qiu, Dandan Cui, Wei Xue, Gaohang Yu

This paper presents a hybrid spectral conjugate gradient method for large-scale unconstrained optimization, which possesses a self-adjusting property. Under the standard Wolfe conditions, its global convergence result is established. Preliminary numerical results are reported on a set of large-scale problems in CUTEr to show the convergence and efficiency of the proposed method.
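
The convergence argument relies on a line search satisfying the standard Wolfe conditions; a minimal check of those conditions, using their textbook definitions and an illustrative helper name, is sketched below.

```python
import numpy as np

def wolfe_conditions_hold(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the standard Wolfe conditions for a trial step alpha along d."""
    g0_d = np.dot(grad(x), d)                       # initial directional derivative
    sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
    curvature = np.dot(grad(x + alpha * d), d) >= c2 * g0_d
    return sufficient_decrease and curvature
```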


2020, Vol. 9 (2), pp. 101-105
Author(s): Hussein Ageel Khatab, Salah Gazi Shareef

In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new method consists of three parts, the first of which is the Hestenes-Stiefel (HS) parameter. The proposed method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition. We give some numerical results to show the efficiency of the suggested method.
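
The three conditions named above have standard algebraic forms; the helpers below are a minimal sketch of those checks with their usual definitions and illustrative constants, not the paper's specific construction.

```python
import numpy as np

def descent(g_new, d_new):
    # descent condition: g_{k+1}^T d_{k+1} < 0
    return np.dot(g_new, d_new) < 0.0

def sufficient_descent(g_new, d_new, c=1e-4):
    # sufficient descent: g_{k+1}^T d_{k+1} <= -c * ||g_{k+1}||^2
    return np.dot(g_new, d_new) <= -c * np.dot(g_new, g_new)

def conjugacy(d_new, g_new, g_old, tol=1e-8):
    # classical conjugacy condition: d_{k+1}^T y_k = 0, y_k = g_{k+1} - g_k
    return abs(np.dot(d_new, g_new - g_old)) <= tol
```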


2019, Vol. 7 (1), pp. 34-36
Author(s): Alaa L. Ibrahim, Muhammad A. Sadiq, Salah G. Shareef

This paper proposes a new conjugate gradient method for unconstrained optimization based on the Dai-Liao (DL) formula; the descent condition and the sufficient descent condition for our method are established. The numerical results and comparison show that the proposed algorithm is potentially efficient compared with the PR method in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
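
The classical Dai-Liao coefficient on which the method is based is standard; the sketch below implements only that baseline formula (the paper's modification is not reproduced here, and t is the usual nonnegative DL parameter).

```python
import numpy as np

def beta_dl(g_new, g_old, d_old, x_new, x_old, t=1.0):
    # Dai-Liao: beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k)
    y = g_new - g_old          # gradient difference y_k
    s = x_new - x_old          # step s_k = x_{k+1} - x_k
    return np.dot(g_new, y - t * s) / np.dot(d_old, y)
```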

