A scaled conjugate gradient method for nonlinear unconstrained optimization

2016 ◽  
Vol 32 (5) ◽  
pp. 1095-1112 ◽  
Author(s):  
Masoud Fatemi

2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Eman T. Hamed ◽  
Rana Z. Al-Kawaz ◽  
Abbas Y. Al-Bayati

This article considers modified formulas for the standard conjugate gradient (CG) method proposed by Li and Fukushima. A new scalar parameter θ_k^{New} is proposed for this CG method for unconstrained optimization. The descent condition and the global convergence property are established under the strong Wolfe conditions. Our numerical experiments show that the proposed algorithms are more stable and economical than some well-known standard CG methods.
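The family of methods described above all share the same iteration: step along a direction built from the current gradient and the previous direction, with a scalar β controlling the mix. A minimal sketch of that skeleton follows, with the β formula pluggable; the paper's θ_k^{New}-scaled formula is not reproduced (Fletcher-Reeves stands in as a placeholder), and a simple Armijo backtracking replaces the strong Wolfe line search assumed by the convergence theory.

```python
import numpy as np

def cg_minimize(f, grad, x0, beta_fn, max_iter=500, tol=1e-8):
    """Generic nonlinear CG skeleton: x_{k+1} = x_k + alpha_k d_k,
    d_{k+1} = -g_{k+1} + beta_k d_k, with the beta formula pluggable."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: a simplified stand-in for the strong
        # Wolfe line search assumed by the convergence theory.
        alpha, fx, gd = 1.0, f(x), g.dot(d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * gd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = -g_new + beta_fn(g, g_new, d) * d
        if g_new.dot(d) >= 0:   # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Fletcher-Reeves beta as a placeholder; the paper's theta_k-scaled
# formula is not reproduced here.
def beta_fr(g, g_new, d):
    return g_new.dot(g_new) / g.dot(g)

# Convex quadratic test problem f(x) = 0.5 x^T A x, minimizer at 0.
A = np.array([[3.0, 0.5],
              [0.5, 2.0]])
x_min = cg_minimize(lambda x: 0.5 * x.dot(A).dot(x), lambda x: A.dot(x),
                    [4.0, -3.0], beta_fr)
```

Swapping `beta_fn` is all that distinguishes many CG variants; the stability claims in the abstract concern which β formula keeps the directions well behaved.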


2020 ◽  
Vol 9 (2) ◽  
pp. 101-105
Author(s):  
Hussein Ageel Khatab ◽  
Salah Gazi Shareef

In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new method consists of three parts, the first of which is the Hestenes-Stiefel (HS) parameter. The proposed method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition. We give some numerical results to show the efficiency of the suggested method.
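The HS ingredient mentioned above is the classical parameter β_k^{HS} = g_{k+1}^T y_k / (d_k^T y_k) with y_k = g_{k+1} − g_k. A small demo follows on a convex quadratic with an exact line search, where HS-CG terminates in at most n steps; the other two parts of the proposed method are not specified in the abstract and are not modeled here.

```python
import numpy as np

# Hestenes-Stiefel parameter: beta_HS = g_{k+1}^T y_k / (d_k^T y_k),
# where y_k = g_{k+1} - g_k.
def beta_hs(g, g_new, d):
    y = g_new - g
    return g_new.dot(y) / d.dot(y)

# Demo on a convex quadratic f(x) = 0.5 x^T A x; for a quadratic the
# exact line-search step is alpha = -(g^T d) / (d^T A d).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 2.0]])
x = np.array([1.0, -2.0, 3.0])
g = A.dot(x)
d = -g
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = -g.dot(d) / d.dot(A.dot(d))   # exact step for the quadratic
    x = x + alpha * d
    g_new = A.dot(x)
    d = -g_new + beta_hs(g, g_new, d) * d
    g = g_new
```

With exact line searches on an n-dimensional SPD quadratic, this loop reaches the minimizer (here the origin) in at most n iterations, which is the conjugacy property the abstract refers to.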


2015 ◽  
Vol 2015 ◽  
pp. 1-7
Author(s):  
Guanghui Zhou ◽  
Qin Ni

In this paper, a new spectral conjugate gradient method (SDYCG) is presented for solving unconstrained optimization problems. Our method provides a new expression for the spectral parameter, and this formula ensures that the sufficient descent condition holds. The search direction in SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient direction. The global convergence of SDYCG is also established. Numerical results suggest that SDYCG is capable of solving large-scale nonlinear unconstrained optimization problems.
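The combination described above has the shape d_{k+1} = −θ_{k+1} g_{k+1} + β_k^{DY} d_k, where β_k^{DY} = ‖g_{k+1}‖² / (d_k^T y_k) is the Dai-Yuan parameter and θ is the spectral scaling. The sketch below uses a Barzilai-Borwein step as a stand-in for the paper's spectral parameter (whose formula is not given in the abstract) and adds a standard Powell restart safeguard.

```python
import numpy as np

# Spectral Dai-Yuan style iteration (sketch):
#   d_{k+1} = -theta_{k+1} g_{k+1} + beta_k^{DY} d_k,
#   beta_k^{DY} = ||g_{k+1}||^2 / (d_k^T y_k),  y_k = g_{k+1} - g_k.
# The paper's theta formula is not reproduced; Barzilai-Borwein stands in.
A = np.array([[5.0, 1.0],
              [1.0, 3.0]])          # SPD quadratic f(x) = 0.5 x^T A x
x = np.array([2.0, -1.0])
g = A.dot(x)
d = -g
for _ in range(200):
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = -g.dot(d) / d.dot(A.dot(d))   # exact step for the quadratic
    s = alpha * d
    x = x + s
    g_new = A.dot(x)
    y = g_new - g
    theta = s.dot(s) / s.dot(y)           # Barzilai-Borwein spectral scaling
    if abs(g_new.dot(g)) >= 0.2 * g_new.dot(g_new):
        d = -theta * g_new                # Powell restart safeguard
    else:
        beta_dy = g_new.dot(g_new) / d.dot(y)
        d = -theta * g_new + beta_dy * d
    g = g_new
```

The restart test discards the conjugate term when successive gradients are far from orthogonal, which keeps the scaled direction a sufficient-descent direction; the paper's own θ formula enforces that property directly.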

