A New Conjugate Gradient Method for Unconstrained Optimization Problems with Descent Property

2020 · Vol. 9(2) · pp. 101-105
Author(s): Hussein Ageel Khatab, Salah Gazi Shareef

In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new method consists of three parts, the first of which is the Hestenes-Stiefel (HS) parameter. The proposed method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition. We give some numerical results to show the efficiency of the suggested method.
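
The abstract does not reproduce the three-part parameter itself, so as background only, here is the classical Hestenes-Stiefel ingredient it builds on. With g_k the gradient at x_k, d_k the search direction, and y_k = g_{k+1} - g_k, the HS parameter and the usual conjugate gradient update are

    \beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \qquad d_{k+1} = -g_{k+1} + \beta_k^{HS} d_k, \qquad d_0 = -g_0.

In this notation, the descent condition is g_k^T d_k < 0, and the sufficient descent condition strengthens it to g_k^T d_k \le -c \|g_k\|^2 for some constant c > 0.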

2015 · Vol. 2015 · pp. 1-7
Author(s): Guanghui Zhou, Qin Ni

A new spectral conjugate gradient method (SDYCG) is presented in this paper for solving unconstrained optimization problems. Our method provides a new expression for the spectral parameter, and this formula ensures that the sufficient descent condition holds. The search direction in SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient direction. The global convergence of SDYCG is also established. Numerical results show that SDYCG may be capable of solving large-scale nonlinear unconstrained optimization problems.
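
For context, a sketch of the quantities the abstract combines, assuming the generic spectral form (the paper's specific expression for the spectral parameter \theta_{k+1} is not given in the abstract):

    \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}, \qquad d_{k+1} = -\theta_{k+1} g_{k+1} + \beta_k^{DY} d_k.

Choosing \theta_{k+1} = 1 recovers the pure Dai-Yuan method; a spectral choice of \theta_{k+1} rescales the gradient term so that the sufficient descent condition can be enforced.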


2019 · Vol. 38(7) · pp. 227-231
Author(s): Huda Younus Najm, Eman T. Hamed, Huda I. Ahmed

In this study, we propose a new parameter for the conjugate gradient method. It is shown that the new method fulfils the sufficient descent condition under the strong Wolfe conditions when an inexact line search is used. Numerical results also show that the suggested method outperforms other standard conjugate gradient methods.
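
The strong Wolfe conditions mentioned here are the standard rules for accepting a step length \alpha_k in an inexact line search:

    f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k, \qquad |g(x_k + \alpha_k d_k)^T d_k| \le \sigma |g_k^T d_k|,

with constants 0 < \delta < \sigma < 1.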


2019 · Vol. 2019(1)
Author(s): Yuting Chen, Mingyuan Cao, Yueting Yang

In this paper, we present a new conjugate gradient method using an acceleration scheme for solving large-scale unconstrained optimization problems. The generated search direction satisfies both the sufficient descent condition and the Dai–Liao conjugacy condition independently of the line search. Moreover, the parameter incorporates more useful information without adding computational cost or storage requirements, which can improve the numerical performance. Under proper assumptions, the global convergence of the proposed method with a Wolfe line search is established. Numerical experiments show that the given method is competitive for unconstrained optimization problems with dimensions up to 100,000.
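
The Dai–Liao conjugacy condition referred to here generalizes the classical requirement d_{k+1}^T y_k = 0 to

    d_{k+1}^T y_k = -t\, g_{k+1}^T s_k, \qquad t \ge 0, \qquad s_k = x_{k+1} - x_k,

which better accommodates inexact line searches; t = 0 recovers the classical condition.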


2016 · Vol. 21(3) · pp. 399-411
Author(s): XiaoLiang Dong, HongWei Liu, YuBo He, Saman Babaie-Kafaki, Reza Ghanbari

In this paper, we propose a three-term PRP-type conjugate gradient method that always satisfies the sufficient descent condition independently of the line search employed. An important property of our method is that, as the iterations evolve, its direction is closest to the Newton direction or satisfies the conjugacy condition. In addition, under mild conditions, we prove global convergence of the proposed method. Numerical comparisons illustrate that our proposed method is efficient for solving optimization problems.
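
The abstract does not state the paper's exact third-term weight, but the overall shape of a three-term PRP-type iteration is standard. Below is a minimal Python sketch, assuming SciPy's Wolfe line search and the classical Zhang-Zhou-Li weight \theta_k = g_{k+1}^T d_k / \|g_k\|^2 as a stand-in (the paper's actual weight may differ); with that choice the identity g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 holds regardless of the line search, mirroring the sufficient descent property claimed above.

import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def three_term_prp(f, grad, x0, tol=1e-6, max_iter=1000):
    # Three-term PRP-type conjugate gradient sketch (not the paper's exact method).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]  # Wolfe line search
        if alpha is None:                      # fallback step if the search fails
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g @ g
        beta = (g_new @ y) / gg                # classical PRP parameter
        theta = (g_new @ d) / gg               # Zhang-Zhou-Li third-term weight
        d = -g_new + beta * d - theta * y      # three-term direction
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
x_star = three_term_prp(rosen, rosen_der, [-1.2, 1.0])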

