A Modified Conjugacy Condition and Related Nonlinear Conjugate Gradient Method

2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Shengwei Yao ◽  
Xiwen Lu ◽  
Bin Qin

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to its simplicity and very low memory requirements. In this paper, we propose a new conjugacy condition similar to that of Dai and Liao (2001). Based on this condition, a related nonlinear conjugate gradient method is given. Under some mild conditions, the given method is globally convergent under the strong Wolfe-Powell line search for general functions. Numerical experiments show that the proposed method is robust and efficient.
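For context, the general shape of such a method can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it uses a Dai-Liao-type parameter beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k) with a simple Armijo backtracking line search in place of the strong Wolfe-Powell search analyzed in the paper, plus a steepest-descent restart safeguard; the test function and all names are illustrative.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=200):
    """Sketch of nonlinear CG with a Dai-Liao-type conjugacy parameter."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                      # initial steepest-descent direction
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        if dot(g, d) >= 0:                     # safeguard: restart if not a descent direction
            d = [-gi for gi in g]
        alpha, c1 = 1.0, 1e-4                  # Armijo backtracking (stand-in for Wolfe-Powell)
        fx, gd = f(x), dot(g, d)
        for _ in range(50):
            x_new = [xi + alpha * di for xi, di in zip(x, d)]
            if f(x_new) <= fx + c1 * alpha * gd:
                break
            alpha *= 0.5
        g_new = grad(x_new)
        s = [alpha * di for di in d]           # s_k = x_{k+1} - x_k
        y = [gn - gi for gn, gi in zip(g_new, g)]  # y_k = g_{k+1} - g_k
        denom = dot(d, y)
        beta = dot(g_new, [yi - t * si for yi, si in zip(y, s)]) / denom if denom else 0.0
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x
```

On a simple strongly convex quadratic such as f(x) = 2*x0^2 + 0.5*x1^2, the iteration drives the gradient norm below the tolerance in a handful of steps.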

2018 ◽  
Vol 13 (03) ◽  
pp. 2050059
Author(s):  
Amina Boumediene ◽  
Rachid Benzine ◽  
Mohammed Belloufi

Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems, and many studies have been devoted to developing and improving them. In this paper, we study the global convergence of the BBB conjugate gradient method with exact line search.


Author(s):  
Chergui Ahmed ◽  
Bouali Tahar

<span>The conjugate gradient method has played a special role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients that possess the sufficient descent condition and global convergence properties; this CG method is similar to that of Wei et al. [7]. The global convergence result is established under the strong Wolfe-Powell line search. Numerical results on finding the optimum of some test functions show that the new proposed formula gives the best results in CPU time, number of iterations, and number of gradient evaluations when compared with FR, PRP, DY, and WYL.</span>
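For reference, the classical CG parameters against which such new families are typically compared are (standard definitions from the literature, with y_k = g_{k+1} - g_k; these formulas are not reproduced from this paper):

$$
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}, \qquad
\beta_k^{WYL} = \frac{g_{k+1}^T \left( g_{k+1} - \frac{\|g_{k+1}\|}{\|g_k\|} g_k \right)}{\|g_k\|^2}.
$$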


2018 ◽  
Vol 29 (1) ◽  
pp. 133
Author(s):  
Basim A. Hassan ◽  
Haneen A. Alashoor

The nonlinear conjugate gradient method is widely used to solve unconstrained optimization problems. In this paper, different versions of the nonlinear conjugate gradient method are developed and their global convergence properties are proved. Numerical results indicate that the proposed method is very efficient.


2014 ◽  
Vol 989-994 ◽  
pp. 2406-2409
Author(s):  
Ting Feng Li ◽  
Zhi Yuan Liu ◽  
Zhao Bin Du

In this paper, we introduce an algorithm for solving large-scale box-constrained optimization problems. At each iteration, we first estimate the active set by means of an active-set identification technique. The components of the search direction corresponding to the active set are simply defined; the other components are determined by a nonlinear conjugate gradient method. Under some additional conditions, we show that the algorithm converges globally. We also report preliminary numerical experiments showing that the proposed algorithm is practicable and effective on the test problems.
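The split between active and free components can be illustrated with a single direction-building step. This is a hedged sketch, not the authors' algorithm: it estimates as active those components sitting on a bound l_i or u_i whose gradient pushes outward, holds them fixed, and assigns the free components a CG direction using a nonnegative Polak-Ribiere beta restricted to the free set (one common choice; all names and the beta choice are assumptions for illustration).

```python
def active_set_step(x, g, d_prev, g_prev, l, u, eps=1e-8):
    """One illustrative direction-building step for box constraints l <= x <= u."""
    n = len(x)
    # estimate the active set: at a bound, with the gradient pushing outward
    active = [(x[i] <= l[i] + eps and g[i] > 0) or
              (x[i] >= u[i] - eps and g[i] < 0) for i in range(n)]
    # Polak-Ribiere beta restricted to the free variables, clipped at zero
    gg_prev = sum(g_prev[i] ** 2 for i in range(n) if not active[i])
    if gg_prev > 0:
        beta = max(0.0, sum(g[i] * (g[i] - g_prev[i])
                            for i in range(n) if not active[i]) / gg_prev)
    else:
        beta = 0.0
    d = [0.0] * n
    for i in range(n):
        if active[i]:
            d[i] = 0.0                       # active components stay on their bound
        else:
            d[i] = -g[i] + beta * d_prev[i]  # free components get a CG direction
    return d, active
```

With x = [0.0, 0.5], bounds [0, 1] in each coordinate, and a gradient whose first component pushes x below its lower bound, the first component is flagged active and its direction component is zeroed, while the second gets the usual CG update.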

