A new conjugate gradient algorithm using the conjugacy condition for solving unconstrained optimization

Author(s):  
Aseel M. Qasim ◽  
Zinah F. Salih ◽  
Basim A. Hassan

The primary objective of this paper, situated in the field of conjugate gradient algorithms for unconstrained optimization problems, is to show the advantage of the newly proposed algorithm over the standard Hestenes–Stiefel method. Since the conjugate gradient update parameter is crucial to performance, we propose a simple modification of it, which is used to derive the new formula for the conjugate gradient update parameter described in this paper. The modification is based on the conjugacy condition for nonlinear conjugate gradient methods, with a nonnegative parameter added to obtain the new extension of the method. Under mild Wolfe conditions, the global convergence theorem and supporting lemmas are stated and proved. The proposed method was implemented, and its efficiency is demonstrated by numerical examples, with very encouraging results.
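For reference, the standard Hestenes–Stiefel baseline and the pure conjugacy condition that such modifications start from take the following well-known forms (the paper's specific modified parameter is not reproduced here):

```latex
% Hestenes--Stiefel update (standard form; the paper's baseline):
d_{k+1} = -g_{k+1} + \beta_k^{HS} d_k, \qquad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
y_k = g_{k+1} - g_k
% Pure conjugacy condition satisfied in the quadratic case:
d_{k+1}^{\top} y_k = 0
```

In the spirit of Dai and Liao, adding a nonnegative parameter $t$ typically relaxes the condition to $d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k$ with $s_k = x_{k+1} - x_k$, though the exact extension used in the paper may differ.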

2014 ◽  
Vol 11 (04) ◽  
pp. 1350092 ◽  
Author(s):  
SAMAN BABAIE-KAFAKI

In an attempt to find a reasonable solution to an open problem propounded by Andrei in nonlinear conjugate gradient methods, an adaptive conjugacy condition is proposed. The suggested condition is designed based on an implicit switch from a conjugacy condition to the standard secant equation, using an extended conjugacy condition proposed by Dai and Liao. Following the approach of Dai and Liao, two adaptive nonlinear conjugate gradient methods are proposed based on the suggested adaptive conjugacy condition. An interesting feature of one of the proposed methods is the adaptive switch between the nonlinear conjugate gradient methods proposed by Hestenes and Stiefel, and by Perry. Under proper conditions, it is shown that one of the proposed methods is globally convergent for uniformly convex functions and the other is globally convergent for general functions. Numerical results demonstrating the effectiveness of the proposed adaptive approach, in the sense of the performance profile introduced by Dolan and Moré, are reported.
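For context, the Dai–Liao extended conjugacy condition referenced above, and the resulting update parameter, take the following standard forms; $t \ge 0$ is the nonnegative parameter, with $t = 0$ recovering Hestenes–Stiefel and $t = 1$ recovering Perry's method:

```latex
% Dai--Liao extended conjugacy condition (s_k = x_{k+1} - x_k):
d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k, \qquad t \ge 0
% Corresponding update parameter:
\beta_k^{DL} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
             - t\, \frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k}
% t = 0 gives \beta^{HS}; t = 1 gives Perry's method.
```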


2013 ◽  
Vol 2013 ◽  
pp. 1-5 ◽  
Author(s):  
Yuan-Yuan Chen ◽  
Shou-Qiang Du

The nonlinear conjugate gradient method is one of the useful methods for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with a Wolfe-type line search for unconstrained optimization problems. Under some mild assumptions, the global convergence results of the given methods are established. The numerical results show that the nonlinear conjugate gradient methods with a Wolfe-type line search are efficient for some unconstrained optimization problems.
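As a concrete illustration of a nonlinear conjugate gradient loop with a Wolfe-type line search, here is a minimal Python sketch using SciPy's line_search, which enforces the strong Wolfe conditions; the Rosenbrock test function and the PR+ update with restart are illustrative choices, not the specific methods studied in the paper:

```python
import numpy as np
from scipy.optimize import line_search

def f(x):
    # Illustrative test function (Rosenbrock, n = 2).
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def cg_wolfe(x, tol=1e-6, max_iter=500):
    g = grad(x)
    d = -g                                     # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Strong Wolfe line search along d (SciPy's implementation).
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                      # line search failed: restart
            d, alpha = -g, 1e-4
        x = x + alpha * d
        g_new = grad(x)
        y = g_new - g
        beta = max(g_new @ y / (g @ g), 0.0)   # PR+ update (illustrative choice)
        d = -g_new + beta * d
        g = g_new
    return x

print(cg_wolfe(np.array([-1.2, 1.0])))
```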


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Zhongbo Sun ◽  
Yantao Tian ◽  
Hongyang Li

Two modified three-term conjugate gradient algorithms which satisfy both the descent condition and the Dai–Liao type conjugacy condition are presented for unconstrained optimization. The first algorithm is a modification of the Hager and Zhang type algorithm such that the search direction is descent and satisfies the Dai–Liao type conjugacy condition. The second, simpler three-term conjugate gradient method generates sufficient descent directions at every iteration; moreover, this property is independent of the step-length line search. The algorithms can also be viewed as modifications of the MBFGS method, but with a different z_k. Under some mild conditions, the given methods are globally convergent for general functions, independently of the Wolfe line search. The numerical experiments show that the proposed methods are very robust and efficient.
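For context, the generic three-term direction and the Hager–Zhang update referred to above take the following standard forms; the specific choices of z_k and theta_k in the two proposed algorithms are not reproduced here:

```latex
% Generic three-term conjugate gradient direction:
d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k z_k
% Hager--Zhang update parameter (standard form):
\beta_k^{HZ} = \frac{1}{d_k^{\top} y_k}
  \left( y_k - 2 d_k \frac{\|y_k\|^2}{d_k^{\top} y_k} \right)^{\!\top} g_{k+1}
% Sufficient descent property (holds independently of the line search):
g_{k+1}^{\top} d_{k+1} \le -c\, \|g_{k+1}\|^2, \qquad c > 0
```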

