nonlinear conjugate gradient
Recently Published Documents


TOTAL DOCUMENTS

185
(FIVE YEARS 39)

H-INDEX

22
(FIVE YEARS 2)

Author(s):  
Aseel M. Qasim
Zinah F. Salih
Basim A. Hassan

The primary objective of this paper, situated in the field of conjugate gradient algorithms for unconstrained optimization problems, is to demonstrate the advantage of the proposed algorithm over the standard Hestenes–Stiefel method. Since the conjugate gradient update parameter is crucial to performance, we propose a simple modification of it, from which the new update formula described in this paper is derived. The modification is based on the conjugacy condition for nonlinear conjugate gradient methods, with a nonnegative parameter added to obtain the new extension of the method. Under mild Wolfe conditions, the global convergence theorem and supporting lemmas are stated and proved. The proposed method was implemented, and its efficiency is demonstrated on numerical instances with very encouraging results.
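The paper's modified coefficient is not reproduced in this abstract, so the following is only a minimal sketch of the standard nonlinear conjugate gradient iteration with the classical Hestenes–Stiefel coefficient β = gₖ₊₁ᵀyₖ / (dₖᵀyₖ), yₖ = gₖ₊₁ − gₖ, which is the baseline the paper modifies. For brevity it uses simple Armijo backtracking rather than the full Wolfe line search assumed by the convergence theory; the function names and safeguards are illustrative choices, not the paper's.

```python
import numpy as np

def ncg_hs(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the classical Hestenes-Stiefel coefficient.

    Sketch only: uses Armijo backtracking instead of the Wolfe conditions,
    and restarts with steepest descent if the new direction is not a
    descent direction.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                                  # gradient difference
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0  # HS beta
        d = -g_new + beta * d
        if g_new @ d >= 0:                             # safeguard: restart
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = ncg_hs(f, grad, np.zeros(2))
```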


Author(s):  
Amina Boumediene
Tahar Bechouat
Rachid Benzine
Ghania Hadji

The nonlinear conjugate gradient method (CGM) is a very effective approach for solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient defined by [Formula: see text]. They proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient also possesses the sufficient descent condition and global convergence properties under the exact line search.
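The Zhang et al. coefficient appears above only as "[Formula: see text]" and cannot be reconstructed here. As a hedged stand-in, the sketch below checks numerically what the sufficient descent condition gₖᵀdₖ ≤ −c‖gₖ‖² means, using the classical PRP coefficient under exact line search on a convex quadratic (where the exact step is αₖ = −gₖᵀdₖ / (dₖᵀA dₖ), so the ratio gₖᵀdₖ/‖gₖ‖² should stay near −1):

```python
import numpy as np

# Stand-in demonstration (PRP coefficient, not Zhang et al.'s): verify the
# sufficient descent condition g_k^T d_k <= -c ||g_k||^2 under exact line
# search on a random convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5.0 * np.eye(5)          # symmetric positive definite
b = rng.standard_normal(5)
grad = lambda x: A @ x - b

x = np.zeros(5)
g = grad(x)
d = -g
ratios = []                             # (g_k^T d_k) / ||g_k||^2, ~ -1 here
for k in range(5):
    ratios.append((g @ d) / (g @ g))
    alpha = -(g @ d) / (d @ A @ d)      # exact line search step
    x = x + alpha * d
    g_new = grad(x)
    beta_prp = g_new @ (g_new - g) / (g @ g)   # PRP coefficient
    d = -g_new + beta_prp * d
    g = g_new
final_gnorm = np.linalg.norm(g)         # near zero after n = 5 CG steps
```

With exact line search on a quadratic, gₖ₊₁ᵀdₖ = 0, so gₖᵀdₖ = −‖gₖ‖² exactly, and the condition holds with c = 1 up to rounding.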


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Adsadang Himakalasa
Suttida Wongkaew

The Cucker–Smale model is a well-known flocking model that describes the emergence of flocks through velocity alignment. The first part of this work investigates the model, including the effect of time delay and the presence of a leader; a control function is inserted into the leader's dynamics to drive a group of agents to a target. The second part investigates leadership-based optimal control, and the existence of first-order optimality conditions for the delayed optimal control problem is discussed. The Runge–Kutta discretization method and the nonlinear conjugate gradient method are then employed to solve the discrete optimality system. Finally, numerical experiments demonstrate the capacity of the proposed control approach to drive a group of agents to desired positions or to track a trajectory.
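For readers unfamiliar with the model, here is a minimal simulation of the *basic* Cucker–Smale dynamics, dvᵢ/dt = (1/N) Σⱼ ψ(|xⱼ − xᵢ|)(vⱼ − vᵢ) with ψ(r) = K/(1 + r²)^β, showing velocity alignment emerging. This omits everything the paper actually studies (time delay, leader, control, Runge–Kutta discretization of the optimality system) and uses a plain forward-Euler step with illustrative parameter values:

```python
import numpy as np

def cucker_smale_step(x, v, dt, K=1.0, beta=0.3):
    """One forward-Euler step of the basic Cucker-Smale model
    (no delay, no leader, no control)."""
    N = len(x)
    diff = x[None, :, :] - x[:, None, :]           # x_j - x_i, all pairs
    r2 = np.sum(diff**2, axis=-1)
    psi = K / (1.0 + r2) ** beta                   # communication rate
    dv = (psi[:, :, None] * (v[None, :, :] - v[:, None, :])).sum(axis=1) / N
    return x + dt * v, v + dt * dv

rng = np.random.default_rng(1)
x = rng.standard_normal((10, 2))                   # 10 agents in the plane
v = rng.standard_normal((10, 2))
spread0 = np.ptp(v, axis=0).max()                  # initial velocity spread
for _ in range(2000):                              # simulate to t = 20
    x, v = cucker_smale_step(x, v, dt=0.01)
spread = np.ptp(v, axis=0).max()                   # spread after alignment
```

With β < 1/2 the model is known to flock unconditionally, so the velocity spread shrinks toward zero as the agents align.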


Author(s):  
Chergui Ahmed
Bouali Tahar

In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies a sufficient descent condition and achieves global convergence under the inexact strong Wolfe–Powell line search. Our numerical experiments show the efficiency of the new method on a set of problems from the CUTEst package: the proposed formula gives excellent results in CPU time, number of iterations, and number of gradient evaluations when compared with the WYL, DY, PRP, and FR methods.
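The FRA formula itself is not given in this abstract, but the baseline methods it is compared against are classical. As a reference, here is a sketch of their update parameters (yₖ = gₖ₊₁ − gₖ); the FR, PRP, and DY formulas are standard, while the WYL formula is written as it is commonly stated in the literature and should be treated as an assumption here:

```python
import numpy as np

def cg_betas(g_old, g_new, d_old):
    """Classical CG update parameters used as comparison baselines above.
    y = g_new - g_old. WYL is given as commonly stated (assumption)."""
    y = g_new - g_old
    return {
        "FR":  (g_new @ g_new) / (g_old @ g_old),   # Fletcher-Reeves
        "PRP": (g_new @ y) / (g_old @ g_old),       # Polak-Ribiere-Polyak
        "DY":  (g_new @ g_new) / (d_old @ y),       # Dai-Yuan
        # Wei-Yao-Liu (as commonly stated):
        "WYL": (g_new @ (g_new
                - (np.linalg.norm(g_new) / np.linalg.norm(g_old)) * g_old))
               / (g_old @ g_old),
    }

# Illustrative evaluation on arbitrary vectors:
betas = cg_betas(np.array([1.0, 1.0]),
                 np.array([1.0, -2.0]),
                 np.array([-1.0, -1.0]))
```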

