nonlinear conjugate gradient method
Recently Published Documents


TOTAL DOCUMENTS

82
(FIVE YEARS 18)

H-INDEX

13
(FIVE YEARS 2)

Author(s):  
Amina Boumediene ◽  
Tahar Bechouat ◽  
Rachid Benzine ◽  
Ghania Hadji

The nonlinear conjugate gradient method (CGM) is a very effective approach for solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient, defined by [Formula: see text], and proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient satisfies the sufficient descent condition and possesses global convergence properties under the exact line search.
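The abstract elides the specific coefficient ([Formula: see text]), so the sketch below uses the classical Fletcher-Reeves coefficient as a stand-in to illustrate a generic nonlinear CG iteration with an exact line search, which is available in closed form on a convex quadratic:

```python
import numpy as np

def nonlinear_cg_quadratic(A, b, x0, tol=1e-8, max_iter=200):
    """Generic nonlinear CG on the convex quadratic f(x) = 0.5 x'Ax - b'x.
    The exact line search has the closed form alpha = -g'd / d'Ad, and the
    Fletcher-Reeves beta stands in for the elided Zhang et al. coefficient."""
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b                          # gradient of the quadratic
    d = -g                                 # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```

On a quadratic, this exact-line-search recursion reduces to linear CG and terminates in at most n steps in exact arithmetic.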


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Adsadang Himakalasa ◽  
Suttida Wongkaew

The Cucker-Smale model is a well-known flocking model that describes the emergence of flocks through velocity alignment. The first part of this work investigates this model, including the effect of time delay and the presence of a leader; a control function is then inserted into the leader's dynamics to drive the group of agents to a target. The second part investigates leadership-based optimal control. The existence of first-order optimality conditions for the delayed optimal control problem is discussed, and the Runge–Kutta discretization method together with the nonlinear conjugate gradient method is employed to solve the discrete optimality system. Finally, numerical experiments demonstrate that the proposed control approach can drive a group of agents to desired locations or track a given trajectory.
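As a rough illustration of the underlying dynamics, here is a minimal explicit-Euler step of the basic Cucker-Smale alignment model; the delay, leader, and control terms studied in the paper are omitted, and the communication rate psi is an assumed standard choice, not taken from the paper:

```python
import numpy as np

def cucker_smale_step(x, v, dt, K=1.0, sigma=1.0, beta=0.25):
    """One explicit Euler step of the plain Cucker-Smale dynamics:
        dx_i/dt = v_i,
        dv_i/dt = (1/N) * sum_j psi(|x_j - x_i|) * (v_j - v_i),
    with communication rate psi(r) = K / (sigma^2 + r^2)^beta."""
    N = len(x)
    dv = np.zeros_like(v)
    for i in range(N):
        r = np.linalg.norm(x - x[i], axis=1)        # distances to agent i
        psi = K / (sigma**2 + r**2)**beta           # communication weights
        dv[i] = (psi[:, None] * (v - v[i])).sum(axis=0) / N
    return x + dt * v, v + dt * dv
```

Iterating this step shrinks the spread of the velocities while preserving the mean velocity, which is the alignment (flocking) behavior the abstract refers to.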


Author(s):  
Chergui Ahmed ◽  
Bouali Tahar

In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies the sufficient descent condition and achieves global convergence under the inexact strong Wolfe-Powell line search. Our numerical experiments show the efficiency of the new method on a set of problems from the CUTEst package; the proposed formula gives excellent results in CPU time, number of iterations, and number of gradient evaluations when compared with the WYL, DY, PRP, and FR methods.
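Since the FRA coefficient itself is not given in the abstract, the sketch below only illustrates the strong Wolfe-Powell conditions that the inexact line search must satisfy; the constants c1 and c2 are conventional defaults, not values from the paper:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe-Powell conditions for a step alpha along d:
    sufficient decrease (Armijo) plus the strong curvature condition."""
    phi0, dphi0 = f(x), grad(x) @ d                   # value and slope at alpha = 0
    xa = x + alpha * d
    armijo = f(xa) <= phi0 + c1 * alpha * dphi0       # sufficient decrease
    curvature = abs(grad(xa) @ d) <= c2 * abs(dphi0)  # strong curvature
    return armijo and curvature
```

A line search returns any alpha passing both tests; together with a suitable beta formula, these conditions are what drive the global convergence proofs cited above.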


Filomat ◽  
2021 ◽  
Vol 35 (3) ◽  
pp. 737-758
Author(s):  
Yue Hao ◽  
Shouqiang Du ◽  
Yuanyuan Chen

In this paper, we consider a method for solving finite minimax problems. By using the exponential penalty function to smooth the finite minimax problem, a new three-term nonlinear conjugate gradient method is proposed for solving finite minimax problems, which generates a sufficient descent direction at each iteration. Under standard assumptions, the global convergence of the proposed three-term nonlinear conjugate gradient method with an Armijo-type line search is established. Numerical results illustrate that the proposed method can efficiently solve several kinds of optimization problems, including the finite minimax problem, the finite minimax problem with tensor structure, the constrained optimization problem, and the constrained optimization problem with tensor structure.
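The exponential-penalty smoothing step can be sketched as follows: the nonsmooth max of m functions is replaced by a log-sum-exp surrogate whose gap to the true max is bounded by log(m)/p. This is a generic sketch of the smoothing idea only, not the paper's three-term CG method:

```python
import numpy as np

def smooth_max(fvals, p):
    """Exponential-penalty smoothing of max_i f_i(x):
        f_p = (1/p) * log(sum_i exp(p * f_i)),
    which satisfies max_i f_i <= f_p <= max_i f_i + log(m)/p for m terms.
    Shifting by the largest term keeps the exponentials from overflowing."""
    m = p * np.asarray(fvals, dtype=float)
    c = m.max()
    return (c + np.log(np.exp(m - c).sum())) / p
```

Because f_p is smooth for any finite p, a gradient-based method such as CG can be applied to it, with p increased to tighten the approximation of the original minimax objective.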


2020 ◽  
Vol 2020 ◽  
pp. 1-14
Author(s):  
Mengxiang Zhang ◽  
Yingjie Zhou ◽  
Songhua Wang

In this article, a modified Polak-Ribière-Polyak (PRP) conjugate gradient method is proposed for image restoration. The presented method can generate sufficient descent directions without any line search conditions. Under some mild conditions, this method is globally convergent with the Armijo line search. Moreover, the linear convergence rate of the modified PRP method is established. The experimental results of unconstrained optimization, image restoration, and compressive sensing show that the proposed method is promising and competitive with other conjugate gradient methods.
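For context, a minimal classical PRP conjugate gradient loop with Armijo backtracking looks like the following; the paper's specific modification that yields sufficient descent without line-search conditions is not reproduced here, and the nonnegative PRP+ coefficient with a steepest-descent restart is an assumed standard safeguard:

```python
import numpy as np

def prp_cg_armijo(f, grad, x0, tol=1e-6, max_iter=500):
    """Classical PRP conjugate gradient with Armijo backtracking.
    A generic sketch, not the modified PRP method of the article."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, c1, rho = 1.0, 1e-4, 0.5
        # Armijo backtracking line search
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        beta = max(0.0, (g_new @ (g_new - g)) / (g @ g))  # PRP+ coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:        # safeguard: restart with steepest descent
            d = -g_new
        g = g_new
    return x
```

The descent safeguard guarantees the Armijo loop terminates; the article's modification removes the need for such line-search conditions by construction of the direction itself.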

