An improved nonlinear conjugate gradient method with an optimal property

2014 ◽ Vol 57 (3) ◽ pp. 635-648 ◽ Author(s): CaiXia Kou
2014 ◽ Vol 2014 ◽ pp. 1-9 ◽ Author(s): Shengwei Yao, Xiwen Lu, Bin Qin

The conjugate gradient (CG) method plays a special role in solving large-scale nonlinear optimization problems because of its simplicity and very low memory requirements. In this paper, we propose a new conjugacy condition similar to that of Dai and Liao (2001). Based on this condition, the related nonlinear conjugate gradient method is derived. Under some mild conditions, the method is globally convergent for general functions with the strong Wolfe-Powell line search. Numerical experiments show that the proposed method is robust and efficient.
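A CG method of this family can be sketched as follows. This is a minimal illustration using the original Dai-Liao update with parameter t (not the paper's new conjugacy condition, which the abstract does not specify), with SciPy's Wolfe line search standing in for the strong Wolfe-Powell search; all function and parameter names here are illustrative.

```python
import numpy as np
from scipy.optimize import line_search

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Dai-Liao beta and a Wolfe line search.

    Illustrative sketch: beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, c2=0.4)[0]
        if alpha is None:              # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, c2=0.4)[0]
            if alpha is None:
                break
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        beta = 0.0 if abs(denom) < 1e-16 else (g_new @ y - t * (g_new @ s)) / denom
        d = -g_new + beta * d
        if d @ g_new >= 0:             # safeguard: keep a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function from the standard start (-1.2, 1)
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                              200 * (x[1] - x[0]**2)])
x_star = dai_liao_cg(rosen, rosen_g, [-1.2, 1.0])
```

The safeguard reset to the steepest-descent direction is one common way to preserve global convergence when the line search does not guarantee descent of the CG direction.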


Filomat ◽ 2021 ◽ Vol 35 (3) ◽ pp. 737-758 ◽ Author(s): Yue Hao, Shouqiang Du, Yuanyuan Chen

In this paper, we consider a method for solving finite minimax problems. By using the exponential penalty function to smooth the finite minimax problem, a new three-term nonlinear conjugate gradient method is proposed that generates a sufficient descent direction at each iteration. Under standard assumptions, the global convergence of the proposed three-term nonlinear conjugate gradient method with an Armijo-type line search is established. Numerical results illustrate that the proposed method can efficiently solve several kinds of optimization problems, including the finite minimax problem, the finite minimax problem with tensor structure, the constrained optimization problem, and the constrained optimization problem with tensor structure.
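The two ingredients above can be sketched as follows: the exponential penalty replaces max_i f_i(x) by the smooth function F_p(x) = (1/p) log Σ_i exp(p f_i(x)), which is then minimized by a three-term CG iteration with Armijo backtracking. The three-term update below is a generic sufficient-descent variant (it enforces d^T g = -||g||^2), not necessarily the paper's exact formula; all names are illustrative.

```python
import numpy as np

def exp_penalty(fs, grads, p):
    """Smooth max_i f_i(x) with the exponential penalty
    F_p(x) = (1/p) * log(sum_i exp(p * f_i(x))); returns (F_p, grad F_p).
    The max-shift makes the log-sum-exp numerically stable."""
    def F(x):
        v = np.array([f(x) for f in fs])
        m = v.max()
        return m + np.log(np.exp(p * (v - m)).sum()) / p
    def G(x):
        v = np.array([f(x) for f in fs])
        w = np.exp(p * (v - v.max()))
        w /= w.sum()                        # softmax weights
        return sum(wi * g(x) for wi, g in zip(w, grads))
    return F, G

def three_term_cg(F, G, x0, tol=1e-6, max_iter=1000):
    """Three-term CG with Armijo backtracking (generic sketch)."""
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    g = G(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo-type backtracking line search
        alpha, fx = 1.0, F(x)
        while F(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = G(x_new)
        y = g_new - g
        gg = g @ g
        beta = (g_new @ y) / gg
        theta = (g_new @ d) / gg
        d = -g_new + beta * d - theta * y   # satisfies d^T g_new = -||g_new||^2
        x, g = x_new, g_new
    return x

# usage: min_x max( x^2, (x - 2)^2 ), whose minimizer is x = 1
fs = [lambda x: float(x[0]**2), lambda x: float((x[0] - 2)**2)]
gs = [lambda x: np.array([2 * x[0]]), lambda x: np.array([2 * (x[0] - 2)])]
F, G = exp_penalty(fs, gs, p=50.0)
x_star = three_term_cg(F, G, [3.0])
```

Because the three-term direction is a guaranteed descent direction, the Armijo backtracking loop always terminates with a positive step size.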

