GLOBAL CONVERGENCE OF A SPECIAL CASE OF THE DAI–YUAN FAMILY WITHOUT LINE SEARCH

2008 ◽  
Vol 25 (03) ◽  
pp. 411-420 ◽  
Author(s):  
HUI ZHU ◽  
XIONGDA CHEN

Conjugate gradient methods are efficient for minimizing differentiable objective functions in large-dimensional spaces. Recently, Dai and Yuan introduced a three-parameter family of nonlinear conjugate gradient methods and proved their convergence. However, line search strategies usually bring a computational burden. To overcome this problem, in this paper we study the global convergence of a special case of the three-parameter family (the CD-DY family) in which the line search procedures are replaced by fixed stepsize formulae.
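
The abstract does not reproduce the stepsize formulae, so the Python sketch below only illustrates the general idea: the standard Dai-Yuan coefficient β_k = ‖g_{k+1}‖² / d_kᵀ(g_{k+1} − g_k) combined with a fixed stepsize rule α_k = −δ g_kᵀd_k / (L‖d_k‖²), where L is an assumed Lipschitz estimate for the gradient. The stepsize rule, parameter names, and defaults are illustrative assumptions, not the paper's exact formulae.

```python
import numpy as np

def dy_cg_fixed_step(grad, x0, L=1.0, delta=0.5, tol=1e-6, max_iter=5000):
    """Dai-Yuan conjugate gradient iteration without a line search.

    beta_k follows the standard Dai-Yuan formula; the stepsize
    alpha_k = -delta * g_k^T d_k / (L * ||d_k||^2) is only an
    illustrative fixed formula (L: assumed gradient Lipschitz bound),
    not necessarily the one analysed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                         # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -delta * (g @ d) / (L * (d @ d))   # fixed stepsize formula
        x_new = x + alpha * d
        g_new = grad(x_new)
        denom = d @ (g_new - g)                    # d_k^T y_k
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d                      # Dai-Yuan direction update
        x, g = x_new, g_new
    return x
```

Replacing the line search by a closed-form stepsize removes the repeated function and gradient evaluations a line search would require, which is the computational burden the paper seeks to avoid.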

2013 ◽  
Vol 2013 ◽  
pp. 1-5 ◽  
Author(s):  
Yuan-Yuan Chen ◽  
Shou-Qiang Du

The nonlinear conjugate gradient method is one of the useful methods for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with a Wolfe type line search for unconstrained optimization problems. Under some mild assumptions, the global convergence results of the given methods are established. The numerical results show that the nonlinear conjugate gradient methods with a Wolfe type line search are efficient for some unconstrained optimization problems.
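
The abstract does not name the three conjugate gradient formulas or the exact Wolfe type conditions; as one illustrative instance, the Python sketch below pairs the Fletcher-Reeves coefficient with a standard weak Wolfe line search implemented by bisection. The function names and parameters (c1, c2) are assumptions for illustration only.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=50):
    """Bisection search for a stepsize satisfying the weak Wolfe conditions:
    sufficient decrease  f(x + a d) <= f(x) + c1 * a * g^T d,
    curvature            grad(x + a d)^T d >= c2 * g^T d.
    """
    fx, gd = f(x), grad(x) @ d
    alpha, lo, hi = 1.0, 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:
            hi = alpha                               # sufficient decrease fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:
            lo = alpha                               # curvature condition fails: grow
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            break                                    # both Wolfe conditions hold
    return alpha

def fr_cg_wolfe(f, grad, x0, tol=1e-6, max_iter=500):
    """Fletcher-Reeves nonlinear CG with a Wolfe line search (illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)             # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```

For example, `fr_cg_wolfe(lambda x: x @ x, lambda x: 2 * x, np.ones(10))` drives the gradient of a simple quadratic below the tolerance.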

