A spectral PRP conjugate gradient method for nonconvex optimization problems based on modified line search

2011 ◽  
Vol 16 (4) ◽  
pp. 1157-1169 ◽  
Author(s):  
Zhong Wan ◽  
Chaoming Hu ◽  
Zhanlu Yang


2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence using an exact line search, this is not guaranteed in the case of an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence under the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is checked using 75 standard test functions. It is known that three-term conjugate gradient methods are numerically more efficient than two-term conjugate gradient methods; importantly, this paper quantifies how much more efficient the three-term method is. Thus, in the numerical results, we compare our new modification both with an efficient two-term conjugate gradient method and with a state-of-the-art three-term HS method. Finally, we conclude that our proposed modification is globally convergent and numerically efficient.
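For readers unfamiliar with the three-term construction, the sketch below shows one well-known way to build a three-term HS-type direction (the Zhang–Zhou–Li variant), which satisfies the sufficient descent property for any line search. It is illustrative only and is not the specific modification proposed in the paper; the function name `three_term_hs_direction` is our own.

```python
import numpy as np

def three_term_hs_direction(g_new, g_old, d_old):
    """One three-term HS-type direction update (Zhang-Zhou-Li variant).

    Illustrative sketch only -- not the modification proposed in the paper.
    By construction g_new^T d_new = -||g_new||^2, so sufficient descent
    holds independently of the line search used.
    """
    y = g_new - g_old                # gradient difference y_{k-1}
    denom = d_old @ y
    if abs(denom) < 1e-12:           # safeguard: fall back to steepest descent
        return -g_new
    beta = (g_new @ y) / denom       # classical HS coefficient
    theta = (g_new @ d_old) / denom  # coefficient of the third term
    return -g_new + beta * d_old - theta * y
```

Expanding g_new^T d_new shows the two extra terms cancel, leaving exactly -||g_new||^2, which is why descent does not depend on the accuracy of the line search.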


2014 ◽  
Vol 2014 ◽  
pp. 1-14
Author(s):  
San-Yang Liu ◽  
Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition $g_k^T d_k \le -\left(1 - \frac{1}{4\theta_k}\right)\|g_k\|^2$ with $\theta_k > 1/4$, and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
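As a concrete reading of that descent condition, the following minimal helper checks whether a candidate direction satisfies it for a given $\theta_k$. The function `satisfies_descent` is hypothetical and independent of the particular family of directions studied in the paper.

```python
import numpy as np

def satisfies_descent(g, d, theta):
    """Check g^T d <= -(1 - 1/(4*theta)) * ||g||^2 for theta > 1/4.

    Illustrative helper only; apply it to any candidate direction d.
    """
    assert theta > 0.25, "the condition is stated for theta > 1/4"
    return g @ d <= -(1.0 - 1.0 / (4.0 * theta)) * (g @ g)
```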


2013 ◽  
Vol 2013 ◽  
pp. 1-5 ◽  
Author(s):  
Yuan-Yuan Chen ◽  
Shou-Qiang Du

The nonlinear conjugate gradient method is one of the most useful methods for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with Wolfe-type line searches for unconstrained optimization problems. Under some mild assumptions, global convergence results for the given methods are established. The numerical results show that the nonlinear conjugate gradient methods with Wolfe-type line searches are efficient for some unconstrained optimization problems.
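The weak Wolfe conditions referred to above can be enforced with a simple bisection scheme. The sketch below is a generic weak Wolfe line search, not the specific "Wolfe type" search analyzed in the paper; the function name and default constants are our own assumptions.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection-style weak Wolfe line search (illustrative sketch).

    Returns a step size t satisfying
        f(x + t d) <= f(x) + c1 * t * grad(x)^T d   (sufficient decrease)
        grad(x + t d)^T d >= c2 * grad(x)^T d       (curvature)
    assuming d is a descent direction at x.
    """
    t, lo, hi = 1.0, 0.0, np.inf
    fx, gxd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * gxd:     # sufficient decrease fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * gxd:     # curvature fails: step too short
            lo = t
            t = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return t
    return t
```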


2012 ◽  
Vol 2012 ◽  
pp. 1-14 ◽  
Author(s):  
Kin Wei Ng ◽  
Ahmad Rohanin

We present the numerical solutions for the PDE-constrained optimization problem arising in cardiac electrophysiology, namely, the optimal control problem of the monodomain model. This is a nonlinear optimization problem constrained by the monodomain model, which consists of a parabolic partial differential equation coupled to a system of nonlinear ordinary differential equations and has been widely used for simulating cardiac electrical activity. Our control objective is to dampen the excitation wavefront using an optimally applied extracellular current. Two hybrid conjugate gradient methods are employed for computing the optimal applied extracellular current, namely, the Hestenes-Stiefel-Dai-Yuan (HS-DY) method and the Liu-Storey-Conjugate-Descent (LS-CD) method. Our experimental results show that the excitation wavefronts are successfully dampened out when these methods are used, and that the hybrid conjugate gradient methods are superior to the classical conjugate gradient methods when the Armijo line search is used.
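The hybrid coefficients themselves are inexpensive to compute. The sketch below uses the commonly cited hybridization beta = max(0, min(., .)) of the classical HS/DY and LS/CD formulas; it is an assumption about the form used in the paper, makes no attempt to reproduce the PDE-constrained gradient computation, and the function name `hybrid_betas` is our own.

```python
import numpy as np

def hybrid_betas(g_new, g_old, d_old):
    """Hybrid conjugate gradient coefficients (illustrative sketch).

    HS-DY: beta = max(0, min(beta_HS, beta_DY))
    LS-CD: beta = max(0, min(beta_LS, beta_CD))
    No safeguards against zero denominators are included in this sketch.
    """
    y = g_new - g_old
    dy = d_old @ y                    # denominator for HS and DY
    gd = -(g_old @ d_old)             # denominator for LS and CD
    beta_hs = (g_new @ y) / dy
    beta_dy = (g_new @ g_new) / dy
    beta_ls = (g_new @ y) / gd
    beta_cd = (g_new @ g_new) / gd
    beta_hsdy = max(0.0, min(beta_hs, beta_dy))
    beta_lscd = max(0.0, min(beta_ls, beta_cd))
    return beta_hsdy, beta_lscd
```

In a full solver, the gradient fed to these formulas would come from the adjoint of the monodomain system; only the coefficient update is shown here.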

