Enhanced Derivative-Free Conjugate Gradient Method for Solving Symmetric Nonlinear Equations

Author(s):  
Jamilu Sabi'u

In this article, an enhanced conjugate gradient approach for solving symmetric nonlinear equations is proposed without computing the Jacobian matrix. The approach is completely derivative-free and matrix-free. Under classical assumptions, the proposed method achieves global convergence with a nonmonotone line search. The reported numerical results show that the approach is promising.
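The matrix-free idea behind such methods can be illustrated briefly. For a mapping F with a symmetric Jacobian J, the gradient of the merit function f(x) = ½‖F(x)‖² is Jᵀ F = J F, and a forward difference along F approximates this product without ever forming J. The following sketch shows only that building block, not the paper's full algorithm; the helper name and the step size `delta` are illustrative choices.

```python
import numpy as np

def approx_grad(F, x, delta=1e-6):
    """Matrix-free gradient of f(x) = 0.5 * ||F(x)||^2 when the Jacobian
    J(x) of F is symmetric, so grad f = J(x)^T F(x) = J(x) F(x).

    A forward difference along the direction F(x) gives
        J(x) F(x) ~= (F(x + delta * F(x)) - F(x)) / delta
    using only evaluations of F (no Jacobian is formed or stored).
    """
    Fx = F(x)
    return (F(x + delta * Fx) - Fx) / delta
```

For a linear residual F(x) = A x − b with symmetric A, the difference quotient is exact up to rounding, which makes the approximation easy to check.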

2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence with an exact line search, this is not guaranteed with an inexact line search; in addition, the HS method does not usually satisfy the descent property. The modified three-term conjugate gradient method proposed here possesses the sufficient descent property regardless of the type of line search and is globally convergent under the inexact Wolfe–Powell line search. Its numerical efficiency is assessed on 75 standard test functions. Three-term conjugate gradient methods are generally more efficient numerically than two-term methods, and the numerical results here quantify that gap: the new modification is compared with an efficient two-term conjugate gradient method and with a state-of-the-art three-term HS method. The results confirm that the proposed modification is globally convergent and numerically efficient.
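For orientation, the classical two-term HS update that the paper modifies can be sketched on a convex quadratic, where an exact line search is available in closed form. This is a minimal illustration of the HS formula β = gₖ₊₁ᵀyₖ / (dₖᵀyₖ), not the paper's three-term method or its Wolfe–Powell search; the function name and tolerances are assumptions.

```python
import numpy as np

def hs_cg(A, b, x0, tol=1e-10, max_iter=50):
    """Hestenes-Stiefel CG on the quadratic f(x) = 0.5 x^T A x - b^T x
    (A symmetric positive definite), using the exact step length that
    quadratics admit. Sketch only: the paper's method is three-term and
    uses an inexact Wolfe-Powell line search instead.
    """
    x = x0.astype(float)
    g = A @ x - b                 # gradient of the quadratic
    d = -g                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                       # y_k = g_{k+1} - g_k
        beta = (g_new @ y) / (d @ y)        # Hestenes-Stiefel formula
        d = -g_new + beta * d               # two-term direction update
        g = g_new
    return x
```

On an n-dimensional quadratic this recovers linear CG and terminates in at most n steps, which makes the sketch easy to verify against the linear solve A x = b.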


2021 ◽  
Vol 24 (02) ◽  
pp. 147-164
Author(s):  
Abdulkarim Hassan Ibrahim ◽  
Kanikar Muangchoo ◽  
Nur Syarafina Mohamed ◽  
Auwal Bala Abubakar

2013 ◽  
Vol 2013 ◽  
pp. 1-5 ◽  
Author(s):  
Xiangfei Yang ◽  
Zhijun Luo ◽  
Xiaoyu Dai

The conjugate gradient method is one of the most effective algorithms for solving unconstrained optimization problems. In this paper, a modified conjugate gradient method is presented and analyzed that hybridizes the known LS and CD conjugate gradient algorithms. Under some mild conditions, a Wolfe-type line search guarantees the global convergence of the LS-CD method. The numerical results show that the algorithm is efficient.
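The two ingredients being hybridized have simple closed forms: the Liu–Storey (LS) parameter β = gₖ₊₁ᵀ(gₖ₊₁ − gₖ)/(−dₖᵀgₖ) and the Conjugate Descent (CD, Fletcher) parameter β = ‖gₖ₊₁‖²/(−dₖᵀgₖ). The sketch below shows both formulas and one common way of combining such parameters (clipping LS into [0, β_CD]); the actual hybridization used in the paper may differ, so the combining rule here is an assumption.

```python
import numpy as np

def beta_ls(g_new, g, d):
    """Liu-Storey parameter: g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k)."""
    return g_new @ (g_new - g) / (-(d @ g))

def beta_cd(g_new, g, d):
    """Conjugate Descent (Fletcher) parameter: ||g_{k+1}||^2 / (-d_k^T g_k)."""
    return (g_new @ g_new) / (-(d @ g))

def beta_hybrid(g_new, g, d):
    """One plausible LS-CD hybrid (an assumption, not the paper's exact
    rule): clip the LS value into [0, beta_CD], a standard way to blend
    two conjugate gradient parameters while keeping beta nonnegative."""
    return max(0.0, min(beta_ls(g_new, g, d), beta_cd(g_new, g, d)))
```

Note that for the usual descent direction dₖ = −gₖ at the first step, both denominators reduce to ‖gₖ‖², so the formulas are well defined whenever d is a descent direction.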

