A Modified Fletcher–Reeves Conjugate Gradient Method for Monotone Nonlinear Equations with Some Applications

Mathematics ◽  
2019 ◽  
Vol 7 (8) ◽  
pp. 745 ◽  
Author(s):  
Auwal Bala Abubakar ◽  
Poom Kumam ◽  
Hassan Mohammad ◽  
Aliyu Muhammed Awwal ◽  
Kanokwan Sitthithakerngkiet

One of the fastest growing and most efficient methods for solving the unconstrained minimization problem is the conjugate gradient (CG) method. Recently, considerable effort has been made to extend the CG method to solving monotone nonlinear equations. In this research article, we present a modification of the Fletcher–Reeves (FR) conjugate gradient projection method for constrained monotone nonlinear equations. The method possesses the sufficient descent property, and its global convergence is proved under appropriate assumptions. Two sets of numerical experiments were carried out to show the good performance of the proposed method compared with some existing ones. The first experiment solved monotone constrained nonlinear equations on some benchmark test problems, while the second applied the method to signal and image recovery problems arising from compressive sensing.
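The projection framework behind such methods can be sketched in a few lines. The following is a minimal illustration, not the authors' exact algorithm: it combines a plain FR coefficient with the classic hyperplane-projection step of Solodov and Svaiter; the line-search constants `sigma`, `rho`, the restart safeguard, and the test problem are all assumptions made for the sketch.

```python
import numpy as np

def fr_cg_projection(F, x0, proj, tol=1e-6, max_iter=1000,
                     sigma=1e-4, rho=0.5):
    """Derivative-free FR-type CG projection sketch for monotone F(x)=0
    over a convex set handled by the projector `proj`."""
    x = x0.copy()
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        # Backtracking line search: find t with -F(x + t d)^T d >= sigma t ||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= rho
            if t < 1e-12:
                break
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z, Fz
        # Hyperplane projection step, then projection onto the feasible set
        x_new = proj(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz)
        F_new = F(x_new)
        # Fletcher-Reeves coefficient built from residual norms
        beta = (F_new @ F_new) / (Fx @ Fx)
        d = -F_new + beta * d
        if F_new @ d > -1e-10 * (F_new @ F_new):
            d = -F_new          # safeguard: restart if descent is lost
        x, Fx = x_new, F_new
    return x, Fx
```

As a smoke test, `F_i(x) = e^{x_i} - 1` is monotone with root at the origin; with nonnegativity constraints, `proj` is a simple clip.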

2011 ◽  
Vol 58-60 ◽  
pp. 943-949
Author(s):  
Wan You Cheng ◽  
Xue Jie Liu

In this paper, on the basis of the recently developed HZ (Hager–Zhang) method [SIAM J. Optim., 16 (2005), pp. 170-192], we propose a hybrid descent conjugate gradient method that preserves the sufficient descent property of the HZ method. Under suitable conditions, we prove the global convergence of the proposed method. Extensive numerical experiments show that the method is promising on test problems from the CUTE library.
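For reference, the HZ coefficient that such hybrids build on can be computed as follows. This is a minimal sketch, not the paper's hybrid scheme: it uses the published HZ formula with its lower-bound truncation, but substitutes a simple Armijo backtracking line search (the HZ theory assumes a Wolfe-type search) and an ad hoc restart safeguard.

```python
import numpy as np

def hz_beta(g_new, g_old, d):
    """Hager-Zhang CG coefficient with the truncation from the 2005 paper."""
    y = g_new - g_old
    dy = d @ y
    if abs(dy) < 1e-12:
        return 0.0
    beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
    # Truncation beta >= eta_k guarantees global convergence
    eta = -1.0 / (np.linalg.norm(d) * min(0.01, np.linalg.norm(g_old)))
    return max(beta, eta)

def hz_cg(grad, f, x0, tol=1e-6, max_iter=500):
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking (stand-in for the Wolfe search used in theory)
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
            if t < 1e-14:
                break
        x_new = x + t * d
        g_new = grad(x_new)
        d = -g_new + hz_beta(g_new, g, d) * d
        if g_new @ d >= 0.0:    # safeguard: fall back to steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a small strictly convex quadratic this recovers the minimizer to gradient tolerance.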


2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence with an exact line search, this is not guaranteed with an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses the sufficient descent property regardless of the type of line search and is globally convergent under the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is checked on 75 standard test functions. Three-term conjugate gradient methods are known to be numerically more efficient than two-term ones; importantly, this paper quantifies how large that advantage is. Thus, in the numerical results, we compare our new modification with an efficient two-term conjugate gradient method. We also compare our modification with a state-of-the-art three-term HS method. Finally, we conclude that our proposed modification is globally convergent and numerically efficient.
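The mechanism that makes three-term HS directions descend for any line search is easy to exhibit. The sketch below follows the classical Zhang–Zhou–Li construction (not the specific modification of this paper): choosing `beta` and `theta` as below forces g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 identically, so the sufficient descent property holds no matter how the step length was chosen.

```python
import numpy as np

def three_term_hs_direction(g_new, g_old, d_old):
    """Three-term HS direction d+ = -g+ + beta*d - theta*y with
    beta = g+^T y / d^T y and theta = g+^T d / d^T y, so that
    g+^T d+ = -||g+||^2 regardless of the line search."""
    y = g_new - g_old
    dy = d_old @ y
    if abs(dy) < 1e-12:
        return -g_new           # degenerate curvature: restart
    beta = (g_new @ y) / dy
    theta = (g_new @ d_old) / dy
    return -g_new + beta * d_old - theta * y
```

The identity can be verified term by term: the `beta` and `theta` contributions to g_{k+1}^T d_{k+1} cancel exactly, leaving -||g_{k+1}||^2; the restart branch trivially satisfies the same equality.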


2014 ◽  
Vol 70 (2) ◽  
pp. 269-286
Author(s):  
Hao Liu ◽  
Haijun Wang ◽  
Xiaoyan Qian ◽  
Feng Rao

2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Hongbo Guan ◽  
Sheng Wang

In this paper, we propose a modified Polak–Ribière–Polyak (PRP) conjugate gradient method for solving large-scale nonlinear equations. Under weaker conditions, we show that the proposed method is globally convergent. We also carry out some numerical experiments to test the proposed method. The results show that the proposed method is efficient and stable.
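A derivative-free PRP-type iteration of this general flavor can be sketched as follows. This is an illustrative sketch under stated assumptions, not the authors' method: it uses the PRP+ coefficient built from residuals (no Jacobian), the standard hyperplane-projection globalization, and assumed line-search constants and safeguards; the large-scale test problem `F_i(x) = 2x_i - sin(x_i)` is a common benchmark, not one taken from the paper.

```python
import numpy as np

def prp_projection_solve(F, x0, tol=1e-6, max_iter=2000,
                         sigma=1e-4, rho=0.5):
    """PRP+-type derivative-free projection sketch for monotone F(x)=0."""
    x = x0.copy()
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= rho
            if t < 1e-12:
                break
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z
        # Hyperplane projection step
        x_new = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
        F_new = F(x_new)
        # PRP+ coefficient from residuals, truncated at zero
        beta = max((F_new @ (F_new - Fx)) / (Fx @ Fx), 0.0)
        d = -F_new + beta * d
        if F_new @ d > -1e-10 * (F_new @ F_new):
            d = -F_new          # safeguard: restart on loss of descent
        x, Fx = x_new, F_new
    return x
```

Only residual evaluations and vector operations appear, which is what makes iterations of this kind attractive for large-scale problems.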

