The convergence properties of a new hybrid conjugate gradient parameter for unconstrained optimization models

2021, Vol. 1734, pp. 012012
Author(s): I M Sulaiman, M Mamat, M Y Waziri, U A Yakubu, M Malik
2020, Vol. 22 (03), pp. 204-215
Author(s): I. M. Sulaiman, M. Mamat, A. E. Owoyemi, P. L. Ghazali, M. Rivaie, ...

2015, Vol. 9, pp. 1845-1856
Author(s): Rabi’u Bashir Yunus, Mustafa Mamat, Abdelrahman Abashar, Mohd Rivaie, Zabidin Salleh, ...

2021, Vol. 6 (10), pp. 10742-10764
Author(s): Ibtisam A. Masmali, Zabidin Salleh, Ahmad Alhawarat, ...

<abstract> <p>The conjugate gradient (CG) method is a method for solving unconstrained optimization problems, with applications in medical science, industry, neural networks, and many other fields. In this paper, a new three-term CG method is proposed. The new CG formula is constructed from the DL and WYL CG formulas so that it is non-negative and inherits the properties of the HS formula. The new modification satisfies the convergence properties and the sufficient descent property. The numerical results show that the new modification is more efficient than the DL, WYL, and CG-Descent formulas. We use more than 200 functions from the CUTEst library to compare these methods in terms of number of iterations, function evaluations, gradient evaluations, and CPU time.</p> </abstract>
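The abstract does not reproduce the new three-term formula itself. Purely as a minimal sketch of the general nonlinear CG framework such methods build on, the following assumes the classical Hestenes-Stiefel (HS) parameter with a non-negativity truncation and a simple Armijo backtracking line search standing in for the strong Wolfe conditions; all function names here are hypothetical:

```python
import numpy as np

def hs_beta(g_new, g_old, d):
    # Hestenes-Stiefel parameter: beta = g_new^T y / (d^T y), with y = g_new - g_old
    y = g_new - g_old
    denom = float(d @ y)
    return 0.0 if abs(denom) < 1e-12 else float(g_new @ y) / denom

def cg_minimize(f, grad, x0, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (a stand-in for strong Wolfe)
        t, fx, slope = 1.0, f(x), float(g @ d)
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(hs_beta(g_new, g, d), 0.0)  # truncation keeps beta non-negative
        d = -g_new + beta * d
        if float(g_new @ d) >= 0.0:  # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On the convex quadratic f(x) = x1^2 + 2*x2^2, for example, `cg_minimize` drives the iterates to the minimizer at the origin.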


Author(s): Ibrahim Mohammed Sulaiman, Norsuhaily Abu Bakar, Mustafa Mamat, Basim A. Hassan, Maulana Malik, ...

The hybrid conjugate gradient (CG) method is among the efficient variants of the CG method for solving optimization problems, owing to its low memory requirements and nice convergence properties. In this paper, we present an efficient hybrid CG method for solving unconstrained optimization models and show that the method satisfies the sufficient descent condition. The global convergence proof of the proposed method is established under inexact line search. An application of the proposed method to the well-known statistical regression model describing the global outbreak of the novel COVID-19 is presented. The study parameterized the model using the weekly increase/decrease of recorded cases from December 30, 2019 to March 30, 2020. Preliminary numerical results on some unconstrained optimization problems show that the proposed method is efficient and promising. Furthermore, the proposed method produced a good regression equation for COVID-19 confirmed cases globally.
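The abstract does not specify the regression model's functional form. Purely as an illustration of the kind of smooth unconstrained problem a hybrid CG method would minimize in such a study, a sum-of-squares objective for a hypothetical quadratic trend and its gradient might look like this (all names and the model form are assumptions, not the paper's actual model):

```python
import numpy as np

def lsq_objective(theta, t, y):
    # Sum-of-squares error for a quadratic trend y ~ theta0 + theta1*t + theta2*t^2
    X = np.vander(t, N=3, increasing=True)  # design matrix columns: 1, t, t^2
    r = X @ theta - y
    return float(r @ r)

def lsq_gradient(theta, t, y):
    # Gradient of the sum-of-squares error: 2 X^T (X theta - y)
    X = np.vander(t, N=3, increasing=True)
    return 2.0 * X.T @ (X @ theta - y)
```

At the true coefficients the objective is zero and the gradient vanishes, so any descent method that drives the gradient to zero recovers the fitted regression equation.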


2018, Vol. 7 (2.14), pp. 21
Author(s): Omar Alshorman, Mustafa Mamat, Ahmad Alhawarat, Mohd Revaie

Conjugate gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several studies have recently been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new parameter for the CG method is proposed. The new parameter possesses global convergence properties under the strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust than the Polak-Ribière-Polyak (PRP), Fletcher-Reeves (FR), and Wei-Yao-Liu (WYL) parameters.
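For reference, the three baseline parameters the abstract compares against have standard closed forms; the sketch below follows the usual textbook definitions, with `g_old` the gradient at the previous iterate and `g_new` the gradient at the current one:

```python
import numpy as np

def beta_fr(g_new, g_old):
    # Fletcher-Reeves: ratio of squared gradient norms
    return float(g_new @ g_new) / float(g_old @ g_old)

def beta_prp(g_new, g_old):
    # Polak-Ribiere-Polyak: uses the gradient difference y = g_new - g_old
    return float(g_new @ (g_new - g_old)) / float(g_old @ g_old)

def beta_wyl(g_new, g_old):
    # Wei-Yao-Liu: rescales g_old by the norm ratio, keeping beta non-negative
    scale = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return float(g_new @ (g_new - scale * g_old)) / float(g_old @ g_old)
```

Note that PRP can go negative when consecutive gradients are nearly parallel but shrinking, whereas the WYL rescaling removes exactly that component, which is why WYL is non-negative by construction.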


Author(s): Awad Abdelrahman, Osman Yousif, Mogtaba Mhammed, Murtada Elbashir

Nonlinear conjugate gradient (CG) methods are significant for solving large-scale unconstrained optimization problems, providing a vital tool for locating minimum points and optimizing objective functions. Many modifications of nonlinear CG methods have been proposed to improve numerical performance and to establish global convergence properties. One such study is the modified CG method proposed by Rivaie et al. (2015). In this paper, we modify their work so as to obtain efficient numerical performance together with global convergence properties. Because the strong Wolfe line search is widely used in practice, our proposed modified method is implemented with it. A numerical experiment is performed to show the performance of the modified method in practice.

Keywords: unconstrained optimization; conjugate gradient method; sufficient descent property; global convergence
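The Rivaie et al. parameter being modified here is commonly stated in the literature as beta = g_{k+1}^T (g_{k+1} - g_k) / ||d_k||^2; a direct sketch of that baseline formula follows (the paper's actual modification is not reproduced in the abstract, so only the starting point is shown):

```python
import numpy as np

def beta_rmil(g_new, g_old, d):
    # RMIL-type parameter: beta = g_new^T (g_new - g_old) / ||d||^2,
    # i.e. the PRP numerator with the previous direction's squared norm
    # in the denominator instead of ||g_old||^2
    return float(g_new @ (g_new - g_old)) / float(d @ d)
```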

