Unconstrained Optimisation
Recently Published Documents

TOTAL DOCUMENTS: 33 (five years: 7)
H-INDEX: 6 (five years: 0)

2022 ◽  
Author(s):  
J J McKeown ◽  
D Meegan ◽  
D Sprevak

2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Ahmad Alhawarat ◽  
Ghaliah Alhamzi ◽  
Ibitsam Masmali ◽  
Zabidin Salleh

The conjugate gradient (CG) method is a useful method for solving large-scale unconstrained optimisation problems and is applied in several fields, such as engineering, medical science, image restoration, and neural networks. Its main benefit is that, unlike Newton's method and its approximations, it requires neither the second derivative nor an approximation of it. Moreover, the CG algorithm is simple and easy to implement. This study proposes a new modified conjugate gradient method with four terms, built from popular two- and three-term CG methods. The new algorithm satisfies the descent condition and possesses the convergence property. In the numerical results section, we compare the new algorithm with well-known methods such as CG-Descent. The numerical results show that the new algorithm is more efficient than other popular CG methods, such as CG-Descent 6.8, in terms of the number of function evaluations, number of gradient evaluations, number of iterations, and CPU time.
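The abstract does not specify the four-term formula itself, so as a general illustration of the class of methods it builds on, the following is a minimal sketch of a nonlinear conjugate gradient solver using the standard Polak–Ribière+ update (clipped at zero, which acts as an automatic restart) and an Armijo backtracking line search. All names here are illustrative, not taken from the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG sketch: Polak-Ribiere+ beta with Armijo backtracking.

    Note how only first derivatives (grad) are needed -- no Hessian or
    Hessian approximation, which is the main benefit cited in the abstract.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient; max(0, .) restarts when beta < 0
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimise a simple shifted quadratic f(x) = sum((x - 3)^2)
x_star = nonlinear_cg(lambda x: ((x - 3) ** 2).sum(),
                      lambda x: 2 * (x - 3),
                      np.array([0.0, 0.0]))
```

Two-term methods keep only the `-g_new + beta * d` structure above; three- and four-term variants, such as the one proposed in the paper, add further correction terms to this update to strengthen the descent and convergence properties.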


Author(s):  
Christina D. Nikolakakou ◽  
Athanasia N. Papanikolaou ◽  
Eirini I. Nikolopoulou ◽  
Theodoula N. Grapsa ◽  
George S. Androulakis

Author(s):  
Christina D. Nikolakakou ◽  
George S. Androulakis ◽  
Theodoula N. Grapsa ◽  
Eirini I. Nikolopoulou ◽  
Athanasia N. Papanikolaou
