descent condition
Recently Published Documents


TOTAL DOCUMENTS: 37 (FIVE YEARS: 5)

H-INDEX: 8 (FIVE YEARS: 0)

2021 ◽  
Vol 11 (1) ◽  
pp. 1-9
Author(s):  
Ahmed Anwer Mustafa ◽  
Salah Gazi Shareef

In this paper, a new formula for 𝛽𝑘 is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on three terms and a cubic step size. Our newly proposed CG method satisfies the descent condition, sufficient descent condition, conjugacy condition, and global convergence properties. Numerical comparisons with two standard conjugate gradient algorithms show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.
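The descent condition mentioned in this abstract requires that each search direction satisfy 𝑔ₖᵀ𝑑ₖ < 0. The paper's specific 𝛽𝑘 formula and cubic step size are not given here, so the sketch below uses a standard PRP+ coefficient and Armijo backtracking as assumptions, with an explicit descent-condition check and restart:

```python
import numpy as np

# Minimal nonlinear CG sketch with an explicit descent-condition check.
# The paper's beta_k and cubic step size are NOT reproduced; a standard
# PRP+ beta and Armijo backtracking line search are assumed instead.

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = 1.0              # Armijo backtracking line search
        for _ in range(50):
            if f(x + alpha * d) <= f(x) + 1e-4 * alpha * (g @ d):
                break
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)  # PRP+ coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:       # descent condition violated -> restart
            d = -g_new
        g = g_new
    return x
```

For example, minimizing f(x) = ‖x − 1‖² from the zero vector converges to the vector of ones.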



2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Ahmad Alhawarat ◽  
Ghaliah Alhamzi ◽  
Ibitsam Masmali ◽  
Zabidin Salleh

The conjugate gradient method is a useful method for solving large-scale unconstrained optimisation problems and is used in applications in several fields such as engineering, medical science, image restoration, neural networks, and many others. The main benefit of the conjugate gradient method is that, unlike Newton's method or its approximations, it does not use the second derivative or an approximation of it. Moreover, the algorithm of the conjugate gradient method is simple and easy to apply. This study proposes a new modified conjugate gradient method that contains four terms, based on popular two- and three-term conjugate gradient methods. The new algorithm satisfies the descent condition and possesses the convergence property. In the numerical results part, we compare the new algorithm with famous methods such as CG-Descent. We conclude from the numerical results that the new algorithm is more efficient than other popular CG methods such as CG-Descent 6.8 in terms of the number of function evaluations, number of gradient evaluations, number of iterations, and CPU time.



2021 ◽  
Vol 5 (1) ◽  
pp. 47
Author(s):  
Sindy Devila ◽  
Maulana Malik ◽  
Wed Giyarti

In this paper, we propose a new hybrid coefficient for the conjugate gradient (CG) method for solving unconstrained optimization models. The new coefficient is a combination of parts of the MMSIS (Malik et al., 2020) and PRP (Polak, Ribière & Polyak, 1969) coefficients. Under exact line search, the search direction of the new method satisfies the sufficient descent condition, and under certain assumptions we establish its global convergence properties. Using some test functions, numerical results show that the proposed method is more efficient than the MMSIS method. Besides, the new method can be used to solve the problem of minimizing portfolio selection risk.
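The sufficient descent condition under exact line search, 𝑔ₖᵀ𝑑ₖ ≤ −c‖𝑔ₖ‖², can be checked numerically. The sketch below is not the hybrid MMSIS/PRP coefficient of the paper; it assumes plain PRP on a strictly convex quadratic f(x) = ½ xᵀAx, where the exact step length has a closed form, so 𝑔ₖ₊₁ᵀ𝑑ₖ = 0 and the ratio 𝑔ₖᵀ𝑑ₖ / (−‖𝑔ₖ‖²) equals 1:

```python
import numpy as np

# Numerical check of sufficient descent under exact line search for a
# PRP-type direction, on a strictly convex quadratic f(x) = 0.5 x^T A x.
# This is a generic sketch; the paper's hybrid coefficient is not used.

def prp_exact_descent_ratios(A, x0, iters=4):
    x, g = x0.copy(), A @ x0
    d = -g
    ratios = []                  # each entry should be ~1: g^T d = -||g||^2
    for _ in range(iters):
        alpha = -(g @ d) / (d @ (A @ d))        # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x
        beta = (g_new @ (g_new - g)) / (g @ g)  # PRP coefficient
        d = -g_new + beta * d
        g = g_new
        ratios.append((g @ d) / -(g @ g))
    return ratios
```

Running this on a random symmetric positive definite A yields ratios equal to 1 up to floating-point error, confirming sufficient descent with c ≈ 1.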



2021 ◽  
Vol 26 (2) ◽  
pp. 32
Author(s):  
Stefan Banholzer ◽  
Bennet Gebken ◽  
Lena Reichle ◽  
Stefan Volkwein

The goal in multiobjective optimization is to determine the so-called Pareto set. Our optimization problem is governed by a parameter-dependent semi-linear elliptic partial differential equation (PDE). To solve it, we use a gradient-based set-oriented numerical method. The numerical solution of the PDE by standard discretization methods usually leads to high computational effort. To overcome this difficulty, reduced-order modeling (ROM) is developed utilizing the reduced basis method. These model simplifications cause inexactness in the gradients. For that reason, an additional descent condition is proposed. Applying a modified subdivision algorithm, numerical experiments illustrate the efficiency of our solution approach.



2020 ◽  
Vol 9 (2) ◽  
pp. 101-105
Author(s):  
Hussein Ageel Khatab ◽  
Salah Gazi Shareef

In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new method consists of three parts, the first of which is the Hestenes-Stiefel (HS) parameter. The proposed method satisfies the descent condition, sufficient descent condition, and conjugacy condition. We give some numerical results to show the efficiency of the suggested method.



Author(s):  
Basim Abbas Hassan ◽  
Ahmed Obeid Owaid ◽  
Zena T. Yasen

<p><span>In some studies, the conjugate parameter plays an important role in conjugate gradient methods. In this paper, a hybrid variant of the search direction is provided, based on a convex combination. This search direction ensures that the descent condition holds. The global convergence of the hybrid variant is also obtained. Our strong evidence is a numerical analysis showing that the proposed hybrid variant is more efficient than the Hestenes-Stiefel method. </span></p>



2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Eman T. Hamed ◽  
Rana Z. Al-Kawaz ◽  
Abbas Y. Al-Bayati

This article considers modified formulas for the standard conjugate gradient (CG) technique proposed by Li and Fukushima. A new scalar parameter θ_k^New for this CG technique for unconstrained optimization is proposed. The descent condition and global convergence property are established under the strong Wolfe conditions. Our numerical experiments show that the new proposed algorithms are more stable and economical compared to some well-known standard CG methods.



2019 ◽  
Vol 8 (4) ◽  
pp. 11464-11467

The spectral conjugate gradient method has been used in most cases as an alternative to the conjugate gradient (CG) method for solving nonlinear unconstrained problems. In this paper, we introduce a spectral parameter for the HS conjugate gradient method derived from the classical CG search direction, and use some of the standard test functions with numerous variables to prove its sufficient descent and global convergence properties; the numerical outcome is verified by exact line search procedures.
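A spectral CG method scales the gradient term in the search direction, 𝑑ₖ = −θₖ𝑔ₖ + 𝛽ₖ𝑑ₖ₋₁. The paper's spectral parameter is not given here; the sketch below assumes a Barzilai-Borwein-like θ together with the HS coefficient, plus a safeguard that falls back to steepest descent when the descent condition fails:

```python
import numpy as np

# Sketch of a spectral HS direction d_k = -theta_k * g_k + beta_k * d_{k-1}.
# The paper's spectral parameter is NOT reproduced; theta below is a
# Barzilai-Borwein-like choice, assumed for illustration only.

def spectral_hs_direction(g_new, g_old, d_old, s):
    y = g_new - g_old                     # gradient difference y_{k-1}
    beta = (g_new @ y) / (d_old @ y)      # Hestenes-Stiefel coefficient
    theta = (s @ s) / (s @ y)             # spectral scaling (assumption)
    d = -theta * g_new + beta * d_old
    if g_new @ d >= 0:                    # enforce the descent condition
        d = -g_new                        # fall back to steepest descent
    return d
```

The safeguard guarantees 𝑔ₖᵀ𝑑ₖ < 0 for any nonzero gradient, since the fallback gives 𝑔ₖᵀ𝑑ₖ = −‖𝑔ₖ‖².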


