A new convergent conjugate gradient method under the exact line search

2015 ◽  
Author(s):  
Osman Omer ◽  
Mustafa Mamat ◽  
Mohd Rivaie

Author(s):  
Nur Syarafina Mohamed ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Shazlyn Milleana Shaharudin

One of the popular approaches to modifying the conjugate gradient (CG) method is hybridization. In this paper, a new hybrid CG method is introduced and its performance is compared to that of the classical CG methods of Rivaie-Mustafa-Ismail-Leong (RMIL) and Syarafina-Mustafa-Rivaie (SMR). The proposed hybrid CG coefficient is evaluated as a convex combination of the RMIL and SMR coefficients, and the methods are analyzed under the exact line search. The comparison shows that the hybrid CG method is promising and outperforms the classical RMIL and SMR methods in terms of the number of iterations and CPU time.
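For concreteness, the sketch below illustrates the general shape of such a hybrid CG iteration under a (numerically approximated) exact line search. It is a minimal illustration, not the authors' algorithm: `beta_rmil` assumes the standard RMIL formula, while `beta_smr` and the fixed mixing parameter `theta` are placeholders, since the SMR coefficient and the convex-combination rule are defined in the paper itself.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
    """Generic hybrid CG sketch: beta is a convex combination of two coefficients."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # "Exact" line search: minimize f along d numerically.
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_rmil = g_new @ (g_new - g) / (d @ d)          # assumed RMIL formula
        beta_smr = (g_new @ g_new) / (d @ d)               # placeholder, not the published SMR formula
        beta = (1 - theta) * beta_rmil + theta * beta_smr  # convex combination
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize a simple quadratic, f(x) = ||x||^2.
x_star = hybrid_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(10))
```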


2021 ◽  
Vol 5 (1) ◽  
pp. 47
Author(s):  
Sindy Devila ◽  
Maulana Malik ◽  
Wed Giyarti

In this paper, we propose a new hybrid coefficient for the conjugate gradient (CG) method for solving unconstrained optimization models. The new coefficient combines parts of the MMSIS (Malik et al., 2020) and PRP (Polak, Ribière & Polyak, 1969) coefficients. Under the exact line search, the search direction of the new method satisfies the sufficient descent condition, and under certain assumptions we establish its global convergence. Numerical results on a set of test functions show that the proposed method is more efficient than the MMSIS method. In addition, the new method can be used to solve the problem of minimizing portfolio selection risk.
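For reference, the classical PRP coefficient and one common convex-combination hybridization pattern are shown below; the MMSIS coefficient is left symbolic (it is defined in Malik et al., 2020), and the actual combination rule used here is the one given in the paper.

```latex
% PRP coefficient and a generic convex-combination hybrid;
% \beta_k^{\mathrm{MMSIS}} is left symbolic (see Malik et al., 2020).
\[
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\lVert g_{k-1}\rVert^{2}},
\qquad
\beta_k^{\mathrm{hyb}} = \theta_k\,\beta_k^{\mathrm{PRP}} + (1-\theta_k)\,\beta_k^{\mathrm{MMSIS}},
\quad \theta_k \in [0,1].
\]
```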


2011 ◽  
Vol 18 (9) ◽  
pp. 1249-1253 ◽  
Author(s):  
Mehdi Dehghan ◽  
Masoud Hajarian

The conjugate gradient method is one of the earliest and most useful techniques for solving large-scale nonlinear optimization problems. Many variants of this method have been proposed, and some are widely used in practice. In this article, we study the descent Dai–Yuan conjugate gradient method, which guarantees the sufficient descent condition for any line search. Under the exact line search, the introduced conjugate gradient method reduces to the Dai–Yuan conjugate gradient method. Finally, a global convergence result is established when the line search fulfils the Goldstein conditions.
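For orientation, the classical Dai–Yuan coefficient and the sufficient descent condition referred to in the abstract are recalled below; the specific descent modification studied in the article is not reproduced here.

```latex
% Classical Dai-Yuan coefficient and the sufficient descent condition
% (for some constant c > 0, independent of k).
\[
\beta_k^{\mathrm{DY}} = \frac{\lVert g_k\rVert^{2}}{d_{k-1}^{\top}(g_k - g_{k-1})},
\qquad
d_k = -g_k + \beta_k^{\mathrm{DY}}\, d_{k-1},
\qquad
g_k^{\top} d_k \le -c\,\lVert g_k\rVert^{2}.
\]
```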


2018 ◽  
Vol 7 (3.28) ◽  
pp. 92
Author(s):  
Talat Alkouli ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, together with an implementation based on the exact line search, is proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results for a set of unconstrained optimization test problems, some of them from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms similar methods in many situations.
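As a small illustration of the exact line search used here (a generic fact, not the authors' code): for a strictly convex quadratic the exact step length along a search direction has a closed form, sketched below; for general nonlinear objectives the step is instead obtained by a one-dimensional minimization.

```python
import numpy as np

def exact_step_quadratic(A, b, x, d):
    """Exact line-search step for f(x) = 0.5 x^T A x - b^T x along direction d:
    alpha = -(g^T d) / (d^T A d), where g = A x - b is the gradient at x."""
    g = A @ x - b
    return -(g @ d) / (d @ A @ d)

# Example: one exact steepest-descent step on a 2-D quadratic.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = np.zeros(2)
d = -(A @ x - b)                      # steepest-descent direction
alpha = exact_step_quadratic(A, b, x, d)
x_next = x + alpha * d
```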


2015 ◽  
Vol 9 ◽  
pp. 4799-4812 ◽  
Author(s):  
Syazni Shoid ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Zabidin Salleh

Author(s):  
Nur Syarafina Mohamed ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Shazlyn Milleana Shaharudin

Hybridization is one of the popular approaches to modifying the conjugate gradient method. In this paper, a new hybrid conjugate gradient method is suggested and analyzed, in which the CG coefficient is evaluated as a convex combination of two classical CG coefficients while using the exact line search. The proposed method is shown to possess both the sufficient descent and global convergence properties. Numerical performance shows that the proposed method is promising and outperforms other hybrid conjugate gradient methods in terms of the number of iterations and CPU time.
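Assuming, from the closely related abstract earlier in this listing, that the combined coefficients are those of RMIL and SMR, the hybridization takes the convex-combination form below; the SMR coefficient and the rule for choosing the mixing parameter are left symbolic, as they are defined in the paper.

```latex
% Convex-combination hybrid coefficient; \beta_k^{\mathrm{SMR}} and \theta_k
% are left symbolic, and the RMIL formula shown is the standard published one.
\[
\beta_k^{\mathrm{hyb}} = (1-\theta_k)\,\beta_k^{\mathrm{RMIL}} + \theta_k\,\beta_k^{\mathrm{SMR}},
\qquad
\beta_k^{\mathrm{RMIL}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\lVert d_{k-1}\rVert^{2}},
\qquad \theta_k \in [0,1].
\]
```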


2019 ◽  
Vol 8 (4) ◽  
pp. 11464-11467

The spectral conjugate gradient method is often used as an alternative to the conjugate gradient (CG) method for solving nonlinear unconstrained optimization problems. In this paper, we introduce a spectral parameter for the Hestenes–Stiefel (HS) conjugate gradient method derived from the classical CG search direction and establish its sufficient descent and global convergence properties. The numerical results, obtained on standard test functions with numerous variables, are verified under exact line search procedures.
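For reference, the classical HS coefficient and the generic spectral search direction are recalled below; the specific spectral parameter proposed in the paper is not reproduced and is left symbolic.

```latex
% Classical Hestenes-Stiefel coefficient and the generic spectral CG direction;
% the spectral parameter \theta_k is left symbolic.
\[
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\qquad y_{k-1} = g_k - g_{k-1},
\qquad
d_k = -\theta_k\, g_k + \beta_k^{\mathrm{HS}}\, d_{k-1}.
\]
```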

