Global convergence property
Recently Published Documents

TOTAL DOCUMENTS: 30 (last five years: 15)
H-INDEX: 4 (last five years: 1)

2022 ◽  
Vol 2022 (1) ◽  
Author(s):  
Zabidin Salleh ◽  
Adel Almarashi ◽  
Ahmad Alhawarat

Abstract: The conjugate gradient method can be applied in many fields, such as neural networks, image restoration, machine learning, and deep learning, among others. The Polak–Ribière–Polyak (PRP) and Hestenes–Stiefel (HS) conjugate gradient methods are considered among the most efficient methods for solving nonlinear optimization problems. However, neither method satisfies the descent property or the global convergence property for general nonlinear functions. In this paper, we present two new modifications of the PRP method with restart conditions. The proposed conjugate gradient methods satisfy the global convergence property and the descent property for general nonlinear functions. The numerical results show that the new modifications are more efficient than recent CG methods in terms of the number of iterations, the number of function evaluations, the number of gradient evaluations, and CPU time.
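For context, the classical (unmodified) PRP and HS parameters referred to above are usually written as follows, with $g_k = \nabla f(x_k)$, $y_{k-1} = g_k - g_{k-1}$, and search direction $d_k$; the restart-based modifications proposed in the paper are not reproduced here.

```latex
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top} y_{k-1}}{\lVert g_{k-1} \rVert^{2}}, \qquad
\beta_k^{\mathrm{HS}}  = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \qquad
d_k = -g_k + \beta_k d_{k-1}, \quad d_0 = -g_0 .
```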


Author(s):  
Ghada M. Al-Naemi ◽  
Ahmed H. Sheekoo

A new scaled conjugate gradient (SCG) method is proposed in this paper. The SCG technique is an important special generalization of the conjugate gradient (CG) method and an efficient numerical method for solving large-scale nonlinear unconstrained optimization problems. The new SCG method is proposed with a strong Wolfe condition (SWC) line search. The descent property of the proposed technique, as well as its global convergence property, is satisfied without the use of any additional line searches, under some suitable assumptions. The efficiency and feasibility of the proposed technique are supported by numerical experiments comparing it with traditional CG techniques.
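For reference, the strong Wolfe condition (SWC) line search mentioned above amounts to the standard pair of inequalities checked in the sketch below; the function names and the constants `c1`, `c2` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions for a trial step `alpha` along a
    descent direction `d` at the point `x`:

        sufficient decrease:  f(x + a d) <= f(x) + c1 * a * g(x)^T d
        curvature:            |g(x + a d)^T d| <= c2 * |g(x)^T d|
    """
    gtd = grad(x) @ d            # directional derivative at x (negative for a descent direction)
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * gtd
    curvature = abs(grad(x_new) @ d) <= c2 * abs(gtd)
    return sufficient_decrease and curvature

# Example: quadratic f(x) = 0.5 ||x||^2 with the steepest-descent direction.
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x0 = np.array([1.0, -2.0])
print(satisfies_strong_wolfe(f, grad, x0, -grad(x0), alpha=1.0))  # True
```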


Author(s):  
Rana Z. Al-Kawaz ◽  
Abbas Y. Al-Bayati

In this article, we give a new modification of the Dai–Liao method for solving monotone nonlinear problems. Our modification relies on two important procedures: the projection method and the damping of the quasi-Newton condition. The new derivation yields two new parameters for the conjugate gradient direction, for which, under some conditions, we have demonstrated the sufficient descent property. Under some necessary conditions, the new approach achieves the global convergence property. Numerical results show how efficient the new approach is when compared with similar basic classical methods.
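For orientation, the classical Dai–Liao parameter and a common Powell-type damping of $y_{k-1}$ (a standard form, not necessarily the authors' exact construction) look as follows, with $s_{k-1} = x_k - x_{k-1}$, $y_{k-1} = g_k - g_{k-1}$, damping factor $\theta_{k-1} \in (0, 1]$, and a quasi-Newton matrix $B_{k-1}$. The sufficient descent property referred to above means $g_k^{\top} d_k \le -c \lVert g_k \rVert^{2}$ for some $c > 0$.

```latex
\beta_k^{\mathrm{DL}} = \frac{g_k^{\top}\left( y_{k-1} - t\, s_{k-1} \right)}{d_{k-1}^{\top} y_{k-1}}, \quad t > 0, \qquad
\bar{y}_{k-1} = \theta_{k-1}\, y_{k-1} + \left( 1 - \theta_{k-1} \right) B_{k-1} s_{k-1} .
```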


2020 ◽  
Vol 2020 ◽  
pp. 1-10
Author(s):  
Zhenhua Su ◽  
Min Li

In this paper, a descent Liu–Storey conjugate gradient method is extended to solve large-scale nonlinear systems of equations. Under certain assumptions, the global convergence property is obtained with a nonmonotone line search. The proposed method is suitable for large-scale problems owing to its low storage requirement. Numerical experiments show that the new method is practically effective.
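As a sketch of the nonmonotone idea mentioned above (a Grippo–Lampariello–Lucidi style rule under assumed names and constants, not the exact line search of the paper), a trial step is accepted against the maximum of the last few objective values rather than the current one.

```python
def nonmonotone_armijo(f, x, d, g, f_hist, delta=1e-4, rho=0.5, max_backtracks=30):
    """Backtracking search with a nonmonotone Armijo-type condition:

        f(x + a d) <= max(f_hist) + delta * a * g^T d,

    where f_hist holds the most recent objective values (the memory window).
    Returns the accepted step length, or None if backtracking fails.
    """
    f_ref = max(f_hist)          # reference value over the memory window
    gtd = g @ d                  # directional derivative; d is assumed to be a descent direction
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * gtd:
            return alpha
        alpha *= rho             # shrink the step and try again
    return None
```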


2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Yingjie Zhou ◽  
Yulun Wu ◽  
Xiangrong Li

A new hybrid PRPFR conjugate gradient method is presented in this paper, designed so that it possesses the sufficient descent property and the trust-region property. The method can be considered a convex combination of the PRP method and the FR method combined with the hyperplane projection technique. With an accelerated step length, the global convergence property is obtained under some appropriate assumptions. Numerical experiments show that, compared with other methods, the PRPFR method is more competitive for solving nonlinear equations and image restoration problems.
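Two of the ingredients named above can be written generically (this shows the general pattern only; the specific combination parameter $\theta_k$ used for PRPFR is not reproduced): a convex combination of the FR and PRP parameters, and the hyperplane projection step applied once a trial point $z_k = x_k + \alpha_k d_k$ with $F(z_k)^{\top}(x_k - z_k) > 0$ has been found.

```latex
\beta_k = \theta_k\, \beta_k^{\mathrm{FR}} + \left( 1 - \theta_k \right) \beta_k^{\mathrm{PRP}}, \quad \theta_k \in [0, 1], \qquad
x_{k+1} = x_k - \frac{F(z_k)^{\top}\left( x_k - z_k \right)}{\lVert F(z_k) \rVert^{2}}\, F(z_k) .
```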


2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Junyu Lu ◽  
Yong Li ◽  
Hongtruong Pham

An adaptive choice for the parameter of the Dai–Liao conjugate gradient method is suggested in this paper, obtained from a modified quasi-Newton equation; this yields a modified Dai–Liao conjugate gradient method. Some interesting features of the proposed method are: (i) the value of the parameter t of the modified Dai–Liao conjugate gradient method incorporates both gradient and function value information; (ii) we establish the global convergence property of the modified Dai–Liao conjugate gradient method under some suitable assumptions; (iii) numerical results show that the modified DL method is effective in practical computation and in image restoration problems.
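The sketch below shows the generic shape of a Dai–Liao conjugate gradient loop; the parameter `t` is held fixed here as a placeholder, whereas the paper's contribution is an adaptive rule for `t` derived from a modified quasi-Newton equation, which is not reproduced. All function names and constants are illustrative assumptions.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao conjugate gradient skeleton (illustrative only).

    Direction update: d_k = -g_k + beta_k * d_{k-1}, with
    beta_k = g_k^T (y_{k-1} - t * s_{k-1}) / (d_{k-1}^T y_{k-1}).
    A plain backtracking Armijo rule stands in for the line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (simplified stand-in).
        alpha, gtd = 1.0, g @ d
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * gtd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d            # Dai-Liao direction update
        x, g = x_new, g_new
    return x
```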


2020 ◽  
Vol 8 (2) ◽  
pp. 403-413
Author(s):  
Yaping Hu ◽  
Liying Liu ◽  
Yujie Wang

This paper presents a Wei–Yao–Liu conjugate gradient algorithm for nonsmooth convex optimization problems. The proposed algorithm makes use of approximate function and gradient values of the Moreau–Yosida regularization function instead of the corresponding exact values. Under suitable conditions, the global convergence property is established for the proposed conjugate gradient method. Finally, some numerical results are reported to show the efficiency of our algorithm.
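For context, the Moreau–Yosida regularization of $f$ and the classical Wei–Yao–Liu parameter are recalled below (standard definitions, with proximal point $p(x)$ and regularization parameter $\lambda > 0$; the paper works with approximate values of $F$ and $\nabla F$ rather than these exact quantities).

```latex
F(x) = \min_{z \in \mathbb{R}^{n}} \left\{ f(z) + \frac{1}{2\lambda} \lVert z - x \rVert^{2} \right\}, \qquad
\nabla F(x) = \frac{x - p(x)}{\lambda}, \qquad
\beta_k^{\mathrm{WYL}} = \frac{\lVert g_k \rVert^{2} - \frac{\lVert g_k \rVert}{\lVert g_{k-1} \rVert}\, g_k^{\top} g_{k-1}}{\lVert g_{k-1} \rVert^{2}} .
```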


Author(s):  
Basim Abbas Hassan ◽  
Hussein O. Dahawi ◽  
Azzam S. Younus

In this paper, a new conjugate gradient parameter is proposed for unconstrained optimization. The resulting search direction satisfies the sufficient descent property. The global convergence property of the new method is proved under some assumptions. Numerical results show that the new parameter is superior in practice.

