A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization

2020 · Vol. 185 (1) · pp. 223-238
Author(s):  
Tsegay Giday Woldu ◽  
Haibin Zhang ◽  
Xin Zhang ◽  
Yemane Hailu Fissuh
2020 · Vol. 8 (2) · pp. 403-413
Author(s):  
Yaping Hu ◽  
Liying Liu ◽  
Yujie Wang

This paper presents a Wei-Yao-Liu conjugate gradient algorithm for nonsmooth convex optimization problems. The proposed algorithm uses approximate function and gradient values of the Moreau-Yosida regularization instead of the corresponding exact values. Under suitable conditions, global convergence is established for the proposed conjugate gradient method. Finally, numerical results are reported to demonstrate the efficiency of the algorithm.
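The key device in the abstract is the Moreau-Yosida regularization, which turns a nonsmooth convex function into a smooth one whose gradient a conjugate gradient method can use. A minimal sketch for the special case f(y) = |y|, where the proximal mapping is the well-known soft-thresholding operator (this is a textbook illustration, not the authors' approximate-value scheme); the `beta_wyl` helper shows the standard Wei-Yao-Liu parameter for reference:

```python
import numpy as np

def prox_abs(x, lam):
    """Proximal operator of f(y) = |y| (soft-thresholding), known in closed form."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    """Moreau-Yosida regularization F_lam(x) = min_y |y| + ||y - x||^2 / (2*lam).

    The minimizer is p = prox_abs(x, lam); substituting it back gives the
    envelope value, and the envelope's gradient is (x - p) / lam, which is
    Lipschitz continuous even though |x| itself is nonsmooth at 0.
    """
    p = prox_abs(x, lam)
    value = np.abs(p) + (p - x) ** 2 / (2.0 * lam)
    grad = (x - p) / lam
    return value, grad

def beta_wyl(g_new, g_old):
    """Standard Wei-Yao-Liu conjugate gradient parameter:
    beta = g_k^T (g_k - (||g_k||/||g_{k-1}||) g_{k-1}) / ||g_{k-1}||^2.
    """
    scaled = (np.linalg.norm(g_new) / np.linalg.norm(g_old)) * g_old
    return g_new @ (g_new - scaled) / (g_old @ g_old)
```

For |x| > lam the envelope equals |x| - lam/2 with gradient sign(x); for |x| <= lam it equals x^2/(2*lam) with gradient x/lam, i.e. the Huber function — the regularization smooths only the kink at the origin.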


Author(s):  
Gonglin Yuan ◽  
Tingting Li ◽  
Wujie Hu

Abstract To solve large-scale unconstrained optimization problems, a modified PRP conjugate gradient algorithm is proposed; it is of interest because it combines the steepest descent method with the conjugate gradient method and fully exploits their respective strengths. For smooth functions, the algorithm uses information from the gradient and the previous search direction to determine the next search direction. For nonsmooth functions, a Moreau–Yosida regularization is introduced into the proposed algorithm, which simplifies the treatment of complex problems. The proposed algorithm has the following characteristics: (i) a sufficient descent property as well as a trust-region property; (ii) global convergence; (iii) numerical results on large-scale smooth/nonsmooth problems show that the proposed algorithm compares favorably with similar optimization methods; (iv) experiments on image restoration problems demonstrate that the algorithm is effective.
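The blend of steepest descent and PRP conjugate gradient described in the abstract can be illustrated with a generic sketch (this is the classical PRP+ scheme with a steepest descent fallback, not the authors' modified algorithm; the Armijo constants are illustrative choices):

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic PRP+ conjugate gradient method with steepest descent fallback.

    beta_k = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2 (Polak-Ribiere-Polyak),
    truncated at zero so the method reverts toward steepest descent when
    beta would be negative. Step sizes come from a backtracking Armijo search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (illustrative constants 1e-4, 0.5).
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP parameter, truncated at zero (the "PRP+" safeguard).
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        if g_new @ d >= 0:                  # ensure a descent direction
            d = -g_new                      # fall back to steepest descent
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, e.g. f(x) = 0.5 x^T A x - b^T x with A = diag(1, 10), the iterates converge to the unique minimizer A^{-1} b; the fallback branch is what gives the hybrid its unconditional descent behavior.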

