BFGS Method
Recently Published Documents


TOTAL DOCUMENTS: 134 (FIVE YEARS: 31)

H-INDEX: 17 (FIVE YEARS: 2)

Symmetry ◽  
2021 ◽  
Vol 13 (11) ◽  
pp. 2093
Author(s):  
Huiping Cao ◽  
Xiaomin An

In this paper, we introduce a sparse, symmetric matrix-completion quasi-Newton method that uses automatic differentiation to solve unconstrained optimization problems in which the sparsity structure of the Hessian is available. The proposed method is a kind of matrix-completion quasi-Newton method with several desirable properties: it preserves the sparsity of the Hessian exactly while satisfying the quasi-Newton equation approximately. Under the usual assumptions, local and superlinear convergence are established. Numerical experiments show that the new method is effective and outperforms both matrix-completion quasi-Newton updating based on the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method and the limited-memory BFGS method.
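The sparse matrix-completion update described above modifies the classical quasi-Newton framework. As a point of reference only, here is a minimal sketch of the standard (dense) BFGS inverse-Hessian update that such methods build on; the function name and the undamped form are illustrative, not the authors' algorithm.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Classical dense BFGS update of the inverse-Hessian approximation H.

    s = x_{k+1} - x_k, y = grad_{k+1} - grad_k; requires the curvature
    condition y @ s > 0. Shown only as a reference point for the sparse
    matrix-completion variant discussed in the abstract.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # The result satisfies the secant (quasi-Newton) equation H_new @ y = s
    # exactly; the paper's method instead satisfies it approximately while
    # keeping the Hessian's sparsity pattern exact.
    return V @ H @ V.T + rho * np.outer(s, s)
```

Note that the dense update generally destroys sparsity, which is precisely what motivates matrix-completion variants when the Hessian's sparse structure is known.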


Author(s):  
Ali Hakan Tor

The aim of this study is to compare the performance of the smooth and nonsmooth optimization solvers from the HANSO (Hybrid Algorithm for Nonsmooth Optimization) software. The smooth optimization solver is an implementation of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, while the nonsmooth optimization solver is the Hybrid Algorithm for Nonsmooth Optimization; more precisely, the nonsmooth algorithm combines BFGS with the Gradient Sampling Algorithm (GSA). We use a well-known collection of academic test problems for nonsmooth optimization containing both convex and nonconvex problems. The motivation for this research is the importance of comparatively assessing smooth optimization methods on nonsmooth optimization problems: the assessment demonstrates how successful the BFGS method is at solving nonsmooth problems in comparison with the nonsmooth solver from HANSO. Performance profiles based on the number of iterations, the number of function evaluations, and the number of subgradient evaluations are used to compare the solvers.
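The kind of experiment described above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical max-of-absolute-values test function (a stand-in, not a problem from the cited collection) and SciPy's BFGS implementation rather than HANSO's MATLAB solvers; the recorded evaluation counts are the raw data a performance profile is built from.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative nonsmooth test function (hypothetical stand-in for the
# academic test collection): f(x) = max_i |x_i|, convex but
# nondifferentiable wherever the maximum is attained at a kink.
def f(x):
    return np.max(np.abs(x))

x0 = np.array([2.0, -1.5, 0.7])

# SciPy's BFGS falls back to finite-difference gradients for this f;
# res.nfev (function evaluations) and res.nit (iterations) are the
# per-solver statistics compared via performance profiles.
res = minimize(f, x0, method="BFGS")
```

Running the same loop over every problem in a test set, once per solver, yields the evaluation-count tables from which Dolan–Moré-style performance profiles are drawn.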


2021 ◽  
pp. 107634
Author(s):  
Gonglin Yuan ◽  
Mengxiang Zhang ◽  
Yingjie Zhou

2021 ◽  
Author(s):  
Qiong‐Ying Chen ◽  
Yun‐Zhi Huang ◽  
Min Gan ◽  
C. L. Philip Chen ◽  
Guang‐Yong Chen

Author(s):  
Hamsa Th. Saeed Chilmeran ◽  
Huda I. Ahmed ◽  
Eman T. Hamed ◽  
Abbas Y. Al-Bayati

<p class="MsoNormal" style="text-align: justify;"><span>In this work we propose and analyze a hybrid conjugate gradient (CG) method in which the parameter <!--[if gte mso 9]><xml> <o:OLEObject Type="Embed" ProgID="Equation.3" ShapeID="_x0000_i1025" DrawAspect="Content" ObjectID="_1674222415"> </o:OLEObject> </xml><![endif]-->is computed as a linear combination between Hager-Zhang [HZ] and Dai-Liao [DL] parameters. We use this proposed method to modify BFGS method and to prove the positive definiteness and QN-conditions of the matrix. Theoretical trils confirm that the new search directions aredescent directions under some conditions, as well as, the new search directions areglobally convergent using strong Wolfe conditions. The numerical experiments show that the proposed method is promising and outperforms alternative similar CG-methods using Dolan-Mor'e performance profile. </span><br /><br /></p>


2020 ◽  
Vol 3 (0) ◽  
pp. 102-106
Author(s):  
Petro Stetsyuk ◽  
Volodymyr Lyashko ◽  
Anton Suprun