A MODIFIED BFGS METHOD VIA NEW RATIONAL APPROXIMATION MODEL FOR SOLVING UNCONSTRAINED OPTIMIZATION PROBLEMS AND ITS APPLICATION

2020 ◽  
Vol 9 (12) ◽  
pp. 10771-10786
Author(s):  
K. Kamfa ◽  
S. Ibrahim ◽  
S. F. Sufahani ◽  
R. B. Yunus ◽  
M. Mamat

2020 ◽
Vol 2020 ◽  
pp. 1-15
Author(s):  
Pengyuan Li ◽  
Zhan Wang ◽  
Dan Luo ◽  
Hongtruong Pham

The BFGS method is one of the most efficient quasi-Newton methods for solving small- and medium-size unconstrained optimization problems. To explore more of its interesting properties, a modified two-parameter scaled BFGS method is presented in this paper. The intention of the modified scaled BFGS method is to improve the eigenvalue structure of the BFGS update. In this method, the first two terms and the last term of the standard BFGS update formula are scaled with two different positive parameters, and a new choice of y_k is given. Meanwhile, the Yuan–Wei–Lu line search is employed. Under this line search, the modified two-parameter scaled BFGS method is globally convergent for nonconvex functions. Extensive numerical experiments show that this scaled BFGS method outperforms the standard BFGS method and some similar scaled methods.
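For context, the standard BFGS update and the generic two-parameter scaled form that the abstract describes can be written as follows; the paper's particular choices of the scaling parameters \delta_k, \gamma_k and of the modified y_k are not reproduced here:

B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k},

B_{k+1} = \delta_k \Big( B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} \Big) + \gamma_k \, \frac{y_k y_k^T}{y_k^T s_k}, \qquad \delta_k, \gamma_k > 0,

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. Scaling the two parts of the update separately gives direct control over the eigenvalues of B_{k+1}.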


2014 ◽  
Author(s):  
Mohd Asrul Hery Bin Ibrahim ◽  
Mustafa Mamat ◽  
Leong Wah June ◽  
Azfi Zaidi Mohammad Sofi

Symmetry ◽  
2021 ◽  
Vol 13 (11) ◽  
pp. 2093
Author(s):  
Huiping Cao ◽  
Xiaomin An

In our paper, we introduce a sparse and symmetric matrix completion quasi-Newton model, built using automatic differentiation, for solving unconstrained optimization problems in which the sparsity structure of the Hessian is available. The proposed method is a matrix completion quasi-Newton method that preserves the sparsity of the Hessian exactly while satisfying the quasi-Newton equation approximately. Under the usual assumptions, local and superlinear convergence are established. We tested the performance of the method, showing that the new method is effective and superior to matrix completion quasi-Newton updating with the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method and to the limited-memory BFGS method.
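The abstract's two key properties, exact sparsity and an approximately satisfied secant equation B s_k ≈ y_k, can be illustrated with a minimal least-squares sketch in Python. This generic construction is an illustration only; it is not the paper's matrix completion update, and it does not use automatic differentiation:

import numpy as np

def sparse_secant_fit(pattern, s, y):
    # Fit a symmetric matrix B that is zero outside `pattern` (a boolean
    # symmetric n x n array) and minimizes ||B s - y||_2, so the secant
    # equation B s = y holds approximately while the sparsity holds exactly.
    n = len(s)
    free = [(i, j) for i in range(n) for j in range(i, n) if pattern[i, j]]
    A = np.zeros((n, len(free)))
    for k, (i, j) in enumerate(free):
        A[i, k] += s[j]           # entry b_ij contributes s_j to (B s)_i
        if i != j:
            A[j, k] += s[i]       # and, by symmetry, s_i to (B s)_j
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    B = np.zeros((n, n))
    for k, (i, j) in enumerate(free):
        B[i, j] = B[j, i] = c[k]
    return B

# Toy usage with a tridiagonal sparsity pattern:
n = 4
pattern = np.eye(n, dtype=bool) | np.eye(n, k=1, dtype=bool) | np.eye(n, k=-1, dtype=bool)
s = np.array([1.0, 2.0, -1.0, 0.5])
y = np.array([0.5, 1.0, -2.0, 1.5])
B = sparse_secant_fit(pattern, s, y)
print(np.linalg.norm(B @ s - y))  # small residual; zeros outside the pattern are exact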


2020 ◽  
Vol 2020 (1) ◽  
Author(s):  
Shashi Kant Mishra ◽  
Geetanjali Panda ◽  
Suvra Kanti Chakraborty ◽  
Mohammad Esmael Samei ◽  
Bhagwat Ram

Variants of the Newton method are very popular for solving unconstrained optimization problems, and the study of the global convergence of the BFGS method has also made good progress. The q-gradient reduces to the classical gradient when q approaches 1. In this paper, we propose a quantum Broyden–Fletcher–Goldfarb–Shanno algorithm in which the Hessian approximation is constructed using the q-gradient and a descent direction is found at each iteration. The algorithm is implemented by applying the independent parameter q in the Armijo–Wolfe conditions to compute a step length that guarantees a decrease in the objective function value. Global convergence is established without any convexity assumption on the objective function. Further, the proposed method is verified on numerical test problems, and the results are depicted through performance profiles.
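For reference, the q-gradient mentioned above is built coordinate-wise from the classical Jackson q-derivative; the limit statement in the abstract corresponds to

D_q f(x) = \frac{f(qx) - f(x)}{(q - 1)\, x}, \qquad x \neq 0, \; q \neq 1,

which tends to the ordinary derivative f'(x) as q \to 1 (with D_q f(0) taken to be f'(0)). For f : \mathbb{R}^n \to \mathbb{R}, applying this with a parameter q_i to each coordinate x_i in turn yields the q-gradient.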


2018 ◽  
Vol 37 (4) ◽  
pp. 5113-5125 ◽  
Author(s):  
Razieh Dehghani ◽  
Narges Bidabadi ◽  
Mohammad Mehdi Hosseini

2014 ◽  
Vol 8 (1) ◽  
pp. 218-221 ◽  
Author(s):  
Ping Hu ◽  
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone line search algorithm, and prove its global convergence. Finally, numerical experiments illustrate the effectiveness of the new non-monotone line search algorithm.
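The paper's specific combination rule is not reproduced here, but the classical Grippo–Lampariello–Lucidi (GLL) non-monotone Armijo condition that such rules build on is easy to sketch in Python: a step is accepted when it decreases the objective sufficiently relative to the maximum of a few recent function values rather than the current one.

import numpy as np

def nonmonotone_armijo(f, g, x, d, f_hist, sigma=1e-4, beta=0.5, max_backtracks=50):
    # GLL non-monotone Armijo backtracking: accept the first alpha with
    #   f(x + alpha d) <= max(recent f values) + sigma * alpha * g(x)^T d.
    f_ref = max(f_hist)          # reference value over the last few iterates
    slope = float(g(x) @ d)      # directional derivative; d is a descent direction
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + sigma * alpha * slope:
            return alpha
        alpha *= beta
    return alpha

# Toy usage on f(x) = ||x||^2 with a steepest-descent direction:
f = lambda x: float(x @ x)
g = lambda x: 2.0 * x
x = np.array([3.0, -4.0])
d = -g(x)
print(nonmonotone_armijo(f, g, x, d, f_hist=[f(x)]))  # accepted step length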

