A nonmonotone modified BFGS algorithm for nonconvex unconstrained optimization problems

Filomat ◽  
2016 ◽  
Vol 30 (5) ◽  
pp. 1283-1296
Author(s):  
Keyvan Amini ◽  
Somayeh Bahrami ◽  
Shadi Amiri

In this paper, a modified BFGS algorithm is proposed to solve unconstrained optimization problems. First, based on a modified secant condition, an update formula is recommended to approximate the Hessian matrix. Then, exploiting the favorable properties of nonmonotone line searches, an appropriate nonmonotone strategy is employed. Under some mild conditions, the global convergence properties of the algorithm are established without a convexity assumption on the objective function. Preliminary numerical experiments are also reported and indicate the promising behavior of the new algorithm.
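The nonmonotone idea referred to above is commonly implemented by accepting a step that decreases the objective relative to the maximum of several recent function values rather than the current one. A minimal sketch in Python, assuming a Grippo-style backtracking rule (the paper's exact nonmonotone condition is not given in the abstract):

```python
import numpy as np

def nonmonotone_armijo(f, grad, x, d, f_hist, c1=1e-4, rho=0.5, max_backtracks=50):
    """Backtracking line search against the max of recent f-values
    (Grippo-style nonmonotone Armijo condition)."""
    f_ref = max(f_hist)            # nonmonotone reference value
    g_dot_d = grad(x) @ d          # directional derivative, must be negative
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + c1 * alpha * g_dot_d:
            break
        alpha *= rho
    return alpha

# usage on f(x) = ||x||^2, starting from (1, 1) along the steepest descent direction
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
x0 = np.array([1.0, 1.0])
d0 = -grad(x0)
alpha = nonmonotone_armijo(f, grad, x0, d0, f_hist=[f(x0)])
```

In practice `f_hist` holds the last M function values, so steps that temporarily increase f can still be accepted.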

Symmetry ◽  
2020 ◽  
Vol 12 (4) ◽  
pp. 656
Author(s):  
Quan Qu ◽  
Xianfeng Ding ◽  
Xinyi Wang

In this paper, a new nonmonotone adaptive trust region algorithm is proposed for unconstrained optimization by combining a multidimensional filter and the Goldstein-type line search technique. A modified trust region ratio is presented which results in more reasonable consistency between the accurate model and the approximate model. When a trial step is rejected, we use a multidimensional filter to increase the likelihood that the trial step is accepted. If the trial step is still not successful with the filter, a nonmonotone Goldstein-type line search is used in the direction of the rejected trial step. The approximation of the Hessian matrix is updated by the modified Quasi-Newton formula (CBFGS). Under appropriate conditions, the proposed algorithm is globally convergent and superlinearly convergent. The new algorithm shows better performance in terms of the Dolan–Moré performance profile. Numerical results demonstrate the efficiency and robustness of the proposed algorithm for solving unconstrained optimization problems.
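The multidimensional filter mentioned above typically stores gradient-component magnitudes of earlier iterates and accepts a trial point only if no stored entry dominates it. A hedged sketch of such a dominance test (the `gamma` margin and the Gould–Leyffer–Toint-style rule are assumptions, not the paper's exact filter):

```python
import numpy as np

def filter_acceptable(g_trial, filter_entries, gamma=1e-5):
    """Accept a trial point unless some stored entry dominates it, i.e. unless
    every |gradient component| of the trial fails to improve by a small margin."""
    a = np.abs(g_trial)
    for entry in filter_entries:
        e = np.abs(entry)
        if np.all(a >= e - gamma * np.linalg.norm(e)):
            return False           # dominated by this entry: reject the trial
    return True

filt = [np.array([1.0, 2.0])]
ok = filter_acceptable(np.array([0.5, 3.0]), filt)   # improves the first component
bad = filter_acceptable(np.array([1.5, 2.5]), filt)  # improves neither component
```

This is what lets a trial step rejected by the trust-region ratio still be accepted, as the abstract describes.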


2018 ◽  
Vol 7 (2.14) ◽  
pp. 21
Author(s):  
Omar Alshorman ◽  
Mustafa Mamat ◽  
Ahmad Alhawarat ◽  
Mohd Revaie

The Conjugate Gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several studies have recently been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new parameter for the CG method is proposed. The new parameter possesses global convergence properties under the Strong Wolfe–Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust compared with the Polak–Ribière–Polyak (PRP), Fletcher–Reeves (FR), and Wei–Yao–Liu (WYL) parameters.
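For reference, the CG search direction is d_{k+1} = -g_{k+1} + beta_k d_k, and the baseline parameters named above can be sketched as follows (the paper's new parameter is not specified in the abstract, so only the standard FR, PRP, and WYL formulas are shown):

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Fletcher–Reeves parameter."""
    return (g_new @ g_new) / (g_old @ g_old)

def beta_prp(g_new, g_old):
    """Polak–Ribière–Polyak parameter."""
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

def beta_wyl(g_new, g_old):
    """Wei–Yao–Liu parameter: a PRP-like formula that stays nonnegative."""
    scale = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return (g_new @ (g_new - scale * g_old)) / (g_old @ g_old)

def cg_direction(g_new, d_old, beta):
    # d_{k+1} = -g_{k+1} + beta_k * d_k
    return -g_new + beta * d_old

g_old = np.array([1.0, 0.0])
g_new = np.array([1.0, 1.0])
d_new = cg_direction(g_new, -g_old, beta_prp(g_new, g_old))
```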


Author(s):  
Muna M. M. Ali

The self-scaling Broyden–Fletcher–Goldfarb–Shanno (BFGS) method is very efficient for solving large-scale optimization problems. In this paper, we present a new algorithm that modifies the self-scaling BFGS algorithm. Based on noticeable non-monotone line search properties, a new non-monotone idea is also employed. First, an update formula satisfying the secant condition is proposed to approximate the Hessian matrix; second, the global convergence properties of the algorithm are established under some mild conditions, without a convexity assumption on the objective function. Numerical results are also reported and indicate the promising behavior of the new algorithm.
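Self-scaling BFGS methods rescale the curvature part of the approximate Hessian before adding the rank-one correction. A generic Oren–Luenberger-style sketch, assuming the standard scaling factor tau = y's / s'Bs (the paper's modified formula is not reproduced here); note that the update satisfies the secant condition B_{k+1} s = y exactly:

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Oren–Luenberger-style self-scaling BFGS update: the curvature part of B
    is rescaled by tau = y's / s'Bs before the rank-one term is added."""
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    if ys <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return B                   # curvature condition fails: skip the update
    tau = ys / sBs
    return tau * (B - np.outer(Bs, Bs) / sBs) + np.outer(y, y) / ys

B0 = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([0.8, 0.6])
B1 = self_scaling_bfgs_update(B0, s, y)
```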


2014 ◽  
Vol 8 (1) ◽  
pp. 218-221 ◽  
Author(s):  
Ping Hu ◽  
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone search algorithm, and prove its global convergence. Finally, numerical experiments illustrate the effectiveness of the new non-monotone search algorithm.


2020 ◽  
Vol 2020 ◽  
pp. 1-15
Author(s):  
Pengyuan Li ◽  
Zhan Wang ◽  
Dan Luo ◽  
Hongtruong Pham

The BFGS method is one of the most efficient quasi-Newton methods for solving small- and medium-size unconstrained optimization problems. To explore more of its interesting properties, a modified two-parameter scaled BFGS method is stated in this paper. The intention of the modified scaled BFGS method is to improve the eigenvalue structure of the BFGS update. In this method, the first two terms and the last term of the standard BFGS update formula are scaled with two different positive parameters, and a new value of y_k is given. Meanwhile, a Yuan–Wei–Lu line search is also proposed. Under the mentioned line search, the modified two-parameter scaled BFGS method is globally convergent for nonconvex functions. Extensive numerical experiments show that this form of the scaled BFGS method outperforms the standard BFGS method and some similar scaled methods.
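The two-parameter scaling described above can be written as B_{k+1} = delta (B_k - B_k s s^T B_k / (s^T B_k s)) + theta y y^T / (y^T s). A minimal sketch of this family (the values of delta and theta are illustrative; the paper's parameter choices and modified y_k are not reproduced):

```python
import numpy as np

def two_param_scaled_bfgs(B, s, y, delta, theta):
    """Scale the first two terms of the standard BFGS update by delta and the
    rank-one y-term by theta; the update then satisfies B_new @ s = theta * y."""
    Bs = B @ s
    return delta * (B - np.outer(Bs, Bs) / (s @ Bs)) + theta * np.outer(y, y) / (y @ s)

B0 = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([0.5, 0.2])
B1 = two_param_scaled_bfgs(B0, s, y, delta=0.9, theta=1.1)
```

Choosing delta and theta shifts the eigenvalues of the update, which is the structural effect the abstract refers to.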


2019 ◽  
Vol 36 (04) ◽  
pp. 1950017 ◽  
Author(s):  
Wen-Li Dong ◽  
Xing Li ◽  
Zheng Peng

In this paper, we propose a simulated annealing-based Barzilai–Borwein (SABB) gradient method for unconstrained optimization problems. The SABB method accepts the Barzilai–Borwein (BB) step by a simulated annealing rule. If the BB step cannot be accepted, the Armijo line search is used. The global convergence of the SABB method is established under some mild conditions. Numerical experiments indicate that, compared to some existing BB methods using a nonmonotone line search technique, the SABB method performs well with high efficiency.
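The two building blocks of the SABB method, the BB steplength and a simulated annealing acceptance test, can be sketched as follows (the cooling schedule and the exact acceptance rule used in the paper are assumptions here; a standard Metropolis test is shown):

```python
import numpy as np

def bb_step(s, y):
    """First Barzilai–Borwein steplength: alpha_k = s's / s'y."""
    return (s @ s) / (s @ y)

def sa_accept(f_trial, f_current, temperature, rng):
    """Metropolis-style test: always accept an improving step, and accept a
    worsening one with probability exp(-(f_trial - f_current) / temperature)."""
    if f_trial <= f_current:
        return True
    return rng.random() < np.exp(-(f_trial - f_current) / temperature)

# BB steplength on f(x) = 0.5 x'Ax with A = diag(1, 2)
A = np.diag([1.0, 2.0])
s = np.array([1.0, 0.0])
y = A @ s                          # gradient difference for a quadratic
alpha = bb_step(s, y)
rng = np.random.default_rng(0)
```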


2014 ◽  
Vol 2014 ◽  
pp. 1-8 ◽  
Author(s):  
Yunlong Lu ◽  
Wenyu Li ◽  
Mingyuan Cao ◽  
Yueting Yang

A new self-adaptive rule for the trust region radius is introduced, given by a piecewise function of the ratio between the actual and predicted reductions of the objective function. A self-adaptive trust region method for unconstrained optimization problems is presented. The convergence properties of the method are established under reasonable assumptions. Preliminary numerical results show that the new method is effective and robust for solving unconstrained optimization problems.
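A piecewise radius rule of the kind described can be sketched as follows, assuming textbook-style thresholds on the ratio between actual and predicted reduction (the constants are illustrative, not the paper's):

```python
def adaptive_radius(delta, ratio, step_norm, eta1=0.25, eta2=0.75,
                    shrink=0.5, expand=2.0, delta_max=100.0):
    """Piecewise trust-region radius update driven by the agreement ratio
    r = (actual reduction) / (predicted reduction)."""
    if ratio < eta1:               # poor agreement: shrink around the last step
        return shrink * step_norm
    if ratio > eta2:               # very good agreement: expand, capped at delta_max
        return min(expand * delta, delta_max)
    return delta                   # acceptable agreement: keep the radius
```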


2014 ◽  
Vol 556-562 ◽  
pp. 4023-4026
Author(s):  
Ting Feng Li ◽  
Zhi Yuan Liu ◽  
Sheng Hui Yan

In this paper, a modified BFGS method with nonmonotone line search for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses a global convergence property even without a convexity assumption on the objective function. Some numerical results are reported which illustrate that the proposed method is efficient.

