A family of globally convergent inexact secant methods for nonconvex constrained optimization

2018, Vol 12 (2), pp. 165-176
Author(s): Zhujun Wang, Li Cai, Zheng Peng

We present a family of new inexact secant methods, combined with an Armijo line search technique, for solving nonconvex constrained optimization problems. In contrast to existing secant methods, the algorithms proposed in this paper need not compute exact search directions. By adopting a nonsmooth exact penalty function as the merit function, the global convergence of the proposed algorithms is established under some reasonable conditions. Numerical results indicate that the proposed algorithms are both feasible and effective.
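
As a point of reference for the Armijo rule mentioned above, the following is a minimal backtracking sketch on a generic merit function; the names (armijo_backtracking, phi, dphi0) are illustrative, and the inexact secant directions and penalty-parameter updates of the paper are not modeled.

```python
def armijo_backtracking(phi, phi0, dphi0, alpha0=1.0, c1=1e-4, rho=0.5,
                        max_backtracks=50):
    """Generic Armijo backtracking on a merit function phi(alpha).

    phi0  : merit value at alpha = 0
    dphi0 : (negative) directional derivative of the merit function at 0;
            for a nonsmooth exact penalty merit function an upper bound on
            the directional derivative can be used instead.
    Returns a step length satisfying phi(alpha) <= phi0 + c1 * alpha * dphi0.
    """
    alpha = alpha0
    for _ in range(max_backtracks):
        if phi(alpha) <= phi0 + c1 * alpha * dphi0:
            return alpha
        alpha *= rho  # shrink the trial step geometrically
    return alpha      # smallest trial step if the test never passed
```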

2019, Vol 36 (04), pp. 1950017
Author(s): Wen-Li Dong, Xing Li, Zheng Peng

In this paper, we propose a simulated annealing-based Barzilai–Borwein (SABB) gradient method for unconstrained optimization problems. The SABB method accepts the Barzilai–Borwein (BB) step according to a simulated annealing rule; if the BB step cannot be accepted, an Armijo line search is used instead. The global convergence of the SABB method is established under some mild conditions. Numerical experiments indicate that, compared with some existing BB methods using nonmonotone line search techniques, the SABB method performs well with high efficiency.
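
The sketch below illustrates one plausible reading of such an iteration: a BB1 step accepted by a Metropolis-type test, with an Armijo backtracking fallback. The function name, acceptance probability, and the absence of a cooling schedule are assumptions, not the paper's exact rule.

```python
import numpy as np

def sabb_iteration(f, x, g, s_prev, y_prev, temperature, rng,
                   c1=1e-4, rho=0.5, max_backtracks=30):
    """One illustrative SABB-style iteration (sketch, not the paper's rule).

    The BB1 step length alpha = s^T s / (s^T y) is tried first and the trial
    point is accepted by a Metropolis-type test; otherwise a monotone Armijo
    backtracking step along -g is taken.
    """
    sty = float(s_prev @ y_prev)
    alpha_bb = float(s_prev @ s_prev) / sty if sty > 1e-12 else 1.0
    x_trial = x - alpha_bb * g
    fx = f(x)
    df = f(x_trial) - fx
    # simulated-annealing acceptance: always accept a decrease, accept an
    # increase with probability exp(-df / temperature)
    if df <= 0 or rng.random() < np.exp(-df / max(temperature, 1e-12)):
        return x_trial
    # fallback: Armijo backtracking along the steepest-descent direction
    alpha, gnorm2 = 1.0, float(g @ g)
    for _ in range(max_backtracks):
        if f(x - alpha * g) <= fx - c1 * alpha * gnorm2:
            break
        alpha *= rho
    return x - alpha * g
```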


2019, Vol 53 (3), pp. 787-805
Author(s): Lijuan Zhao

In this paper, we propose a nonmonotone trust region method for bound constrained optimization problems, where the bounds are handled by an affine scaling technique. Unlike traditional trust region methods, the subproblem in our algorithm is based on a conic model. Moreover, when the trial point is not acceptable under the usual trust region criterion, a line search technique is used to find an acceptable point. This procedure avoids re-solving the trust region subproblem and may reduce the total computational cost. The global convergence and Q-superlinear convergence of the algorithm are established under some mild conditions. Numerical results on a series of standard test problems are reported to show the effectiveness of the new method.
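
The following sketch shows only the acceptance-or-line-search logic in generic form; the conic model and the affine scaling of the actual method are not reproduced, and all names and thresholds are illustrative.

```python
def accept_or_linesearch(f, x, s, fx, pred_red, gTs, delta,
                         eta=0.1, c1=1e-4, rho=0.5, max_backtracks=30):
    """Illustrative acceptance test with a line-search fallback.

    pred_red : predicted reduction of the model at the trial step s
    gTs      : directional derivative g^T s (assumed negative)
    If the usual ratio test fails, the trial step is backtracked along s
    instead of re-solving the trust region subproblem.
    """
    ratio = (fx - f(x + s)) / max(pred_red, 1e-16)
    if ratio >= eta:                       # trial point accepted
        return x + s, 2.0 * delta if ratio > 0.75 else delta
    alpha = 1.0
    for _ in range(max_backtracks):        # backtrack along the same step
        if f(x + alpha * s) <= fx + c1 * alpha * gTs:
            break
        alpha *= rho
    return x + alpha * s, 0.5 * delta      # accept the shorter step, shrink the radius
```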


2021, Vol 2021, pp. 1-6
Author(s): Masoud Hatamian, Mahmoud Paripour, Farajollah Mohammadi Yaghoobi, Nasrin Karamikabir

In this article, a new nonmonotone line search technique is proposed for solving systems of nonlinear equations. We attempt to answer the question of how to control the degree of nonmonotonicity in line search rules in order to obtain a more efficient algorithm. To this end, we present a novel algorithm that avoids an increase in the number of unsuccessful iterations. Under some suitable assumptions, the global convergence of our strategy is proved, and the robust behavior of the proposed algorithm is demonstrated on several numerical examples.
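
A minimal sketch of one common way to realize such a scheme is given below, assuming a Li-Fukushima-type derivative-free acceptance test relaxed against the largest merit value in a memory of length M; the memory length is what controls the degree of nonmonotonicity. The function names and the exact test are assumptions, not the authors' rule.

```python
import numpy as np
from collections import deque

def nonmonotone_step(F, x, d, history, sigma=1e-4, rho=0.5, max_backtracks=30):
    """Sketch of a derivative-free nonmonotone line search for F(x) = 0,
    using the merit function m(x) = 0.5 * ||F(x)||^2.

    history : deque(maxlen=M) of recent merit values; the memory length M
              controls the degree of nonmonotonicity (M = 1 is monotone).
    Acceptance test: m(x + alpha*d) <= max(history) - sigma * alpha^2 * ||d||^2.
    """
    m = lambda z: 0.5 * float(np.dot(F(z), F(z)))
    m_ref = max(history) if history else m(x)
    d_norm2 = float(np.dot(d, d))
    alpha = 1.0
    for _ in range(max_backtracks):
        if m(x + alpha * d) <= m_ref - sigma * alpha**2 * d_norm2:
            break
        alpha *= rho
    x_new = x + alpha * d
    history.append(m(x_new))   # maxlen discards the oldest value automatically
    return x_new, alpha

# usage sketch: history = deque([0.5 * np.dot(F(x0), F(x0))], maxlen=M)
```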


Author(s): Mohammed Belloufi, Rachid Benzine, Laskri Yamina

The Hestenes-Stiefel (HS) conjugate gradient algorithm is a useful tool for unconstrained numerical optimization: it has good numerical performance but no global convergence result under traditional line searches. This paper proposes a line search technique that guarantees the global convergence of the Hestenes-Stiefel (HS) conjugate gradient method. Numerical tests are presented to validate the different approaches.
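
For reference, the standard HS direction update reads as follows; the safeguard shown here and the specific globalizing line search proposed in the paper are not part of the original formula.

```python
import numpy as np

def hs_direction(g_new, g_old, d_old, eps=1e-12):
    """Standard Hestenes-Stiefel direction update:
        y_k     = g_{k+1} - g_k
        beta_HS = (g_{k+1}^T y_k) / (d_k^T y_k)
        d_{k+1} = -g_{k+1} + beta_HS * d_k
    Falls back to the steepest-descent direction when the denominator is tiny.
    """
    y = g_new - g_old
    denom = float(d_old @ y)
    beta = float(g_new @ y) / denom if abs(denom) > eps else 0.0
    return -g_new + beta * d_old
```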


2015, Vol 2015, pp. 1-8
Author(s): Yunlong Lu, Weiwei Yang, Wenyu Li, Xiaowei Jiang, Yueting Yang

A new trust region method is presented, which combines a nonmonotone line search technique, a self-adaptive update rule for the trust region radius, and a weighting technique for the ratio between the actual reduction and the predicted reduction. Under reasonable assumptions, the global convergence of the method is established for unconstrained nonconvex optimization. Numerical results show that the new method is efficient and robust for solving unconstrained optimization problems.
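
A hypothetical sketch of a self-adaptive radius update is given below, tying the new radius to the length of the accepted step; the paper's weighted ratio and exact rule are not reproduced, and all names and thresholds are illustrative.

```python
def adaptive_radius(delta, ratio, step_norm,
                    eta1=0.25, eta2=0.75, gamma1=0.5, gamma2=2.0):
    """Hypothetical self-adaptive trust region radius update (sketch only).

    The new radius is tied to the length of the accepted step rather than to
    the old radius alone, one common way to make the update self-adaptive.
    """
    if ratio < eta1:                       # poor model agreement: shrink
        return gamma1 * step_norm
    if ratio > eta2:                       # very good agreement: enlarge
        return max(delta, gamma2 * step_norm)
    return max(gamma1 * delta, step_norm)  # otherwise keep a comparable scale
```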

