A Regularized Newton Method with Correction for Unconstrained Nonconvex Optimization

2015 · Vol 7 (2) · pp. 7
Author(s): Heng Wang, Mei Qin

In this paper, we present a modified regularized Newton method for minimizing a nonconvex function whose Hessian matrix may be singular. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the method has a global convergence property. Under a local error bound condition, which is weaker than nonsingularity, the method converges cubically.
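
To make the idea concrete, the following sketch (Python with NumPy, not from the paper) regularizes the Newton system by a multiple of the gradient norm, so each step stays well defined even where the Hessian is singular; the rule lam = mu*||g||, the plain full step, and the test problem are illustrative assumptions rather than the authors' exact scheme.

import numpy as np

def regularized_newton(grad, hess, x0, mu=1.0, tol=1e-8, max_iter=200):
    # Solve (H + lam*I) d = -g with lam proportional to ||g||, so the
    # linear system stays well posed even when H is singular or indefinite.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        lam = mu * np.linalg.norm(g)        # illustrative regularization rule
        d = np.linalg.solve(hess(x) + lam * np.eye(len(x)), -g)
        x = x + d                           # the paper adds a correction step;
                                            # a plain full step is taken here
    return x

# Example: minimizer at the origin, where the Hessian is singular
grad = lambda x: np.array([4 * x[0]**3, 4 * x[1]**3])   # gradient of x^4 + y^4
hess = lambda x: np.diag([12 * x[0]**2, 12 * x[1]**2])
print(regularized_newton(grad, hess, np.array([1.0, -1.0])))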

2019 · Vol 2019 · pp. 1-6
Author(s): Eman T. Hamed, Huda I. Ahmed, Abbas Y. Al-Bayati

In this study, we propose a new hybrid algorithm that combines the Steepest Descent (SD) and Quasi-Newton (QN) search directions. First, we develop a new search direction for combined conjugate gradient (CG) and QN methods. Second, we describe a new positive CG method that possesses the sufficient descent property under a strong Wolfe line search. We also prove a new theorem ensuring the global convergence property under some given conditions. Our numerical results show that the new algorithm is robust compared with other standard large-scale CG methods.
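
As a hedged illustration of such a hybrid (not the authors' exact scheme), the sketch below combines a nonnegative ("positive") PRP+ conjugate gradient coefficient, a sufficient-descent safeguard that falls back to steepest descent, and SciPy's strong Wolfe line search; the parameter values and the Rosenbrock test problem are illustrative.

import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def hybrid_cg(f, grad, x0, c=1e-4, tol=1e-6, max_iter=2000):
    # PRP+ conjugate gradient direction with a sufficient-descent
    # safeguard and a strong Wolfe line search; the safeguard restarts
    # with the steepest descent direction -g whenever it triggers.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                      # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]  # strong Wolfe
        if alpha is None:                       # line search failed: take a
            d, alpha = -g, 1e-4                 # short steepest descent step
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # PRP+, kept >= 0
        d_new = -g_new + beta * d
        if g_new @ d_new > -c * (g_new @ g_new):        # sufficient descent
            d_new = -g_new                              # fails: restart with SD
        x, g, d = x_new, g_new, d_new
    return x

print(hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0])))  # -> approx (1, 1)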


2013 · Vol 2013 · pp. 1-8
Author(s): Lian Zheng

We propose a class of new double projection algorithms for solving variational inequality problems, which can be viewed as a generalization of the method of Solodov and Svaiter obtained by adopting a class of new hyperplanes. By the separation property of the hyperplane, our method is proved to be globally convergent under very mild assumptions. In addition, we propose a modified version of our algorithm that finds a solution of the variational inequality which is also a fixed point of a given nonexpansive mapping. If, in addition, a certain local error bound holds, we analyze the convergence rate of the iterative sequence. Numerical experiments show that our algorithms are efficient.
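
The flavor of a double projection step can be sketched as follows (Python with NumPy), in the spirit of the Solodov-Svaiter hyperplane projection method: an Armijo search along the projected residual, projection of the iterate onto a separating hyperplane, then projection back onto the feasible set. The parameters, the Armijo rule, and the complementarity example are illustrative assumptions, not the paper's new hyperplane class.

import numpy as np

def solve_vi(F, proj_C, x0, mu=0.5, sigma=0.5, theta=0.5,
             tol=1e-8, max_iter=1000):
    # Double projection sketch for VI(F, C): Armijo search along the
    # projected residual, projection of x onto a separating hyperplane,
    # then projection back onto C. proj_C must be cheap to evaluate.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = x - proj_C(x - mu * F(x))           # projected residual
        if np.linalg.norm(r) < tol:
            break
        t = 1.0                                 # Armijo search on [x, x - r]
        while F(x - t * r) @ r < sigma * (r @ r):
            t *= theta
        z = x - t * r
        Fz = F(z)
        if Fz @ Fz == 0.0:                      # z already solves the VI
            return z
        # project x onto {y : <F(z), y - z> = 0}, then back onto C
        x = proj_C(x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz)
    return x

# Example: affine VI on the nonnegative orthant (a linear complementarity
# problem); here F vanishes at the interior solution x* = (1/3, 1/3)
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
F = lambda x: M @ x + q
proj_C = lambda x: np.maximum(x, 0.0)           # projection onto R^2_+
print(solve_vi(F, proj_C, np.array([2.0, 2.0])))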


2014 · Vol 556-562 · pp. 4023-4026
Author(s): Ting Feng Li, Zhi Yuan Liu, Sheng Hui Yan

In this paper, a modified BFGS method with a nonmonotone line search for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses a global convergence property even without a convexity assumption on the objective function. Numerical results are reported which illustrate that the proposed method is efficient.
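
A minimal sketch of one standard way to combine these ingredients, assuming a Grippo-Lampariello-Lucidi (GLL) nonmonotone Armijo rule and a cautious BFGS update that is skipped when the curvature condition fails (which is what keeps the method well defined without convexity); this generic combination is not necessarily the authors' exact modification.

import numpy as np
from collections import deque

def nonmonotone_bfgs(f, grad, x0, M=5, c1=1e-4, tau=0.5,
                     tol=1e-6, max_iter=500):
    # GLL nonmonotone Armijo: accept a step if it improves on the WORST
    # of the last M function values. The inverse-Hessian update is skipped
    # unless s'y > 0 (cautious update), so H stays positive definite and
    # d remains a descent direction even for nonconvex f.
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                               # inverse Hessian approximation
    g = grad(x)
    recent = deque([f(x)], maxlen=M)            # last M function values
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        fmax, alpha = max(recent), 1.0
        while f(x + alpha * d) > fmax + c1 * alpha * (g @ d):
            alpha *= tau                        # backtrack
        s = alpha * d
        g_new = grad(x + s)
        y = g_new - g
        if s @ y > 1e-10 * (s @ s):             # cautious BFGS update
            r = 1.0 / (s @ y)
            V = np.eye(n) - r * np.outer(s, y)
            H = V @ H @ V.T + r * np.outer(s, s)
        x, g = x + s, g_new
        recent.append(f(x))
    return x

# Example: a nonconvex double-well objective with minima at (+/-1, 0)
f    = lambda x: (x[0]**2 - 1.0)**2 + x[1]**2
grad = lambda x: np.array([4 * x[0] * (x[0]**2 - 1.0), 2 * x[1]])
print(nonmonotone_bfgs(f, grad, np.array([0.5, 1.0])))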

