A Class of Inexact Secant Algorithms with Line Search Filter Method for Nonlinear Programming

2021, Vol 2021, pp. 1-9
Author(s):  
Zhujun Wang ◽  
Li Cai

We propose a class of inexact secant methods combined with the line search filter technique for solving nonlinear equality constrained optimization. Compared with other filter methods that incorporate a line search, which are applied in most large-scale optimization problems, the inexact line search filter algorithm is more flexible and practical. In this paper, we focus on the analysis of the local superlinear convergence rate of the algorithms; their global convergence properties can be obtained by analogy with our previous work. These methods have been implemented in a Matlab code, and detailed numerical results indicate that the proposed algorithms are efficient on 43 problems from the CUTEr test set.
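The filter acceptance test at the heart of such methods can be sketched as follows. This is a minimal illustration of the standard (constraint violation, objective) filter combined with a backtracking line search; the margin constants `gamma_theta` and `gamma_f` and all helper names are illustrative assumptions, not taken from the paper.

```python
# Sketch of a line-search filter acceptance test. A trial point is accepted
# if, against every filter entry (theta_j, f_j), it sufficiently reduces
# either the constraint violation theta or the objective f.

def acceptable(theta_trial, f_trial, filter_entries,
               gamma_theta=1e-5, gamma_f=1e-5):
    for theta_j, f_j in filter_entries:
        if not (theta_trial <= (1 - gamma_theta) * theta_j
                or f_trial <= f_j - gamma_f * theta_j):
            return False  # dominated by entry j: rejected
    return True

def filter_line_search(x, d, f, theta, filter_entries,
                       alpha0=1.0, max_iter=30):
    """Backtracking search: halve the step until the filter accepts."""
    alpha = alpha0
    for _ in range(max_iter):
        x_trial = [xi + alpha * di for xi, di in zip(x, d)]
        if acceptable(theta(x_trial), f(x_trial), filter_entries):
            return x_trial, alpha
        alpha *= 0.5
    return None, 0.0  # no acceptable step found
```

A usage example: for f(x) = x0^2 + x1^2 with constraint x0 + x1 = 1, a trial step that halves both the violation and the objective is accepted at the full step length.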

Author(s):  
Jie Guo ◽  
Zhong Wan

A new spectral three-term conjugate gradient algorithm based on the quasi-Newton equation is developed for solving large-scale unconstrained optimization problems. It is proved that the search directions in this algorithm always satisfy a sufficient descent condition independent of any line search. Global convergence is established for general objective functions when the strong Wolfe line search is used. Numerical experiments are employed to show its high performance in solving large-scale optimization problems. In particular, the developed algorithm is applied to 100 benchmark test problems from CUTE with dimensions ranging from 1000 to 10,000, in comparison with some similar methods in the literature. The numerical results demonstrate that our algorithm outperforms the state-of-the-art ones in terms of CPU time, number of iterations, and number of function evaluations.
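A three-term direction whose sufficient descent property holds independently of the line search can be illustrated with the well-known Zhang-Zhou-Li three-term PRP construction. This is an assumption for illustration only (the paper's spectral formula is not reproduced here); its defining feature is that g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds for any step length.

```python
import numpy as np

def three_term_direction(g_new, g_old, d_old):
    """Illustrative three-term direction (Zhang-Zhou-Li style PRP variant,
    not the paper's formula): d = -g_new + beta*d_old - theta*y with
    beta = g_new^T y / ||g_old||^2 and theta = g_new^T d_old / ||g_old||^2.
    By construction g_new^T d = -||g_new||^2, a sufficient descent
    condition that holds regardless of the line search used."""
    y = g_new - g_old
    denom = g_old @ g_old
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y
```

The two extra terms cancel each other exactly in the inner product with g_new, which is why the descent condition needs no assumption on the step length.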


2018, Vol 2018, pp. 1-11
Author(s):  
Xiangrong Li ◽  
Songhua Wang ◽  
Zhongzhou Jin ◽  
Hongtruong Pham

This paper presents a modified Hestenes-Stiefel (HS) conjugate gradient algorithm under the Yuan-Wei-Lu inexact line search technique for large-scale unconstrained optimization problems. The proposed algorithm has the following properties: (1) the new search direction possesses not only a sufficient descent property but also a trust region feature; (2) the presented algorithm has global convergence for nonconvex functions; (3) numerical experiments show that the new algorithm is more effective than similar algorithms.
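The two direction properties named in the abstract can be illustrated with a simple truncation of the HS coefficient. This is not the authors' formula, only a hedged sketch showing one way to enforce both sufficient descent, g^T d <= -c||g||^2, and a trust-region-type bound, ||d|| <= (1 + mu)||g||; the constants `mu` and `c` are illustrative.

```python
import numpy as np

def truncated_hs_direction(g_new, g_old, d_old, mu=0.5, c=0.1):
    """Hedged sketch (not the paper's method): clamp the Hestenes-Stiefel
    coefficient so the old-direction term cannot dominate, then restart on
    steepest descent if sufficient descent still fails."""
    y = g_new - g_old
    denom = d_old @ y
    beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
    # trust region feature: |beta| * ||d_old|| <= mu * ||g_new||
    # implies ||d|| <= (1 + mu) * ||g_new||
    cap = mu * np.linalg.norm(g_new) / max(np.linalg.norm(d_old), 1e-12)
    beta = max(-cap, min(beta, cap))
    d = -g_new + beta * d_old
    # sufficient descent: fall back to -g if g^T d > -c ||g||^2
    if g_new @ d > -c * (g_new @ g_new):
        d = -g_new
    return d
```

Both properties then hold deterministically: the clamp gives the norm bound, and the restart guarantees descent (c < 1 makes -g itself acceptable).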


Author(s):  
Fanar N. Jardow ◽  
Ghada M. Al-Naemi

Many researchers are interested in developing and improving the conjugate gradient method for solving large-scale unconstrained optimization problems. In this work, a new parameter is presented as a convex combination of the RMIL and MMWU parameters. The suggested method always produces a descent search direction at each iteration. Under the strong Wolfe-Powell (SWP) line search conditions, the global convergence of the proposed method is established. Preliminary numerical comparisons with some other CG methods show that the new method is efficient and robust in solving all given problems.
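The convex-combination construction can be sketched as follows. The RMIL formula below is the form commonly stated in the literature and is an assumption here; the MMWU coefficient and the paper's rule for choosing the mixing parameter `theta` are not reproduced, so `beta_mmwu` is left as a caller-supplied placeholder.

```python
import numpy as np

def beta_rmil(g_new, g_old, d_old):
    """RMIL coefficient as commonly stated in the literature (assumed form,
    not verified against this paper): g_new^T (g_new - g_old) / ||d_old||^2."""
    y = g_new - g_old
    return (g_new @ y) / (d_old @ d_old)

def hybrid_direction(g_new, g_old, d_old, beta_mmwu, theta):
    """Convex combination beta = (1 - theta)*beta_RMIL + theta*beta_MMWU,
    with theta in [0, 1]; beta_mmwu is a placeholder callable, since the
    MMWU formula is not reproduced here."""
    beta = ((1.0 - theta) * beta_rmil(g_new, g_old, d_old)
            + theta * beta_mmwu(g_new, g_old, d_old))
    return -g_new + beta * d_old
```

At theta = 0 the method reduces to the pure RMIL direction, and at theta = 1 to the pure MMWU direction, which is the usual sanity check for a convex-combination parameter.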


Author(s):  
Muna M. M. Ali

The self-scaling Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is very efficient for solving large-scale optimization problems. In this paper, we present a new algorithm that modifies the self-scaling BFGS method and, drawing on well-known non-monotone line search properties, we introduce and employ a new non-monotone idea. First, an update formula is proposed for the Hessian approximation that satisfies the secant condition; second, we establish the global convergence properties of the algorithm under some mild conditions, without assuming convexity of the objective function. The new algorithm achieves promising behavior, and its numerical results are also reported.
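The non-monotone idea can be illustrated with the classical Grippo-Lampariello-Lucidi acceptance rule, shown here as a generic sketch (the paper's exact non-monotone rule may differ): a step is accepted when it improves on the maximum of the last few function values, rather than on the current value alone.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_history,
                       delta=1e-4, rho=0.5, max_back=40):
    """GLL-style non-monotone Armijo backtracking (illustrative, not the
    paper's rule): accept alpha once
        f(x + alpha*d) <= max(recent f values) + delta*alpha*g^T d.
    f_history holds the last M objective values; g^T d must be negative."""
    f_ref = max(f_history)   # non-monotone reference value
    gtd = g @ d              # directional derivative
    alpha = 1.0
    for _ in range(max_back):
        if f(x + alpha * d) <= f_ref + delta * alpha * gtd:
            return alpha
        alpha *= rho
    return alpha
```

Relaxing the reference from f(x_k) to a running maximum lets the self-scaled quasi-Newton step be accepted more often, which is the usual motivation for pairing non-monotone searches with (self-scaling) BFGS updates.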


2018, Vol 7 (4.33), pp. 521
Author(s):  
Mouiyad Bani Yousef ◽  
Mustafa Mamat ◽  
Mohd Rivaie

The nonlinear conjugate gradient (CG) method is a widely used approach for solving large-scale optimization problems in many fields, such as physics, engineering, economics, and design. The efficiency of this method is mainly attributable to its global convergence properties and low memory requirements. In this paper, a new conjugate gradient coefficient is proposed based on the Aini-Rivaie-Mustafa (ARM) method. Furthermore, the proposed method is proved globally convergent under exact line search. This is supported by the results of numerical tests, which show that the new CG method performs better than related methods and is more efficient than previous CG methods.
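Exact line search, under which the convergence proof is stated, has a closed form for quadratic objectives. The sketch below shows that standard special case, assuming f(x) = (1/2) x^T A x - b^T x with A symmetric positive definite; it is a generic illustration, not tied to the ARM coefficient.

```python
import numpy as np

def exact_step(A, g, d):
    """Exact line search step for a quadratic f(x) = 0.5 x^T A x - b^T x:
    minimizing f(x + alpha*d) over alpha gives alpha = -g^T d / (d^T A d),
    where g is the gradient at x. Requires d^T A d > 0 (A positive
    definite along d)."""
    return -(g @ d) / (d @ (A @ d))
```

For general nonlinear objectives exact minimization along d is usually impractical, which is why the other abstracts on this page resort to (strong) Wolfe or Armijo-type inexact searches.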


2018, Vol 7 (2.14), pp. 25
Author(s):  
Syazni Shoid ◽  
Norrlaili Shapiee ◽  
Norhaslinda Zull ◽  
Nur Hamizah Abdul Ghani ◽  
Nur Syarafina Mohamed ◽  
...  

Many researchers aim to improve the conjugate gradient (CG) methods as well as their applications in real life. CG methods have become more interesting and useful in many disciplines and play an important role in solving large-scale optimization problems. In this paper, three new types of CG coefficients are presented, with an application to data estimation. Numerical experiments show that the proposed methods succeed in solving problems under the strong Wolfe-Powell line search conditions.


2017, Vol 59, pp. 340-362
Author(s):  
Prabhujit Mohapatra ◽  
Kedar Nath Das ◽  
Santanu Roy
