Secant Equation: Recently Published Documents


TOTAL DOCUMENTS: 28 (FIVE YEARS: 13)
H-INDEX: 7 (FIVE YEARS: 1)

Author(s): N. Boutet, R. Haelterman, J. Degroote

Working with quasi-Newton methods in optimization raises one important challenge: finding an estimate of the Hessian matrix that is as close as possible to the real matrix. While multisecant methods are regularly used to solve root-finding problems, they have been little explored in optimization because the symmetry of the Hessian estimate is generally incompatible with the multisecant property. In this paper, we propose a way to apply multisecant methods to optimization problems. Starting from the Powell-Symmetric-Broyden (PSB) update formula and adding information from previous steps of the optimization path, we develop a new update formula for the Hessian estimate. A multisecant version of PSB is, however, generally impossible to construct. For that reason, we provide one formula that satisfies symmetry exactly while coming as close as possible to satisfying the multisecant condition, and a second formula that does the reverse. Subsequently, we add enforcement of the last secant equation to the symmetric formula and present a comparison between the different methods.
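For orientation, the objects involved can be sketched in standard quasi-Newton notation (the symbols below are the usual textbook ones and are an assumption on our part, not necessarily the paper's own). The classical secant equation asks the new Hessian estimate to satisfy

\[ B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k), \]

and the PSB update is the symmetric matrix satisfying it that is closest to B_k in the Frobenius norm:

\[ B_{k+1} = B_k + \frac{(y_k - B_k s_k) s_k^T + s_k (y_k - B_k s_k)^T}{s_k^T s_k} - \frac{s_k^T (y_k - B_k s_k)}{(s_k^T s_k)^2}\, s_k s_k^T. \]

The multisecant condition collects the last m steps into S_k = [s_{k-m+1}, ..., s_k] and Y_k = [y_{k-m+1}, ..., y_k] and requires B_{k+1} S_k = Y_k. Since S_k^T B_{k+1} S_k = S_k^T Y_k would then have to be symmetric, a symmetric multisecant update can exist only when Y_k^T S_k is itself symmetric, which generally fails; this is the incompatibility the abstract refers to.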


Author(s): Yutao Zheng

In this paper, a new family of Dai-Liao-type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl., 28: 203--225, 2004] is incorporated into Dai and Liao's conjugacy condition. Under certain assumptions, we show that our methods are globally convergent for general functions with the strong Wolfe line search. Numerical results illustrate that the proposed methods can outperform some existing ones.
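For reference, a sketch of the two ingredients in common CG notation (an assumption on our part; the parameters follow the cited papers as usually stated and may differ in detail). The Dai-Liao direction is

\[ d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad \beta_k^{DL} = \frac{g_{k+1}^T (y_k - t\, s_k)}{d_k^T y_k}, \quad t \ge 0, \]

which enforces the conjugacy condition d_{k+1}^T y_k = -t\, g_{k+1}^T s_k. The Yabe-Takano modified secant equation replaces y_k by

\[ z_k = y_k + \frac{\rho\, \theta_k}{s_k^T u_k}\, u_k, \qquad \theta_k = 6(f_k - f_{k+1}) + 3 (g_k + g_{k+1})^T s_k, \]

with ρ ≥ 0 and u_k any vector satisfying s_k^T u_k ≠ 0, so that curvature information from function values enters the update; substituting z_k for y_k in β_k^{DL} yields the family studied here.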


Mathematics, 2021, Vol. 9(12), 1353
Author(s): Qi Tian, Xiaoliang Wang, Liping Pang, Mingkun Zhang, Fanyun Meng

Three-term conjugate gradient methods have attracted much attention for large-scale unconstrained problems in recent years, since they combine simple computation, low memory requirements, good descent properties, and strong global convergence. In this paper, a hybrid three-term conjugate gradient algorithm is proposed that possesses the sufficient descent property independently of any line search technique. Under some mild conditions, the proposed method is globally convergent for uniformly convex objective functions. Meanwhile, by using a modified secant equation, the proposed method is also globally convergent without any convexity assumption on the objective function. Numerical results indicate that the proposed algorithm is more efficient and reliable than the other methods on the test problems.
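As a concrete illustration of the three-term structure, here is a minimal sketch, with the caveat that the beta and theta below follow the classical Zhang-Zhou-Li PRP-type three-term direction rather than the hybrid of this paper, and a simple Armijo backtracking stands in for the strong Wolfe search:

import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    # Three-term CG sketch (Zhang-Zhou-Li PRP-type direction), not
    # the paper's hybrid method. The direction d = -g_new + beta*d - theta*y
    # satisfies g_new.dot(d) = -||g_new||^2, i.e. sufficient descent
    # holds independently of the line search.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (a strong Wolfe search would be used in practice)
        alpha, c, shrink = 1.0, 1e-4, 0.5
        fx, slope = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= shrink
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g.dot(g)
        beta = g_new.dot(y) / gg    # PRP-type coefficient
        theta = g_new.dot(d) / gg   # weight of the third term
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Example: Rosenbrock function
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(three_term_cg(rosen, rosen_grad, [-1.2, 1.0]))  # should approach [1, 1]

The design point worth noticing is the third term: with theta = g_{k+1}^T d_k / ||g_k||^2 the identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds by construction, which is exactly the line-search-independent sufficient descent property the abstract mentions.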


Two-step methods are secant-like techniques of the quasi-Newton type that, unlike the classical methods, construct nonlinear alternatives to the quantities used in the so-called secant equation. They incorporate data from the two most recent iterations, creating an alternative to the secant equation intended to yield better Hessian approximations and hence faster convergence to the minimizer of the objective function. According to numerical results reported in several research papers on the subject, such methods introduce substantial savings in both iteration and function-evaluation counts. Encouraged by this performance, we explore in this paper how to employ these methods in developing a new Conjugate Gradient (CG) algorithm; a sketch of the two-step secant relation follows below. CG methods are popular for large problems and in situations where memory resources are scarce. Numerical experiments with the new methods are encouraging and open an avenue for further investigation of such techniques and their merits in a multitude of applications.
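To make the construction concrete, the general shape of a two-step secant relation is sketched here (a generic form only; the exact coefficient differs between the published two-step variants). Instead of B_{k+1} s_k = y_k, the updated matrix is asked to satisfy

\[ B_{k+1} r_k = w_k, \qquad r_k = s_k - \delta_k\, s_{k-1}, \quad w_k = y_k - \delta_k\, y_{k-1}, \]

where the scalar δ_k comes from an interpolating curve through the three most recent iterates. Setting δ_k = 0 recovers the classical secant equation, and the nonlinear dependence of δ_k on the step data is what makes r_k and w_k the "nonlinear alternatives" mentioned above.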

