Global convergence of new modified CG method with inexact line search

2014 ◽  
Vol 16 (2) ◽  
pp. 17-26
Author(s):  
Latif S. Ivan ◽  
Mohammed J. Lajan ◽  
2012 ◽  
Vol 2012 ◽  
pp. 1-20 ◽  
Author(s):  
Meiling Liu ◽  
Xueqian Li ◽  
Qinmin Wu

A filter algorithm with inexact line search is proposed for solving nonlinear programming problems. The filter is constructed by employing the norm of the gradient of the Lagrangian function as the infeasibility measure. Transition to superlinear local convergence is shown for the proposed filter algorithm without second-order correction. Under mild conditions, global convergence can also be derived. Numerical experiments show the efficiency of the algorithm.
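The "inexact line search" these abstracts rely on can be illustrated with the classical Armijo backtracking rule, which accepts any step length giving sufficient decrease rather than an exact minimizer. This is a minimal generic sketch, not the filter algorithm of the paper; the quadratic test function and all parameter values are illustrative choices.

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, alpha0=1.0, c1=1e-4, rho=0.5, max_iter=50):
    """Accept the first alpha with f(x + alpha*d) <= f(x) + c1*alpha*grad(x)^T d,
    halving the trial step until the sufficient-decrease condition holds."""
    fx = f(x)
    slope = grad(x) @ d  # directional derivative; negative for a descent direction
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

# Illustrative use: f(x) = ||x||^2 with the steepest-descent direction.
f = lambda x: float(x @ x)
g = lambda x: 2 * x
x = np.array([1.0, -2.0])
d = -g(x)
alpha = armijo_backtracking(f, g, x, d)
```

Because the condition only asks for "sufficient" decrease, the search terminates after a few function evaluations, which is what makes inexact searches practical inside filter and CG frameworks.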


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Gonglin Yuan ◽  
Zhan Wang ◽  
Pengyuan Li

The Broyden family method is one of the most effective methods for solving unconstrained optimization problems. However, the study of the global convergence of the Broyden family method is not sufficient. In this paper, a new Broyden family method is proposed based on the BFGS formula of Yuan and Wei (Comput. Optim. Appl. 47: 237-255, 2010). The following approaches are used in the designed algorithm: (1) a modified Broyden family formula is given; (2) every matrix sequence $\{B_k\}$ generated by the new algorithm is positive definite; and (3) global convergence of the newly presented Broyden family algorithm with the Y-W-L inexact line search is obtained for general functions. Numerical performance shows that the modified Broyden family method is competitive with the classical Broyden family method.
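The positive-definiteness property claimed for the sequence $\{B_k\}$ can be seen in the standard BFGS member of the Broyden family: as long as the curvature condition $s^\top y > 0$ holds (or the update is skipped otherwise), the updated matrix stays positive definite. This is a sketch of the classical BFGS update with a curvature safeguard, not the paper's modified formula or the Y-W-L line search.

```python
import numpy as np

def bfgs_update(B, s, y, eps=1e-10):
    """Classical BFGS update of the Hessian approximation B.
    Skips the update when curvature s^T y is not sufficiently positive,
    so positive definiteness of B is preserved."""
    sy = s @ y
    if sy <= eps:
        return B  # safeguard: keep the previous positive-definite B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

# Illustrative update starting from the identity.
B = np.eye(2)
s = np.array([1.0, 0.0])   # step:      x_{k+1} - x_k
y = np.array([2.0, 0.5])   # gradient difference: g_{k+1} - g_k
B1 = bfgs_update(B, s, y)
eigs = np.linalg.eigvalsh(B1)
```

Since `s @ y = 2 > 0` here, the update is applied and both eigenvalues of the result are positive.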


Author(s):  
Chergui Ahmed ◽  
Bouali Tahar

In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies a sufficient descent condition and achieves global convergence under the inexact strong Wolfe-Powell line search. Our numerical experiments show the efficiency of the new method in solving a set of problems from the CUTEst package: the proposed formula gives excellent numerical results in CPU time, number of iterations, and number of gradient evaluations when compared with the WYL, DY, PRP, and FR methods.
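For readers unfamiliar with the FR family that methods like FRA descend from, the baseline Fletcher-Reeves iteration can be sketched as follows. This is a generic illustration using a simple Armijo backtracking step in place of the strong Wolfe-Powell search, and it is not the FRA formula of the paper.

```python
import numpy as np

def cg_fr(f, grad, x0, tol=1e-6, max_iter=200):
    """Fletcher-Reeves nonlinear CG: d_{k+1} = -g_{k+1} + beta_k * d_k
    with beta_k = ||g_{k+1}||^2 / ||g_k||^2 (the FR formula)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search (stand-in for strong Wolfe-Powell).
        alpha, fx, slope = 1.0, f(x), g @ d
        while alpha > 1e-12 and f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative run on f(x) = ||x||^2.
xs = cg_fr(lambda x: float(x @ x), lambda x: 2 * x, [3.0, -1.0])
```

Modified CG formulas such as WYL, DY, PRP, and the proposed FRA differ only in how the coefficient `beta` is computed; the surrounding iteration is the same.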


Author(s):  
Awad Abdelrahman ◽  
Osman Yousif ◽  
Mogtaba Mhammed ◽  
Murtada Elbashir

Nonlinear conjugate gradient (CG) methods are significant for solving large-scale, unconstrained optimization problems, providing vital knowledge to determine the minimum point or optimize the objective functions. Many modifications of nonlinear CG methods have been studied to improve numerical performance and to establish global convergence properties. One of these studies is the modified CG method proposed by Rivaie et al. (2015). In this paper, we modify their work in such a way that one can obtain efficient numerical performance and global convergence properties. Due to the widespread use of the strong Wolfe line search in practice, our proposed modified method adopts it. To show the performance of the modified method in practice, a numerical experiment is performed.

Keywords: unconstrained optimization; conjugate gradient method; sufficient descent property; global convergence
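The "sufficient descent property" listed among the keywords is the requirement that each search direction satisfy $g_k^\top d_k \le -c\,\|g_k\|^2$ for some constant $c > 0$; it is a standard ingredient in global convergence proofs for CG methods. A minimal check of this condition, with an illustrative constant and test vectors, can be written as:

```python
import numpy as np

def is_sufficient_descent(g, d, c=0.01):
    """Sufficient descent condition: g^T d <= -c * ||g||^2.
    Guarantees d is a descent direction with margin proportional to ||g||^2."""
    return g @ d <= -c * (g @ g)

g = np.array([1.0, -2.0])
ok_descent = is_sufficient_descent(g, -g)  # steepest descent always satisfies it
ok_ascent = is_sufficient_descent(g, g)    # an ascent direction fails it
```

Convergence analyses of modified CG methods typically verify this inequality for every direction the formula can generate, independently of the line search used.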

