Global convergence of a modified Broyden family method for nonconvex functions

2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Gonglin Yuan ◽  
Zhan Wang ◽  
Pengyuan Li

<p style='text-indent:20px;'>The Broyden family method is one of the most effective methods for solving unconstrained optimization problems. However, the global convergence of the Broyden family method has not been studied sufficiently. In this paper, a new Broyden family method is proposed based on the BFGS formula of Yuan and Wei (Comput. Optim. Appl. 47: 237-255, 2010). The designed algorithm has the following features: (1) a modified Broyden family formula is given; (2) every matrix in the sequence <inline-formula><tex-math id="M1">\begin{document}$ \{B_k\} $\end{document}</tex-math></inline-formula> generated by the new algorithm is positive definite; and (3) the global convergence of the newly presented Broyden family algorithm with the Y-W-L inexact line search is established for general functions. Numerical results show that the modified Broyden family method is competitive with the classical Broyden family method.</p>
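For context, the classical Broyden family blends the BFGS and DFP updates through a single parameter. The sketch below shows only this textbook update, not the paper's modified formula; the choice of test vectors is illustrative.

```python
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    """One classical Broyden-family update of the Hessian approximation B.

    phi = 0 gives BFGS, phi = 1 gives DFP; phi in [0, 1] is the convex
    class. The paper's modified family alters this formula (not shown).
    """
    Bs = B @ s
    sBs = s @ Bs          # curvature of B along the step s
    ys = y @ s            # must be positive to preserve definiteness
    v = y / ys - Bs / sBs
    return (B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / ys
            + phi * sBs * np.outer(v, v))
```

For any phi the update satisfies the secant equation B_{k+1} s_k = y_k, and for phi in [0, 1] it preserves positive definiteness whenever y_k^T s_k > 0.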

2009 ◽  
Vol 2009 ◽  
pp. 1-13
Author(s):  
Wanyou Cheng ◽  
Zongguo Zhang

Recently, Zhang (2006) proposed a three-term modified HS (TTHS) method for unconstrained optimization problems. An attractive property of the TTHS method is that the direction it generates is always a descent direction, independently of the line search used. To obtain the global convergence of the TTHS method, Zhang proposed a truncated TTHS method; a drawback is that the numerical performance of the truncated TTHS method is not ideal. In this paper, we prove that the TTHS method with the standard Armijo line search is globally convergent for uniformly convex problems. Moreover, we propose a new truncated TTHS method, and under suitable conditions we obtain its global convergence. Extensive numerical experiments show that the proposed method is very efficient on test problems from the CUTE library.
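The descent property mentioned above comes from how the third term cancels the contribution of the previous direction. A minimal sketch of a TTHS-style direction, assuming the HS coefficient and a cancelling third term (the exact published coefficients may differ):

```python
import numpy as np

def tths_direction(g, g_prev, d_prev):
    """Three-term HS-type direction:
    d_k = -g_k + beta*d_{k-1} - theta*y_{k-1},
    with beta, theta sharing one denominator so that
    g_k^T d_k = -||g_k||^2 holds for ANY line search.
    """
    y = g - g_prev
    dy = d_prev @ y               # shared denominator
    beta = (g @ y) / dy           # Hestenes-Stiefel coefficient
    theta = (g @ d_prev) / dy     # cancels the beta term in g^T d
    return -g + beta * d_prev - theta * y
```

Expanding g_k^T d_k, the beta and theta terms cancel exactly, leaving -||g_k||^2, which is the line-search-independent descent property the abstract refers to.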


2014 ◽  
Vol 8 (1) ◽  
pp. 218-221 ◽  
Author(s):  
Ping Hu ◽  
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone line search algorithm, and prove its global convergence. Finally, numerical experiments illustrate the effectiveness of the new non-monotone line search algorithm.
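A standard ingredient behind such rules is the Grippo-Lampariello-Lucidi non-monotone Armijo test, which accepts a step if the function value beats the maximum over recent iterates rather than the current one. The paper's combination rule is a variant not reproduced here; this is only the textbook acceptance test:

```python
def nonmonotone_armijo(f, x, d, g, history, sigma=1e-4, rho=0.5, max_iter=50):
    """GLL-style non-monotone Armijo backtracking: accept alpha when
    f(x + alpha*d) <= max(recent f-values) + sigma*alpha*g'd.
    history holds the last few objective values (monotone case: [f(x)]).
    """
    fmax = max(history)                            # non-monotone reference
    gd = sum(gi * di for gi, di in zip(g, d))      # directional derivative
    alpha = 1.0
    for _ in range(max_iter):
        xa = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xa) <= fmax + sigma * alpha * gd:
            return alpha
        alpha *= rho                               # backtrack
    return alpha
```

With history of length one this reduces to the classical Armijo rule; a longer history allows occasional increases in f, which often helps on ill-conditioned problems.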


2020 ◽  
Vol 2020 ◽  
pp. 1-15
Author(s):  
Pengyuan Li ◽  
Zhan Wang ◽  
Dan Luo ◽  
Hongtruong Pham

The BFGS method is one of the most efficient quasi-Newton methods for solving small- and medium-size unconstrained optimization problems. To explore further properties of this method, a modified two-parameter scaled BFGS method is presented in this paper. The aim of the modified scaled BFGS method is to improve the eigenvalue structure of the BFGS update. In this method, the first two terms and the last term of the standard BFGS update formula are scaled with two different positive parameters, and a new value of y_k is given. The Yuan-Wei-Lu line search is also employed. Under this line search, the modified two-parameter scaled BFGS method is globally convergent for nonconvex functions. Extensive numerical experiments show that this form of the scaled BFGS method outperforms the standard BFGS method and some similar scaled methods.
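The two-parameter scaling can be sketched as follows, assuming delta scales the first two terms and gamma the last term of the standard BFGS formula; the paper's modified y_k and its specific parameter choices are omitted, so this is illustrative only:

```python
import numpy as np

def scaled_bfgs_update(B, s, y, delta, gamma):
    """Two-parameter scaled BFGS sketch: the first two terms of the BFGS
    update are scaled by delta > 0, the last term by gamma > 0.
    delta = gamma = 1 recovers the standard BFGS update.
    """
    Bs = B @ s
    return (delta * (B - np.outer(Bs, Bs) / (s @ Bs))
            + gamma * np.outer(y, y) / (y @ s))
```

The first bracket is positive semidefinite (vanishing only along s) and the last term is positive along s whenever y^T s > 0, so the scaled update stays positive definite for any positive delta and gamma; the scaling shifts the eigenvalues of the update, which is the structure the paper sets out to improve.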


Author(s):  
Amira Hamdi ◽  
Badreddine Sellami ◽  
Mohammed Belloufi

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems; the conjugate gradient parameter [Formula: see text] is computed as a convex combination of [Formula: see text] and [Formula: see text]. Under the Wolfe line search, we prove the sufficient descent property and global convergence. Numerical results are reported to show the effectiveness of the procedure.
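The convex-combination pattern is simple to state in code. The two ingredient formulas are elided in the abstract above, so Fletcher-Reeves and Polak-Ribière are used here purely as illustrative placeholders, not as the paper's actual choices:

```python
import numpy as np

def hybrid_beta(g, g_prev, theta):
    """beta = (1 - theta)*beta_A + theta*beta_B with theta in [0, 1].
    beta_A, beta_B are placeholders (FR and PRP here); the hybridized
    formulas of the paper are not specified in the abstract.
    """
    gg_prev = g_prev @ g_prev
    beta_fr = (g @ g) / gg_prev             # Fletcher-Reeves (placeholder)
    beta_prp = (g @ (g - g_prev)) / gg_prev # Polak-Ribiere (placeholder)
    return (1.0 - theta) * beta_fr + theta * beta_prp
```

The endpoints theta = 0 and theta = 1 recover the two ingredient methods, and intermediate theta interpolates between them; hybrid methods typically choose theta per iteration to inherit the better convergence behavior of each ingredient.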


2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat ◽  
U. A. M. Roslan

A new modified three-term conjugate gradient (CG) method is presented for solving large-scale optimization problems. The idea relates to the famous Polak-Ribière-Polyak (PRP) formula. The PRP numerator plays a vital role in good numerical performance and avoids the jamming issue, yet the PRP method is not globally convergent. For the new three-term CG method, the idea is therefore to combine the PRP numerator with the denominator of any well-performing CG formula. The new three-term CG method possesses the sufficient descent condition independently of any line search. The novelty is that, under the Wolfe-Powell line search, the new modification possesses global convergence for both convex and nonconvex functions. Numerical computation with the Wolfe-Powell line search on standard optimization test functions shows the efficiency and robustness of the new modification.


2019 ◽  
Vol 38 (7) ◽  
pp. 227-231
Author(s):  
Huda Younus Najm ◽  
Eman T. Hamed ◽  
Huda I. Ahmed

In this study, we propose a new parameter for the conjugate gradient method. It is shown that the new method fulfils the sufficient descent condition under the strong Wolfe condition when an inexact line search is used. Numerical results also show that the suggested method outperforms other standard conjugate gradient methods.


2018 ◽  
Vol 7 (3.28) ◽  
pp. 84 ◽  
Author(s):  
Nurul Aini ◽  
Nurul Hajar ◽  
Mohd Rivaie ◽  
Mustafa Mamat

The conjugate gradient (CG) method is a well-known solver for large-scale unconstrained optimization problems. In this paper, a modified CG method based on the AMR* and CD methods is presented. The resulting algorithm is proved to be globally convergent under the exact line search and some mild conditions. Numerical comparisons involving the new method and four other CG methods show that the proposed method is more efficient.

