A Truncated Descent HS Conjugate Gradient Method and Its Global Convergence

2009 ◽  
Vol 2009 ◽  
pp. 1-13
Author(s):  
Wanyou Cheng ◽  
Zongguo Zhang

Recently, Zhang (2006) proposed a three-term modified HS (TTHS) method for unconstrained optimization problems. An attractive property of the TTHS method is that the direction it generates is always a descent direction, independently of the line search used. In order to obtain global convergence of the TTHS method, Zhang proposed a truncated TTHS method. A drawback is that the numerical performance of the truncated TTHS method is not ideal. In this paper, we prove that the TTHS method with the standard Armijo line search is globally convergent for uniformly convex problems. Moreover, we propose a new truncated TTHS method. Under suitable conditions, global convergence is obtained for the proposed method. Extensive numerical experiments show that the proposed method is very efficient on test problems from the CUTE library.
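The identity behind the always-descent property is easy to verify: with y_{k-1} = g_k - g_{k-1}, beta_k = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1}) and theta_k = g_k^T d_{k-1} / (d_{k-1}^T y_{k-1}), the direction d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1} satisfies g_k^T d_k = -||g_k||^2, because the two correction terms cancel. A minimal sketch of the TTHS direction with a standard Armijo backtracking search follows; the paper's truncation rule is not reproduced, and the restart on a near-zero denominator is an added safeguard, not part of the original method.

```python
import numpy as np

def tths_armijo(f, grad, x0, max_iter=500, tol=1e-6, rho=0.5, c1=1e-4):
    """Sketch of the three-term modified HS (TTHS) direction with a
    standard Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        fx, gd = f(x), g.dot(d)             # gd = -||g||^2 < 0, so Armijo terminates
        t = 1.0
        while f(x + t * d) > fx + c1 * t * gd:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        if abs(denom) > 1e-12:
            beta = g_new.dot(y) / denom     # Hestenes-Stiefel parameter
            theta = g_new.dot(d) / denom    # third-term weight
            d = -g_new + beta * d - theta * y   # satisfies g'd = -||g||^2
        else:
            d = -g_new                      # restart safeguard (an assumption)
        x, g = x_new, g_new
    return x
```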

2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Gonglin Yuan ◽  
Zhan Wang ◽  
Pengyuan Li

The Broyden family method is one of the most effective methods for solving unconstrained optimization problems. However, the global convergence theory of the Broyden family method is still incomplete. In this paper, a new Broyden family method is proposed based on the BFGS formula of Yuan and Wei (Comput. Optim. Appl. 47: 237-255, 2010). The designed algorithm uses the following approaches: (1) a modified Broyden family formula is given; (2) every matrix sequence {B_k} generated by the new algorithm is positive definite; and (3) global convergence of the new Broyden family algorithm with the Y-W-L inexact line search is obtained for general functions. Numerical performance shows that the modified Broyden family method is competitive with the classical Broyden family method.
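For context, the classical Broyden family updates the Hessian approximation as B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(s^T y) + phi (s^T B s) v v^T with v = y/(s^T y) - B s/(s^T B s), where s and y are the iterate and gradient differences; for phi in [0, 1], positive definiteness is preserved whenever s^T y > 0. The sketch below shows this classical update with a curvature safeguard; the paper's modified formula and the Y-W-L line search are not reproduced here.

```python
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    """One classical Broyden-family update of the Hessian approximation B.
    phi = 0 gives BFGS, phi = 1 gives DFP; s = x_{k+1} - x_k, y = g_{k+1} - g_k."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    # Curvature safeguard (an added assumption): skip the update if s'y is
    # not safely positive, so positive definiteness of B is preserved.
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return B
    B_bfgs = B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / sy
    v = np.sqrt(sBs) * (y / sy - Bs / sBs)
    return B_bfgs + phi * np.outer(v, v)
```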


2012 ◽  
Vol 2012 ◽  
pp. 1-9
Author(s):  
Liu JinKui ◽  
Du Xianglin

The LS method is one of the effective conjugate gradient methods for solving unconstrained optimization problems. This paper presents a modified LS method based on the classical LS method and proves strong global convergence for uniformly convex functions and global convergence for general functions under the strong Wolfe line search. Numerical experiments show that the modified LS method is very effective in practice.
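For reference, the classical LS (Liu-Storey) parameter is beta_k = g_{k+1}^T y_k / (-g_k^T d_k) with y_k = g_{k+1} - g_k. A minimal sketch under the strong Wolfe conditions follows, using SciPy's Wolfe line search; the paper's modification of the LS formula is not reproduced.

```python
import numpy as np
from scipy.optimize import line_search

def ls_cg(f, grad, x0, max_iter=1000, tol=1e-6):
    """Sketch of the classical Liu-Storey conjugate gradient method with a
    Wolfe line search (c2 = 0.1 is a typical choice for CG methods)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = g_new.dot(y) / (-g.dot(d))    # LS parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```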


Symmetry ◽  
2021 ◽  
Vol 13 (6) ◽  
pp. 970
Author(s):  
Jie Guo ◽  
Zhong Wan

In this paper, we develop two algorithms to solve nonlinear systems of symmetric equations. The first is based on modifying two Broyden–Fletcher–Goldfarb–Shanno (BFGS) methods; one of its advantages is that it is well suited to solving small-scale systems of nonlinear symmetric equations effectively. The second algorithm instead chooses its search directions by incorporating an approximation method for computing the gradients and their differences into the direction-finding scheme of the first algorithm. In essence, the second algorithm can be viewed as an extension of the conjugate gradient method recently proposed by Lv et al. for solving unconstrained optimization problems. We prove that these search directions are sufficiently descent for the approximate squared residual of the equations, independent of the line search rule used. Global convergence of the two algorithms is established under mild assumptions. The algorithms are tested on a number of benchmark problems, and the numerical results indicate that they outperform similar algorithms available in the literature.
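One standard device for systems with a symmetric Jacobian (the exact approximation scheme in the paper may differ) exploits the fact that the gradient of the merit function 0.5*||F(x)||^2 is J(x)^T F(x) = J(x) F(x), which a single extra residual evaluation can approximate without ever forming J:

```python
import numpy as np

def approx_gradient(F, x, eps=1e-7):
    """For a symmetric system F(x) = 0 (symmetric Jacobian J), the gradient
    of 0.5*||F(x)||^2 is J(x)^T F(x) = J(x) F(x); a finite difference of F
    along the direction F(x) approximates this Jacobian-vector product."""
    Fx = F(x)
    return (F(x + eps * Fx) - Fx) / eps
```

Plugging such an approximate gradient into a descent scheme is what makes these methods matrix-free.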


2014 ◽  
Vol 8 (1) ◽  
pp. 218-221 ◽  
Author(s):  
Ping Hu ◽  
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone line search algorithm, and prove its global convergence. Finally, numerical experiments illustrate the effectiveness of the new non-monotone line search algorithm.
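A common building block for such algorithms (shown here only as an illustration; the paper's combination rule is not reproduced) is the Grippo-Lampariello-Lucidi non-monotone Armijo test, which compares the trial value against the maximum of the last few function values instead of the current one:

```python
import numpy as np
from collections import deque

def nonmonotone_armijo(f, x, d, g, f_hist, c1=1e-4, rho=0.5, t0=1.0):
    """GLL-style non-monotone Armijo backtracking.  d must be a descent
    direction (g'd < 0); f_hist holds recent function values."""
    f_ref = max(f_hist)              # non-monotone reference value
    gd = g.dot(d)
    t = t0
    while f(x + t * d) > f_ref + c1 * t * gd:
        t *= rho
    return t

# Usage inside a descent loop: keep a bounded history of f-values, e.g.
#   f_hist = deque(maxlen=10); f_hist.append(f(x0))
# and append f(x_new) after each accepted step.
```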


Author(s):  
Saman Babaie-Kafaki ◽  
Saeed Rezaee

Hybridizing the trust region, line search, and simulated annealing methods, we develop a heuristic algorithm for solving unconstrained optimization problems. We conduct numerical experiments on a set of CUTEr test problems to investigate the efficiency of the suggested algorithm. The results show that the algorithm is practically promising.
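The annealing ingredient of such a hybrid is typically a Metropolis acceptance test: improvements are always accepted, while uphill moves are accepted with a probability that decays with the temperature. A minimal sketch of that test follows; how the paper combines it with the trust region and line search steps is not reproduced.

```python
import numpy as np

def anneal_accept(f_trial, f_current, temperature, rng):
    """Metropolis acceptance rule: always accept an improvement; accept an
    uphill move with probability exp(-(f_trial - f_current)/T)."""
    if f_trial <= f_current:
        return True
    return rng.random() < np.exp(-(f_trial - f_current) / temperature)

# Usage: rng = np.random.default_rng(0); cool the temperature after each
# iteration, e.g. T *= 0.95, so uphill moves become rarer over time.
```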


2019 ◽  
Vol 36 (04) ◽  
pp. 1950017 ◽  
Author(s):  
Wen-Li Dong ◽  
Xing Li ◽  
Zheng Peng

In this paper, we propose a simulated annealing-based Barzilai–Borwein (SABB) gradient method for unconstrained optimization problems. The SABB method accepts the Barzilai–Borwein (BB) step according to a simulated annealing rule; if the BB step cannot be accepted, the Armijo line search is used instead. The global convergence of the SABB method is established under some mild conditions. Numerical experiments indicate that, compared to some existing BB methods that use a nonmonotone line search technique, the SABB method performs well with high efficiency.
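A minimal sketch of the idea described in the abstract follows: try the BB step, accept it by a Metropolis-style annealing rule, and fall back to Armijo backtracking otherwise. The temperature schedule (T0, cool) and the BB1 step formula alpha = s^T s / s^T y are illustrative assumptions; the paper's exact acceptance rule may differ.

```python
import numpy as np

def sabb(f, grad, x0, T0=1.0, cool=0.9, max_iter=500, tol=1e-6,
         rho=0.5, c1=1e-4, seed=0):
    """Sketch of a simulated-annealing Barzilai-Borwein gradient method."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha, T = 1.0, T0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        f_x = f(x)
        x_bb = x - alpha * g                     # Barzilai-Borwein trial point
        f_bb = f(x_bb)
        if f_bb <= f_x or rng.random() < np.exp(-(f_bb - f_x) / T):
            x_new = x_bb                         # annealing acceptance
        else:                                    # Armijo fallback on -g
            t = 1.0
            while f(x - t * g) > f_x - c1 * t * g.dot(g):
                t *= rho
            x_new = x - t * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0   # BB1 step length
        x, g, T = x_new, g_new, cool * T
    return x
```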


Author(s):  
Amira Hamdi ◽  
Badreddine Sellami ◽  
Mohammed Belloufi

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The conjugate gradient parameter [Formula: see text] is computed as a convex combination of [Formula: see text] and [Formula: see text]. Under the Wolfe line search, we prove the sufficient descent property and global convergence. Numerical results are reported to show the effectiveness of our procedure.
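Because the two combined parameters are lost in this extraction ("[Formula: see text]"), the sketch below uses the Fletcher-Reeves and Polak-Ribiere-Polyak parameters purely as hypothetical stand-ins to illustrate the convex-combination construction beta_k = (1 - lambda_k) beta_k^A + lambda_k beta_k^B:

```python
import numpy as np

def hybrid_beta(g_new, g, lam):
    """Illustrative convex combination of two classical CG parameters
    (FR and PRP as stand-ins; the abstract's actual pair is not visible
    in this extraction).  lam must lie in [0, 1]."""
    gg = g.dot(g)
    beta_fr = g_new.dot(g_new) / gg          # Fletcher-Reeves
    beta_prp = g_new.dot(g_new - g) / gg     # Polak-Ribiere-Polyak
    return (1.0 - lam) * beta_fr + lam * beta_prp
```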


2020 ◽  
Vol 2020 (1) ◽  
Author(s):  
Shashi Kant Mishra ◽  
Geetanjali Panda ◽  
Suvra Kanti Chakraborty ◽  
Mohammad Esmael Samei ◽  
Bhagwat Ram

Variants of the Newton method are very popular for solving unconstrained optimization problems, and the study of the global convergence of the BFGS method has made good progress. The q-gradient reduces to its classical version when q approaches 1. In this paper, we propose a quantum Broyden–Fletcher–Goldfarb–Shanno algorithm in which the Hessian is constructed using the q-gradient and a descent direction is found at each iteration. The algorithm is implemented by applying the independent parameter q in the Armijo–Wolfe conditions to compute a step length that guarantees a decrease in the objective function value. Global convergence is established without any convexity assumption on the objective function. Further, the proposed method is verified on numerical test problems, and the results are depicted through performance profiles.
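The q-gradient mentioned above is built from the Jackson q-derivative, D_q f(x) = (f(qx) - f(x)) / ((q - 1) x), which tends to the classical derivative as q approaches 1. A coordinate-wise sketch follows; the finite-difference fallback at zero coordinates is an added assumption, since the q-derivative is undefined there.

```python
import numpy as np

def q_gradient(f, x, q=0.9, eps=1e-12):
    """Coordinate-wise q-gradient: the i-th component is the Jackson
    q-derivative (f(..., q*x_i, ...) - f(x)) / ((q - 1) * x_i)."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    fx = f(x)
    for i in range(len(x)):
        if abs(x[i]) < eps:            # q-derivative undefined at zero:
            h = 1e-8                   # fall back to a finite difference
            xp = x.copy(); xp[i] += h
            g[i] = (f(xp) - fx) / h
        else:
            xq = x.copy(); xq[i] = q * x[i]
            g[i] = (f(xq) - fx) / ((q - 1.0) * x[i])
    return g
```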

