Modified limited-memory Broyden-Fletcher-Goldfarb-Shanno algorithm for unconstrained optimization problem

Author(s):  
Muna M. M. Ali

The self-scaling Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is very efficient for solving large-scale optimization problems. In this paper, we present a new algorithm that modifies the self-scaling BFGS method. Drawing on the notable properties of non-monotone line searches, we also introduce and employ a new non-monotone idea. First, an update formula is proposed for the Hessian approximation that satisfies the secant condition; second, we establish the global convergence of the algorithm under some mild conditions, without a convexity assumption on the objective function. Numerical results are reported and show the promising behavior of the new algorithm.
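
As an illustration of the kind of update described above, here is a minimal sketch of a self-scaling BFGS update of the inverse Hessian approximation, using the classical Oren-Luenberger scaling factor as an assumed placeholder; the paper's modified formula is not given in the abstract, and `self_scaling_bfgs_update` is a hypothetical helper name.

```python
import numpy as np

def self_scaling_bfgs_update(H, s, y):
    """One self-scaling BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k and y = g_{k+1} - g_k. The scaling factor
    tau = (y^T s) / (y^T H y) is the classical Oren-Luenberger choice,
    used here as a stand-in for the paper's modified scaling.
    """
    sy = s @ y
    if sy <= 1e-12:              # skip the update if the curvature condition fails
        return H
    Hy = H @ y
    tau = sy / (y @ Hy)          # self-scaling factor (assumed form)
    rho = 1.0 / sy
    V = np.eye(len(s)) - rho * np.outer(s, y)
    # Scaled BFGS update: H+ = tau * V H V^T + rho * s s^T.
    # It satisfies the secant condition H+ y = s for any tau > 0.
    return tau * (V @ H @ V.T) + rho * np.outer(s, s)
```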

Filomat ◽  
2016 ◽  
Vol 30 (5) ◽  
pp. 1283-1296
Author(s):  
Keyvan Amini ◽  
Somayeh Bahrami ◽  
Shadi Amiri

In this paper, a modified BFGS algorithm is proposed to solve unconstrained optimization problems. First, based on a modified secant condition, an update formula is recommended to approximate the Hessian matrix. Then, thanks to the remarkable properties of nonmonotone line searches, an appropriate nonmonotone idea is employed. Under some mild conditions, the global convergence properties of the algorithm are established without a convexity assumption on the objective function. Preliminary numerical experiments are also reported which indicate the promising behavior of the new algorithm.
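
The nonmonotone idea mentioned here can be illustrated with the classical Grippo-Lampariello-Lucidi rule, in which the Armijo test is measured against the maximum of recent objective values rather than the current one; this is a generic sketch, not the paper's specific rule.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, c1=1e-4, shrink=0.5, max_backtracks=30):
    """Backtracking nonmonotone Armijo line search (Grippo et al. style).

    f_hist holds the most recent objective values; the sufficient-decrease
    test compares against their maximum, which lets the objective increase
    occasionally. The paper's nonmonotone rule may differ in detail.
    """
    f_ref = max(f_hist)               # reference value over recent iterates
    gd = g @ d                        # directional derivative, assumed negative
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + c1 * alpha * gd:
            break
        alpha *= shrink               # backtrack
    return alpha
```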


Author(s):  
Jie Guo ◽  
Zhong Wan

A new spectral three-term conjugate gradient algorithm, built on the quasi-Newton equation, is developed for solving large-scale unconstrained optimization problems. It is proved that the search directions in this algorithm always satisfy a sufficient descent condition independent of any line search. Global convergence is established for general objective functions when the strong Wolfe line search is used. Numerical experiments demonstrate its high performance on large-scale problems. In particular, the developed algorithm is applied to 100 benchmark test problems from CUTE with sizes from 1000 to 10,000, in comparison with similar methods from the literature. The numerical results show that our algorithm outperforms the state-of-the-art ones, requiring less CPU time, fewer iterations, or fewer function evaluations.
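
A sufficient descent condition that holds independently of the line search can be seen in the classical Zhang-Zhou-Li three-term PRP direction, sketched below for comparison; the paper's spectral three-term formula is derived from a quasi-Newton equation and differs in detail.

```python
import numpy as np

def three_term_prp_direction(g_new, g_old, d_old):
    """Three-term PRP direction (Zhang-Zhou-Li form, for illustration only).

    With beta = g+^T y / ||g||^2 and theta = g+^T d / ||g||^2, the direction
    d+ = -g+ + beta * d - theta * y satisfies g+^T d+ = -||g+||^2 exactly,
    i.e. sufficient descent independent of any line search.
    """
    y = g_new - g_old
    gg = g_old @ g_old
    beta = (g_new @ y) / gg
    theta = (g_new @ d_old) / gg
    return -g_new + beta * d_old - theta * y
```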


Author(s):  
Martin Buhmann ◽  
Dirk Siegel

Abstract We consider Broyden class updates for large-scale optimization problems in n dimensions, restricting attention to the case when the initial second derivative approximation is the identity matrix. Under this assumption we present an implementation of the Broyden class based on a coordinate transformation on each iteration. It requires only $2nk + O(k^2) + O(n)$ multiplications on the kth iteration and stores $nK + O(K^2) + O(n)$ numbers, where K is the total number of iterations. We investigate a modification of this algorithm by a scaling approach and show a substantial improvement in performance over the BFGS method. We also study several adaptations of the new implementation to the limited memory situation, presenting algorithms that work with a fixed amount of storage independent of the number of iterations. We show that one such algorithm retains the property of quadratic termination. The practical performance of the new methods is compared with the performance of Nocedal's method (Math Comput 35:773-782, 1980), which is considered the benchmark in limited memory algorithms. The tests show that the new algorithms can be significantly more efficient than Nocedal's method. Finally, we show how a scaling technique can significantly improve both Nocedal's method and the new generalized conjugate gradient algorithm.
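
Nocedal's benchmark method stores the most recent (s, y) pairs and applies the inverse Hessian approximation to a vector by the two-loop recursion in O(mn) work; a minimal sketch, assuming the identity as initial matrix as in the paper's setting:

```python
import numpy as np

def lbfgs_two_loop(g, s_list, y_list):
    """Nocedal's two-loop recursion: returns H_k @ g for the L-BFGS inverse
    Hessian built from the stored (s, y) pairs, oldest first, with H0 = I.
    """
    q = g.copy()
    rhos = [1.0 / (s @ y) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    r = q                                    # apply H0 = I
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r = r + (a - b) * s
    return r
```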


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Ahmad Alhawarat ◽  
Thoi Trung Nguyen ◽  
Ramadan Sabra ◽  
Zabidin Salleh

To find a solution of unconstrained optimization problems, we normally use a conjugate gradient (CG) method, since it does not require storing second-derivative information, unlike Newton's method or the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. Recently, a new modification of the Polak-Ribière method with a new restart condition was proposed, giving the so-called AZPRP method. In this paper, we propose a new modification of the AZPRP CG method, based on a modified restart condition, to solve large-scale unconstrained optimization problems. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe-Powell line search. The numerical results show that the new CG method is highly competitive with the CG_Descent method. The comparisons are made on a set of more than 140 standard test functions from the CUTEst library and include the number of iterations and CPU time.
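
The flavor of a PRP-type coefficient with a restart safeguard can be sketched as below, using Powell's classical restart test as an assumed stand-in; the actual AZPRP coefficient and its modified restart condition are not reproduced here.

```python
import numpy as np

def prp_plus_beta(g_new, g_old, restart_c=0.2):
    """PRP+ coefficient with a Powell-style restart test (illustrative only).

    If successive gradients are far from orthogonal, beta is reset to zero
    and the method restarts along the steepest descent direction.
    """
    if abs(g_new @ g_old) >= restart_c * (g_new @ g_new):
        return 0.0                           # restart condition triggered
    beta = (g_new @ (g_new - g_old)) / (g_old @ g_old)
    return max(beta, 0.0)                    # PRP+: truncate negative values
```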


2018 ◽  
Vol 7 (3.28) ◽  
pp. 54
Author(s):  
Yasir Salih ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Abdelrhaman Abashar ◽  
Mohamad Afendee Mohamed

The Conjugate Gradient (CG) method is a very useful technique for solving large-scale nonlinear optimization problems. In this paper, we propose a new formula for $\beta_k$, which is a hybrid of the PRP and WYL methods. This method possesses sufficient descent and global convergence properties when used with an exact line search. Numerical results indicate that the new formula is more efficient than other classical CG methods.
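
For reference, the two parent formulas that such a hybrid combines are shown below; the paper's specific combination is not given in the abstract.

```latex
\beta_k^{PRP} = \frac{g_{k+1}^{T}\left(g_{k+1}-g_{k}\right)}{\|g_{k}\|^{2}},
\qquad
\beta_k^{WYL} = \frac{g_{k+1}^{T}\left(g_{k+1}-\frac{\|g_{k+1}\|}{\|g_{k}\|}\,g_{k}\right)}{\|g_{k}\|^{2}}.
```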


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Zhujun Wang ◽  
Li Cai

We propose a class of inexact secant methods combined with the line search filter technique for solving nonlinear equality constrained optimization. Compared with other filter methods that use a line search, as applied in most large-scale optimization problems, the inexact line search filter algorithm is more flexible and practical to implement. In this paper, we focus on the analysis of the local superlinear convergence rate of the algorithms; their global convergence properties can be obtained by analogy with our previous work. These methods have been implemented in a Matlab code, and detailed numerical results indicate that the proposed algorithms are efficient on 43 problems from the CUTEr test set.
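
The filter mechanism referred to here can be sketched in its standard form: a trial point is acceptable if, against every stored filter entry, it sufficiently reduces either the constraint violation or the objective. This is the generic test used in line-search filter methods, not the paper's exact rule.

```python
def filter_acceptable(f_trial, theta_trial, filter_entries, gamma=1e-5):
    """Standard line-search filter acceptance test (illustrative sketch).

    filter_entries is a list of (theta, f) pairs, where theta measures
    constraint violation. The trial point is rejected if it is dominated
    by any entry, up to small margins controlled by gamma.
    """
    for theta, f in filter_entries:
        dominated = (theta_trial >= (1 - gamma) * theta
                     and f_trial >= f - gamma * theta)
        if dominated:
            return False
    return True
```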


2018 ◽  
Vol 7 (2.14) ◽  
pp. 21
Author(s):  
Omar Alshorman ◽  
Mustafa Mamat ◽  
Ahmad Alhawarat ◽  
Mohd Revaie

Conjugate Gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several recent studies have been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new CG parameter is proposed. The new parameter possesses global convergence properties under the Strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust than the Polak-Ribière-Polyak (PRP), Fletcher-Reeves (FR), and Wei-Yao-Liu (WYL) parameters.
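
The SWP conditions under which such convergence results are proved combine the Armijo sufficient-decrease test with a two-sided curvature bound; a minimal checker, with the usual assumption 0 < c1 < c2 < 1:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe-Powell (SWP) conditions at step length alpha.

    Sufficient decrease: f(x + a d) <= f(x) + c1 * a * g^T d, and
    curvature: |g(x + a d)^T d| <= c2 * |g(x)^T d|.
    CG convergence proofs typically take c2 < 1/2.
    """
    g0_d = grad(x) @ d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0_d)
    return armijo and curvature
```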


2018 ◽  
Vol 7 (4.33) ◽  
pp. 521
Author(s):  
Mouiyad Bani Yousef ◽  
Mustafa Mamat ◽  
Mohd Rivaie

The nonlinear conjugate gradient (CG) method is a widely used approach for solving large-scale optimization problems in many fields, such as physics, engineering, economics, and design. The efficiency of this method is mainly attributable to its global convergence properties and low memory requirement. In this paper, a new conjugate gradient coefficient is proposed based on the Aini-Rivaie-Mustafa (ARM) method. Furthermore, the proposed method is proved globally convergent under an exact line search. This is supported by numerical tests: the new CG method performs better and is more efficient than related previous CG methods.


2018 ◽  
Vol 7 (2.14) ◽  
pp. 25
Author(s):  
Syazni Shoid ◽  
Norrlaili Shapiee ◽  
Norhaslinda Zull ◽  
Nur Hamizah Abdul Ghani ◽  
Nur Syarafina Mohamed ◽  
...  

Many researchers aim to improve conjugate gradient (CG) methods and their real-life applications. CG methods have become interesting and useful in many disciplines and play an important role in solving large-scale optimization problems. In this paper, three new types of CG coefficients are presented, with an application to data estimation. Numerical experiments show that the proposed methods succeed in solving problems under the strong Wolfe-Powell line search conditions.

