Monotone and nonmonotone trust-region-based algorithms for large scale unconstrained optimization problems

2012 ◽  
Vol 54 (1) ◽  
pp. 27-43 ◽  
Author(s):  
María C. Maciel ◽  
María G. Mendonça ◽  
Adriana B. Verdiell


2014 ◽  
Vol 19 (4) ◽  
pp. 469-490 ◽  
Author(s):  
Hamid Esmaeili ◽  
Morteza Kimiaei

In this study, we propose a trust-region-based procedure for solving unconstrained optimization problems that takes advantage of the nonmonotone technique to introduce an efficient adaptive radius strategy. In our approach, the adaptive technique decreases the total number of iterations, while the structure of the nonmonotone formula helps us handle large-scale problems. The new algorithm preserves global convergence and attains quadratic convergence under suitable conditions. Preliminary numerical experiments on standard test problems indicate the efficiency and robustness of the proposed approach for solving unconstrained optimization problems.
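
Since the abstract states no formulas, the following minimal sketch shows, under generic textbook choices, how a nonmonotone acceptance test and an adaptive radius update fit into one trust-region iteration. The Cauchy-point step, the threshold eta, the window length, and the doubling/halving rule are all illustrative assumptions, not the authors' algorithm.

    import numpy as np

    def nonmonotone_tr_step(f, grad, hess, x, radius, f_hist, eta=0.1, window=10):
        # One generic nonmonotone trust-region iteration (sketch only).
        # Assumes grad(x) != 0 and f_hist is a non-empty list of recent f values.
        g, B = grad(x), hess(x)
        gnorm, gBg = np.linalg.norm(g), g @ B @ g
        # Cauchy point: minimize the quadratic model along -g inside the ball.
        tau = 1.0 if gBg <= 0 else min(gnorm**3 / (radius * gBg), 1.0)
        s = -(tau * radius / gnorm) * g
        pred = -(g @ s + 0.5 * (s @ B @ s))        # predicted model decrease
        # Nonmonotone ratio: progress is measured against the worst recent
        # value, so an occasional increase in f can still be accepted.
        rho = (max(f_hist) - f(x + s)) / pred
        if rho >= eta:
            x = x + s
            radius = 2.0 * radius if rho > 0.75 else radius   # enlarge adaptively
        else:
            radius *= 0.5                                     # shrink and retry
        f_hist = (f_hist + [f(x)])[-window:]                  # sliding window
        return x, radius, f_hist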


Algorithms ◽  
2021 ◽  
Vol 14 (8) ◽  
pp. 227
Author(s):  
Zabidin Salleh ◽  
Ghaliah Alhamzi ◽  
Ibitsam Masmali ◽  
Ahmad Alhawarat

The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems since, unlike Newton's method and its quasi-Newton approximations, it does not require second derivatives. Moreover, the conjugate gradient method can be applied in many fields, such as neural networks and image restoration. Many sophisticated two- and three-term methods have been proposed for these optimization problems. In this paper, we propose a simple, efficient, and robust conjugate gradient method. The new method is constructed from the Liu and Storey method so as to overcome its shortcomings in convergence and in the descent property. The modified method satisfies the convergence properties and the sufficient descent condition under some assumptions. The numerical results, covering the number of iterations and CPU time, show that the new method outperforms well-known CG methods such as CG-Descent 5.3, Liu and Storey, and Dai and Liao.
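
For reference, a bare-bones version of the underlying Liu and Storey iteration is sketched below. It shows only the classical base formula that the paper modifies; the modified parameter is not reproduced here, and the simple Armijo backtracking stands in for the stronger Wolfe-type searches the convergence theory requires.

    import numpy as np

    def armijo_backtracking(f, g_x, x, d, alpha=1.0, c=1e-4, shrink=0.5):
        # Simple sufficient-decrease backtracking; a stand-in for a Wolfe search.
        fx, slope = f(x), g_x @ d
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= shrink
        return alpha

    def liu_storey_cg(f, grad, x, tol=1e-6, max_iter=1000):
        # Classical Liu-Storey conjugate gradient iteration (sketch only).
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = armijo_backtracking(f, g, x, d)
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta = (g_new @ (g_new - g)) / (-(d @ g))   # Liu-Storey parameter
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x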


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Ahmad Alhawarat ◽  
Thoi Trung Nguyen ◽  
Ramadan Sabra ◽  
Zabidin Salleh

To solve unconstrained optimization problems, one normally uses a conjugate gradient (CG) method, since it does not require the memory or storage of second derivatives that Newton's method or the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method does. Recently, a modification of the Polak and Ribiere method with a new restart condition was proposed, yielding the so-called AZPRP method. In this paper, we propose a new modification of the AZPRP CG method for solving large-scale unconstrained optimization problems, based on a modified restart condition. The new parameter satisfies the descent property and the global convergence analysis under the strong Wolfe–Powell line search. The numerical results show that the new CG method is strongly competitive with the CG_Descent method. The comparisons, covering the number of iterations and CPU time, are made on a set of more than 140 standard functions from the CUTEst library.
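
The abstract does not reproduce the AZPRP parameter or the modified restart condition, so the sketch below illustrates the mechanism with stand-ins: the classical PRP+ formula guarded by Powell's well-known restart test. Inside a CG loop, the returned value feeds the usual update d_new = -g_new + beta * d.

    import numpy as np

    def prp_beta_with_restart(g_new, g_old, c=0.2):
        # Powell's restart test: when consecutive gradients are far from
        # orthogonal, discard the old direction by returning beta = 0.
        if abs(g_new @ g_old) >= c * (g_new @ g_new):
            return 0.0
        # Classical Polak-Ribiere-Polyak parameter, truncated at zero (PRP+)
        # to help preserve the descent property.
        return max((g_new @ (g_new - g_old)) / (g_old @ g_old), 0.0)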


2010 ◽  
Vol 2010 ◽  
pp. 1-9 ◽  
Author(s):  
Ming-Liang Zhang ◽  
Yun-Hai Xiao ◽  
Dangzhen Zhou

We develop a sufficient descent method for solving large-scale unconstrained optimization problems. At each iteration, the search direction is a linear combination of the gradients at the current and previous steps. An attractive property of this method is that the generated directions are always descent directions. Under some appropriate conditions, we show that the proposed method converges globally. Numerical experiments on some unconstrained minimization problems from the CUTEr library are reported, which illustrate that the proposed method is promising.
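
One simple way to guarantee that a combination of the current and previous gradients is always a descent direction is to orthogonalize the previous gradient against the current one, so the added term cannot change the slope. The construction below is illustrative only; the paper's coefficients differ.

    import numpy as np

    def descent_direction(g, g_prev, beta=1.0):
        # Project g_prev onto the orthogonal complement of g; the correction
        # then satisfies g @ correction = 0, so g @ d = -||g||^2 for any beta.
        correction = g_prev - ((g_prev @ g) / (g @ g)) * g
        return -g + beta * correction

Because g @ d = -||g||^2 holds by construction, the sufficient descent condition is satisfied regardless of the mixing weight beta.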


2018 ◽  
Vol 7 (3.28) ◽  
pp. 92
Author(s):  
Talat Alkouli ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, and an associated implementation based on an exact line search, are proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results on a set of unconstrained optimization test problems, some of them from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms other similar methods in many situations.
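
An exact line search minimizes the objective along the search direction exactly. For a quadratic objective the step length has a closed form; for a general smooth function it reduces to a one-dimensional minimization, for which scipy.optimize.minimize_scalar is a convenient generic stand-in. The sketch below is independent of the paper's particular method.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def exact_step_quadratic(A, g, d):
        # For f(x) = 0.5 x^T A x - b^T x with gradient g at the current point,
        # the minimizer of f(x + alpha * d) is alpha = -g^T d / (d^T A d).
        return -(g @ d) / (d @ A @ d)

    def exact_step_general(f, x, d):
        # General case: minimize the one-dimensional restriction numerically.
        return minimize_scalar(lambda a: f(x + a * d)).x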

