Scaled memoryless BFGS preconditioned steepest descent method for very large-scale unconstrained optimization

2009 ◽  
Vol 30 (2) ◽  
pp. 387-396 ◽  
Author(s):  
Wah June Leong ◽  
Malik Abu Hassan

2018 ◽  
Vol 7 (3.28) ◽  
pp. 72
Author(s):  
Siti Farhana Husin ◽  
Mustafa Mamat ◽  
Mohd Asrul Hery Ibrahim ◽  
Mohd Rivaie

In this paper, we develop a new search direction for the Steepest Descent (SD) method by replacing the previous search direction from the Conjugate Gradient (CG) method with the gradient from the previous step, for solving large-scale optimization problems. We also use one of the conjugate gradient coefficients as a coefficient for the matrix. Under some reasonable assumptions, we prove that the proposed method with exact line search satisfies the descent property and is globally convergent. Further, numerical results on some unconstrained optimization problems show that the proposed algorithm is promising.
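As a rough illustration of this kind of direction (the abstract's exact formula and conjugate coefficient are not given, so everything below is an assumption-laden sketch), the following modifies steepest descent by mixing the previous gradient into the direction, scaled by a Fletcher-Reeves-style coefficient, with a safeguard fallback to plain steepest descent. Armijo backtracking stands in for the exact line search used in the paper.

```python
import numpy as np

def modified_sd(f, grad, x0, tol=1e-6, max_iter=5000):
    """Steepest-descent-type iteration whose search direction also uses
    the gradient from the previous step. Illustrative sketch only; the
    coefficient and safeguard are NOT taken from the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    g_prev = None
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g_prev is None:
            d = -g
        else:
            # Fletcher-Reeves-style coefficient: an assumed stand-in for
            # the conjugate coefficient mentioned in the abstract.
            beta = (g @ g) / (g_prev @ g_prev)
            d = -g + beta * g_prev
            # Safeguard (an assumption, not from the paper): keep a
            # sufficiently descent direction, else fall back to -g.
            if g @ d > -0.5 * (g @ g):
                d = -g
        # Armijo backtracking stands in for the exact line search.
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
        g_prev, g = g, grad(x)
    return x
```

On a simple strongly convex quadratic such as f(x) = x1^2 + 10*x2^2, the iteration drives the gradient norm below the tolerance in a modest number of steps.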


2010 ◽  
Vol 2010 ◽  
pp. 1-9 ◽  
Author(s):  
Ming-Liang Zhang ◽  
Yun-Hai Xiao ◽  
Dangzhen Zhou

We develop a sufficient descent method for solving large-scale unconstrained optimization problems. At each iteration, the search direction is a linear combination of the gradients at the current and previous steps. An attractive property of this method is that the generated directions are always descent directions. Under some appropriate conditions, we show that the proposed method converges globally. Numerical experiments on some unconstrained minimization problems from the CUTEr library are reported, which illustrate that the proposed method is promising.
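One standard way to combine the current and previous gradients so that the result is provably a descent direction (a sketch under assumptions; the paper's exact construction may differ) is to scale the current-gradient term so that the identity g_k^T d_k = -||g_k||^2 holds for any mixing coefficient:

```python
import numpy as np

def descent_direction(g, g_prev, theta):
    """Linear combination of the current gradient g and the previous
    gradient g_prev, with the g-term scaled so that g @ d == -(g @ g)
    for ANY coefficient theta. Illustrative construction only; it is
    not necessarily the formula used in the paper."""
    return -(1.0 + theta * (g @ g_prev) / (g @ g)) * g + theta * g_prev

# Check the descent identity on arbitrary (seeded) vectors.
rng = np.random.default_rng(0)
g = rng.standard_normal(5)
g_prev = rng.standard_normal(5)
d = descent_direction(g, g_prev, theta=0.7)
```

Because g @ d equals -||g||^2 regardless of theta, every generated direction is a descent direction by construction, mirroring the "always descent" property claimed in the abstract.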

