Large-scale unconstrained optimization using separable cubic modeling and matrix-free subspace minimization

2019 · Vol. 75 (1) · pp. 169-205
Author(s): C. P. Brás, J. M. Martínez, M. Raydan

2014 · Vol. 2014 · pp. 1-14
Author(s): San-Yang Liu, Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition g_k^T d_k ≤ −(1 − 1/(4θ_k))‖g_k‖², with θ_k > 1/4, and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
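To make the descent condition concrete, the following is a minimal sketch, not the authors' specific methods: a conjugate gradient iteration with a weak Wolfe line search that restarts with steepest descent whenever the trial direction violates g_k^T d_k ≤ −(1 − 1/(4θ_k))‖g_k‖². The Polak–Ribière+ choice of β_k, the parameter values, and the helper names are illustrative assumptions, not taken from the paper.

```python
# Sketch only: generic CG with weak Wolfe line search and a guaranteed-descent
# safeguard. The beta formula (PR+) and all parameters are assumptions.
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection-style search for a step satisfying the weak Wolfe conditions."""
    lo, hi = 0.0, np.inf
    fx, gd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:      # Armijo condition fails
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:          # curvature condition fails
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def cg_guaranteed_descent_sketch(f, grad, x0, theta=1.0, tol=1e-6, max_iter=1000):
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ beta, used here only as a placeholder choice.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d_new = -g_new + beta * d
        # Guaranteed-descent safeguard: g^T d <= -(1 - 1/(4*theta)) * ||g||^2,
        # theta > 1/4; restart with steepest descent if violated.
        if g_new @ d_new > -(1.0 - 1.0 / (4.0 * theta)) * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

if __name__ == "__main__":
    # Example: minimize the Rosenbrock function from a standard starting point.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(cg_guaranteed_descent_sketch(f, grad, np.array([-1.2, 1.0])))
```

The steepest-descent fallback satisfies the condition trivially, since g^T(−g) = −‖g‖² ≤ −(1 − 1/(4θ))‖g‖² for any θ > 1/4; the paper's contribution, by contrast, is a family of β_k formulas that guarantee descent without such restarts.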

