GLOBAL CONVERGENCE RESULTS OF A NEW THREE-TERM MEMORY GRADIENT METHOD

2004 ◽ Vol 47 (2) ◽ pp. 63-72 ◽ Author(s): Sun Qingying ◽ Liu Xinhai
2005 ◽ Vol 22 (04) ◽ pp. 463-477 ◽ Author(s): Zhen-Jun Shi ◽ Jie Shen

We study properties of a modified memory gradient method, including its global convergence and rate of convergence. Numerical results show that modified memory gradient methods are effective for solving large-scale minimization problems.
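As a rough illustration of the memory gradient idea, the sketch below mixes the steepest-descent direction with the previous search direction. The fixed mixing weight `beta`, the Armijo backtracking, and the descent safeguard are illustrative assumptions, not the rule analysed in the paper.

```python
import numpy as np

def memory_gradient(f, grad, x0, beta=0.4, c=1e-4, tol=1e-6, max_iter=1000):
    """Minimal sketch of a memory gradient iteration: the search direction
    combines the negative gradient with the previous direction.
    beta and the Armijo backtracking are illustrative choices only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # first step: plain steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d
        t = 1.0
        while t > 1e-12 and f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        # memory term: reuse the previous direction, damped by beta
        d = -g_new + beta * d
        # safeguard: fall back to steepest descent if d is not a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        g = g_new
    return x
```

The safeguard keeps every iterate on a descent direction, which is the kind of condition the global convergence analyses of such methods typically rely on.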


2007 ◽ Vol 185 (1) ◽ pp. 681-688 ◽ Author(s): Zhensheng Yu ◽ Weiguo Zhang ◽ Baofeng Wu

2014 ◽ Vol 9 (5) ◽ pp. 999-1015 ◽ Author(s): Ioannis E. Livieris ◽ Panagiotis Pintelas

Author(s): Mompati Koorapetse ◽ P. Kaelo ◽ S. Kooepile-Reikeletseng

In this paper, a new modified Perry-type derivative-free projection method for solving large-scale nonlinear monotone equations is presented. The method combines a modified Perry conjugate gradient direction with the hyperplane projection technique. Global convergence of the proposed method is established, and preliminary numerical results show that it is promising and efficient compared with some existing methods in the literature.
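The combination of a conjugate-gradient-type direction with the hyperplane projection technique can be illustrated by the following sketch of the standard derivative-free projection framework for monotone equations F(x) = 0. The direction used here is simply -F(x_k), a stand-in for the modified Perry direction of the paper, and the line-search parameters are assumptions.

```python
import numpy as np

def hyperplane_projection(F, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Minimal sketch of a derivative-free hyperplane projection iteration
    for a monotone map F. The search direction is a placeholder; the
    line search and projection step follow the standard scheme."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                          # illustrative direction only
        # derivative-free line search: shrink alpha until
        # -F(x + alpha d)^T d >= sigma * alpha * ||d||^2
        alpha = 1.0
        while alpha > 1e-12 and -F(x + alpha * d).dot(d) < sigma * alpha * d.dot(d):
            alpha *= rho
        z = x + alpha * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            return z
        # project x onto the hyperplane {y : F(z)^T (y - z) = 0},
        # which separates x from the solution set when F is monotone
        x = x - (Fz.dot(x - z) / Fz.dot(Fz)) * Fz
    return x
```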


Author(s): H. Xu ◽ A. M. Rubinov ◽ B. M. Glover

We investigate the strict lower subdifferentiability of a real-valued function on a closed convex subset of R^n. Relations between the strict lower subdifferential, the lower subdifferential, and the usual convex subdifferential are established. Furthermore, we present necessary and sufficient optimality conditions for a class of quasiconvex minimization problems in terms of lower and strict lower subdifferentials. Finally, a descent direction method is proposed and global convergence results for the resulting algorithm are obtained.
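The descent direction method in this abstract is built from (strict) lower subdifferentials, which are not reproduced here. As a loose stand-in, the sketch below runs a projected, normalized-gradient descent with diminishing steps on a closed convex set, a common scheme for smooth quasiconvex minimization; the projection operator and step-size rule are assumptions.

```python
import numpy as np

def quasiconvex_descent(f, grad, project, x0, tol=1e-8, max_iter=2000):
    """Minimal sketch of a descent direction method on a closed convex set.
    Uses the normalized negative gradient and a diminishing step size;
    `project` maps a point back onto the feasible convex set."""
    x = project(np.asarray(x0, dtype=float))
    best_x, best_f = x.copy(), f(x)
    for k in range(1, max_iter + 1):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        step = 1.0 / np.sqrt(k)          # diminishing step size
        x = project(x - step * g / gnorm)
        fx = f(x)
        if fx < best_f:                  # track the best iterate seen so far
            best_x, best_f = x.copy(), fx
    return best_x
```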

