A new method of moving asymptotes for large-scale unconstrained optimization

2008 ◽  
Vol 203 (1) ◽  
pp. 62-71 ◽  
Author(s):  
Haijun Wang ◽  
Qin Ni

2010 ◽  
Vol 27 (01) ◽  
pp. 85-101 ◽  
Author(s):  
Hai-Jun Wang ◽  
Qin Ni

A new method of moving asymptotes for large-scale minimization subject to linear inequality constraints is discussed in this paper. At each step of the iterative process, a descent direction is obtained by solving a convex separable subproblem via a dual technique. New rules for controlling the asymptote parameters are designed using the trust-region radius and certain approximation properties, from which the global convergence of the new method is established. The numerical results show that the new method is capable of handling some large-scale problems.
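For context, here is a minimal sketch of the generic moving-asymptote subproblem, in the standard form due to Svanberg; the paper's specific trust-region-based rules for updating the asymptotes are not reproduced here. At the iterate x^{(k)}, the objective f is replaced by the convex separable approximation

f^{(k)}(x) = r^{(k)} + \sum_{j=1}^{n} \left( \frac{p_j^{(k)}}{U_j^{(k)} - x_j} + \frac{q_j^{(k)}}{x_j - L_j^{(k)}} \right), \qquad L_j^{(k)} < x_j < U_j^{(k)},

where the moving asymptotes L_j^{(k)} and U_j^{(k)} bracket x_j^{(k)}, and the coefficients p_j^{(k)}, q_j^{(k)} \ge 0 are chosen so that the approximation matches f and its gradient at x^{(k)}. Minimizing f^{(k)} over the linear inequality constraints is a convex separable problem, which is what makes the dual technique mentioned in the abstract efficient.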


Algorithms ◽  
2021 ◽  
Vol 14 (8) ◽  
pp. 227 ◽
Author(s):  
Zabidin Salleh ◽  
Ghaliah Alhamzi ◽  
Ibitsam Masmali ◽  
Ahmad Alhawarat

The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems since, unlike Newton's method and its approximations, it does not require second derivatives. Moreover, the conjugate gradient method can be applied in many fields, such as neural networks and image restoration. Many of the methods proposed for these problems rely on complicated two- or three-term search directions. In this paper, we propose a simple, efficient, and robust conjugate gradient method. The new method is constructed from the Liu and Storey method so as to overcome its convergence and descent-property problems. The new modified method satisfies the convergence properties and the sufficient descent condition under some assumptions. The numerical results, measured by the number of iterations and CPU time, show that the new method outperforms well-known CG methods such as CG-Descent 5.3, Liu and Storey, and Dai and Liao.
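As a reference point, the following is a minimal Python sketch of the classical Liu-Storey conjugate gradient iteration, which the paper modifies; the paper's specific modification is not reproduced, and the test function, starting point, and line-search parameters are illustrative assumptions.

import numpy as np

def ls_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical Liu-Storey (LS) nonlinear conjugate gradient method.

    Conjugacy parameter: beta_k = g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1}).
    Uses a simple backtracking Armijo line search (illustrative choice).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Liu-Storey conjugacy parameter
        beta = g_new @ (g_new - g) / (-(d @ g))
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is not a descent direction
        if d @ g_new >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Example usage: minimize the 2-D Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(ls_conjugate_gradient(f, grad, [-1.2, 1.0]))  # converges near (1, 1)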

