An unconstrained optimization method using nonmonotone second order Goldstein’s line search

2007 ◽  
Vol 50 (10) ◽  
pp. 1389-1400 ◽  
Author(s):  
Wen-yu Sun ◽  
Qun-yan Zhou


2014 ◽
Vol 8 (1) ◽  
pp. 218-221 ◽  
Author(s):  
Ping Hu ◽  
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems. The corresponding non-monotone line search algorithm is established and its global convergence is proved. Finally, numerical experiments illustrate the effectiveness of the new combined non-monotone line search algorithm.
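The abstract does not state the combination rule itself; as an illustration, the sketch below implements the classical Grippo-Lampariello-Lucidi nonmonotone Armijo condition that non-monotone line searches of this kind build on. The function name nonmonotone_armijo and all parameter values are illustrative assumptions, not the paper's algorithm.

import numpy as np

def nonmonotone_armijo(f, grad, x, d, f_hist, c1=1e-4, rho=0.5, max_iter=50):
    # Backtracking search accepting the first step a with
    #   f(x + a*d) <= max(recent f values) + c1 * a * grad(x)^T d,
    # i.e. the objective may rise temporarily relative to f(x).
    g_dot_d = grad(x) @ d        # directional derivative; must be negative
    f_ref = max(f_hist)          # reference value over the last M iterates
    a = 1.0
    for _ in range(max_iter):
        if f(x + a * d) <= f_ref + c1 * a * g_dot_d:
            return a
        a *= rho                 # shrink the step and retry
    return a

In a full method, f_hist is kept as a sliding window of the last M objective values (M around 10 is common); taking the maximum over the window is what permits non-monotone progress while still forcing global convergence.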


2005 ◽  
Vol 15 (2) ◽  
pp. 301-306 ◽  
Author(s):  
Nada Djuranovic-Milicic

In this paper, an algorithm for LC¹ unconstrained optimization problems that uses the second-order Dini upper directional derivative is considered. The purpose of the paper is to establish general hypotheses on the algorithm under which convergence to optimal points occurs. A convergence proof is given, as well as an estimate of the rate of convergence.
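The abstract does not restate the derivative it relies on; for an LC¹ function (continuously differentiable with a locally Lipschitz gradient), the standard definition of the second-order Dini upper directional derivative at x in direction d is the following (the paper's own notation may differ):

% Second-order Dini upper directional derivative of an LC^1 function f
f''_{D}(x;d) \;=\; \limsup_{t \downarrow 0} \frac{\big(\nabla f(x+td) - \nabla f(x)\big)^{\mathsf T} d}{t}

When f is twice continuously differentiable this reduces to the usual curvature term d^T \nabla^2 f(x)\, d.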


Filomat ◽  
2016 ◽  
Vol 30 (5) ◽  
pp. 1283-1296 ◽
Author(s):  
Keyvan Amini ◽  
Somayeh Bahrami ◽  
Shadi Amiri

In this paper, a modified BFGS algorithm is proposed to solve unconstrained optimization problems. First, based on a modified secant condition, an update formula is recommended to approximate the Hessian matrix. Then, exploiting the desirable properties of nonmonotone line searches, an appropriate nonmonotone strategy is employed. Under some mild conditions, the global convergence of the algorithm is established without any convexity assumption on the objective function. Preliminary numerical experiments are also reported and indicate the promising behavior of the new algorithm.
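The abstract names a modified secant condition without giving it; the sketch below shows how such methods typically work, pairing the standard BFGS inverse-Hessian update with a well-known modified secant vector of Wei-Li-Qi type that injects function-value information. The specific formula for theta is an assumption for illustration, not necessarily the paper's.

import numpy as np

def modified_y(s, y, f_old, f_new, g_old, g_new):
    # Wei-Li-Qi-type modified secant vector (illustrative; the paper's own
    # modification may differ). theta folds objective values into the
    # curvature information carried by y = g_new - g_old.
    theta = 2.0 * (f_old - f_new) + (g_new + g_old) @ s
    return y + (theta / (s @ s)) * s

def bfgs_inverse_update(H, s, y_hat):
    # Standard BFGS update of the inverse Hessian approximation, written
    # against the modified vector y_hat so that H_new @ y_hat = s.
    rho = 1.0 / (y_hat @ s)      # needs the curvature condition y_hat^T s > 0
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y_hat)
    return V @ H @ V.T + rho * np.outer(s, s)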


2021 ◽  
Vol 2 (1) ◽  
pp. 33 ◽
Author(s):  
Nasiru Salihu ◽  
Mathew Remilekun Odekunle ◽  
Also Mohammed Saleh ◽  
Suraj Salihu

Some problems have no analytical solution or are too difficult for scientists, engineers, and mathematicians to solve exactly, so numerical methods that obtain approximate solutions became necessary. Gradient methods are efficient when the function to be minimized has a continuous first derivative. This article therefore presents a new hybrid Conjugate Gradient (CG) method for solving unconstrained optimization problems. The method requires only first-order derivatives: it overcomes the steepest descent method's slow convergence while avoiding the storage and computation of the second-order derivatives needed by Newton's method. The CG update parameter is derived from the Dai-Liao conjugacy condition as a convex combination of the Hestenes-Stiefel and Fletcher-Reeves parameters, using an optimal modulating choice parameter that avoids matrix storage. Numerical experiments adopt an inexact line search to obtain a step size that generates the descent property, showing that the algorithm is robust and efficient. The scheme converges globally under the Wolfe line search, and methods of this kind are suitable for compressive sensing problems and M-tensor systems.
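The abstract gives the structure of the update parameter (a convex combination of Hestenes-Stiefel and Fletcher-Reeves derived from the Dai-Liao condition) but not the optimal modulating parameter itself. The sketch below therefore takes the weight theta as a caller-supplied value in [0, 1]; the formulas for the two classical parameters are standard.

import numpy as np

def hybrid_beta(g_new, g_old, d_old, theta):
    # Convex combination of the Hestenes-Stiefel and Fletcher-Reeves CG
    # parameters. The paper chooses theta optimally via the Dai-Liao
    # conjugacy condition; here it is simply an input.
    y = g_new - g_old
    beta_hs = (g_new @ y) / (d_old @ y)           # Hestenes-Stiefel
    beta_fr = (g_new @ g_new) / (g_old @ g_old)   # Fletcher-Reeves
    return theta * beta_hs + (1.0 - theta) * beta_fr

def cg_direction(g_new, g_old, d_old, theta=0.5):
    # d = -g + beta * d_old: only vectors are stored, which is the
    # "no matrix storage" property the abstract highlights.
    return -g_new + hybrid_beta(g_new, g_old, d_old, theta) * d_old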

