A Strategy to Define Adaptive Search Directions in Derivative-Free Unconstrained Optimization Algorithms

Author(s): Jose Rodriguez ◽ Ubaldo Garcia-Palomares
2019 ◽ Vol 53 (2) ◽ pp. 657-666

Author(s): Mohammad Afzalinejad

A drawback of rapidly convergent methods for unconstrained optimization, such as Newton's method, is the computational difficulty of evaluating the second derivative. In this paper, a class of methods for solving unconstrained optimization problems is proposed that implicitly applies approximations to derivatives. The class is based on a modified Steffensen method for finding roots of a function and builds a quadratic model of the objective without using the second derivative. Two computationally inexpensive methods of this kind are proposed, which use only the first derivative of the function. Derivative-free versions of these methods are also suggested for cases where gradient formulas are unavailable or difficult to evaluate. Both the theory and numerical experiments confirm the rapid convergence of this class of methods.
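The abstract above describes a Steffensen-type scheme that builds a quadratic model without second derivatives. As a rough illustration of that idea (a minimal 1-D sketch, not the paper's actual algorithm), the code below applies a Steffensen divided difference to the first-order condition g(x) = 0, so the curvature is estimated from two gradient evaluations; the names steffensen_minimize and fd_grad are hypothetical.

```python
def steffensen_minimize(grad, x0, tol=1e-10, max_iter=100):
    """Newton-like iteration on grad(x) = 0, with the curvature replaced
    by the Steffensen divided difference (grad(x + g) - grad(x)) / g,
    where g = grad(x). No second derivative is ever evaluated."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break  # first-order condition satisfied
        denom = (grad(x + g) - g) / g  # second-derivative-free curvature
        if denom == 0:
            break  # degenerate curvature estimate; give up
        x -= g / denom
    return x

def fd_grad(f, h=1e-6):
    """Central-difference gradient, for a derivative-free variant where
    a gradient formula is unavailable or hard to evaluate."""
    return lambda x: (f(x + h) - f(x - h)) / (2.0 * h)

# With the exact gradient of f(x) = (x - 2)^2:
x_star = steffensen_minimize(lambda x: 2.0 * (x - 2.0), x0=0.0)

# Derivative-free variant of the same problem:
x_star_df = steffensen_minimize(fd_grad(lambda x: (x - 2.0) ** 2), x0=0.0)
```

For a quadratic objective the divided difference recovers the constant curvature exactly, so the very first step lands on the minimizer; this mirrors the quadratic model the abstract refers to.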


2018 ◽ Vol 74 (4) ◽ pp. 611-637
Author(s): Jianfeng Liu ◽ Nikolaos Ploskas ◽ Nikolaos V. Sahinidis

2009 ◽ Vol 20 (1) ◽ pp. 172-191
Author(s): Jorge J. Moré ◽ Stefan M. Wild
