Escaping local minima with local derivative-free methods: a numerical investigation

Optimization ◽  
2021 ◽  
pp. 1-31
Author(s):  
Coralia Cartis ◽  
Lindon Roberts ◽  
Oliver Sheridan-Methven


Author(s):
Sanjoy Das ◽  
Bijaya K. Panigrahi

Real-world optimization problems are often too complex to be solved by analytical means. Evolutionary algorithms, a class of algorithms that borrow paradigms from nature, are particularly well suited to such problems. These stochastic optimization methods have become immensely popular in recent years because they are derivative-free, are less prone to getting trapped in local minima (being population-based), and have been shown to work well on many complex optimization problems. Although evolutionary algorithms have conventionally focused on optimizing single objective functions, most practical problems in engineering are inherently multi-objective in nature. Multi-objective evolutionary optimization is a relatively new and rapidly expanding area of research in evolutionary computation that addresses such problems. In this chapter, we provide an overview of some of the most significant issues in multi-objective optimization (Deb, 2001).
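As a concrete illustration of the population-based, derivative-free search this abstract describes, here is a minimal Python sketch of a (mu + lambda)-style evolutionary optimizer. The Gaussian mutation, population size, and Rastrigin test objective are illustrative assumptions, not the chapter's algorithm.

```python
import math
import random

def rastrigin(v):
    # Classic multimodal test function with many local minima.
    return sum(x * x - 10 * math.cos(2 * math.pi * x) + 10 for x in v)

def evolve(objective, dim, pop_size=20, generations=200, sigma=0.3):
    # Random initial population; no gradient information is used anywhere.
    pop = [[random.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Gaussian mutation produces one offspring from every parent.
        offspring = [[x + random.gauss(0, sigma) for x in ind] for ind in pop]
        # (mu + lambda) survivor selection: keep the best pop_size candidates.
        pop = sorted(pop + offspring, key=objective)[:pop_size]
    return pop[0], objective(pop[0])

best, value = evolve(rastrigin, dim=2)
```

Because selection acts on a whole population rather than following a single gradient trajectory, the search can step over local minima that would trap a derivative-based method.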


Author(s):  
Sunil Kumar ◽  
Deepak Kumar ◽  
Janak Raj Sharma ◽  
Ioannis K. Argyros

Abstract Many optimal-order techniques for multiple roots that use derivatives in the algorithm have been proposed in the literature. Several researchers have also tried to construct an optimal family of derivative-free methods for multiple roots, but without success. Motivated by this, we present a new optimal class of derivative-free methods for obtaining multiple roots of nonlinear functions. The procedure involves a Traub–Steffensen iteration in the first step and a Traub–Steffensen-like iteration in the second step. Efficacy is checked on a good number of relevant numerical problems, which verifies the efficient convergence of the new methods. Moreover, we find that the new derivative-free methods are just as competitive as other existing robust methods that use derivatives.
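The abstract names the Traub–Steffensen iteration as the first sub-step; below is a minimal Python sketch of that classical step, in which the derivative is replaced by a first-order divided difference. The second, Traub–Steffensen-like sub-step that makes the class optimal is not specified in the abstract and is not reproduced here; the parameter beta and the test function are illustrative.

```python
def traub_steffensen_step(f, x, beta=1.0):
    # Classical Traub-Steffensen step: replace f'(x) by the divided
    # difference f[w, x] with w = x + beta*f(x), so no derivatives are used.
    fx = f(x)
    if fx == 0.0:
        return x  # already at a root
    w = x + beta * fx
    dd = (f(w) - fx) / (w - x)  # first-order divided difference f[w, x]
    return x - fx / dd

# Illustration on f(x) = x**3 - 2 (a simple root; for a multiple root this
# basic step alone degrades to linear convergence, which is what the
# paper's second step is designed to overcome).
x = 1.5
for _ in range(8):
    x = traub_steffensen_step(lambda t: t**3 - 2.0, x)
print(x)  # ~ 2**(1/3) = 1.2599...
```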


2014 ◽  
Vol 114 ◽  
pp. 22-37 ◽  
Author(s):  
Masoud Asadollahi ◽  
Geir Nævdal ◽  
Mohsen Dadashpour ◽  
Jon Kleppe

2012 ◽  
Vol 2012 ◽  
pp. 1-15 ◽  
Author(s):  
Alicia Cordero ◽  
José L. Hueso ◽  
Eulalia Martínez ◽  
Juan R. Torregrosa

A family of derivative-free methods with seventh-order convergence for solving nonlinear equations is proposed. In these methods, several linear combinations of divided differences are used to obtain a good estimate of the derivative of the given function at the different steps of the iteration. The efficiency index of every member of the family is 1.6266 (that is, 7^(1/4): seventh-order convergence at the cost of four function evaluations per iteration). Numerical examples are used to show the performance of the presented methods on smooth and nonsmooth equations, and to compare them with other derivative-free methods, including some that are optimal fourth-order methods in the sense of the Kung–Traub conjecture.
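The quoted index can be reconstructed from the standard efficiency-index definition I = p^(1/d), where p is the convergence order and d the number of function evaluations per iteration; the snippet below checks the 1.6266 figure and compares it with the optimal fourth-order benchmark mentioned in the abstract. Reading four evaluations per iteration out of the quoted index is an inference from this formula, not a statement in the abstract.

```python
# Efficiency index I = p**(1/d): order p attained with d function
# evaluations per iteration (Ostrowski's definition).
methods = {
    "seventh-order family (p=7, d=4)": 7 ** (1 / 4),
    "optimal fourth-order, Kung-Traub (p=4, d=3)": 4 ** (1 / 3),
    "Steffensen / Newton (p=2, d=2)": 2 ** (1 / 2),
}
for name, index in methods.items():
    print(f"{name}: {index:.4f}")
# Output: 1.6266, 1.5874, 1.4142 -- the first matches the abstract's figure.
```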


2011 ◽  
Author(s):  
Alicia Cordero ◽  
José L. Hueso ◽  
Eulalia Martínez ◽  
Juan R. Torregrosa ◽  
Theodore E. Simos ◽  
...  
