Nelder-Mead Evolutionary Hybrid Algorithms

Author(s):  
Sanjoy Das

Real-world optimization problems are often too complex to be solved through analytical means. Evolutionary algorithms are a class of algorithms that borrow paradigms from nature to address such problems. These are stochastic optimization methods that maintain a population of individual solutions, each corresponding to a point in the search space of the problem. They have become immensely popular because they are derivative-free techniques, are less prone to getting trapped in local minima, and can be tailored to suit any given problem. The performance of evolutionary algorithms can be improved further by adding a local search component. The Nelder-Mead simplex algorithm (Nelder & Mead, 1965) is a simple local search method that has been routinely applied to improve the search process in evolutionary algorithms, and such a strategy has met with great success. In this article, we provide an overview of the various strategies that have been adopted to hybridize the Nelder-Mead simplex with two well-known evolutionary algorithms: genetic algorithms (GA) and particle swarm optimization (PSO).
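Below is a minimal Python sketch, not the article's implementation, of one Nelder-Mead step (reflection, expansion, contraction, shrink) applied as a local-search refinement inside an evolutionary loop; the objective function, the standard coefficient values, and the elite-replacement scheme are illustrative assumptions.

```python
# One Nelder-Mead step used to refine the fittest individuals of a population.
import numpy as np

def sphere(x):
    return float(np.dot(x, x))

def nelder_mead_step(simplex, f, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Apply one reflection/expansion/contraction/shrink step to a simplex."""
    simplex = sorted(simplex, key=f)                    # best point first
    centroid = np.mean(simplex[:-1], axis=0)            # centroid excluding the worst point
    worst = simplex[-1]
    reflected = centroid + alpha * (centroid - worst)
    if f(reflected) < f(simplex[0]):                    # better than the best: try to expand
        expanded = centroid + gamma * (reflected - centroid)
        simplex[-1] = expanded if f(expanded) < f(reflected) else reflected
    elif f(reflected) < f(simplex[-2]):                 # better than second worst: accept reflection
        simplex[-1] = reflected
    else:                                               # otherwise contract, else shrink toward best
        contracted = centroid + rho * (worst - centroid)
        if f(contracted) < f(worst):
            simplex[-1] = contracted
        else:
            simplex = [simplex[0] + sigma * (p - simplex[0]) for p in simplex]
    return simplex

# Hybridization idea (illustrative): periodically build a simplex from the n+1
# fittest individuals and put the refined simplex back into the population.
population = [np.random.uniform(-5, 5, 2) for _ in range(20)]
elite = sorted(population, key=sphere)[:3]              # n + 1 = 3 points for n = 2
elite = nelder_mead_step(elite, sphere)
```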

2012
Vol 2012
pp. 1-12
Author(s):
Michel Vasquez
Nicolas Zufferey

Many optimization problems (from academia or industry) require the use of a local search to find a satisfactory solution in a reasonable amount of time, even if optimality is not guaranteed. Usually, local search algorithms operate in a search space that contains complete solutions (feasible or not) to the problem. In contrast, in Consistent Neighborhood Search (CNS), after each variable assignment the conflicting variables are deleted to keep the partial solution feasible, and the search can stop when all the variables have a value. In this paper, we formally propose a new heuristic solution method, CNS, whose search behavior lies between exhaustive tree search and local search working with complete solutions. We then discuss, from a unified viewpoint, the success of several existing heuristics that can be considered within the CNS framework, in various fields: graph coloring, frequency assignment in telecommunication networks, vehicle fleet management with maintenance constraints, and satellite range scheduling. Moreover, we draw some lessons that provide guidelines for adapting CNS to other problems.
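The following sketch illustrates the CNS idea on graph k-coloring, one of the applications discussed in the paper: assign a value to one variable, then delete conflicting assignments so the partial solution stays feasible, and stop once every variable has a value. The greedy color choice, tie-breaking, and step limit are illustrative assumptions rather than the exact heuristics of the referenced works.

```python
import random

def cns_coloring(adj, k, max_steps=100_000, seed=0):
    """adj: dict vertex -> set of neighbours; return a proper k-coloring or None."""
    rng = random.Random(seed)
    color = {}                                    # partial assignment, always conflict-free
    for _ in range(max_steps):
        uncolored = [v for v in adj if v not in color]
        if not uncolored:
            return color                          # all variables have a value: done
        v = rng.choice(uncolored)
        # greedily pick the colour with the fewest conflicts (ties broken at random)
        conflicts = [sum(1 for u in adj[v] if color.get(u) == c) for c in range(k)]
        best = min(conflicts)
        c = rng.choice([col for col in range(k) if conflicts[col] == best])
        color[v] = c                              # assign ...
        for u in adj[v]:                          # ... then repair: drop conflicting neighbours
            if color.get(u) == c:
                del color[u]
    return None

# Tiny example: a 5-cycle is 3-colorable.
cycle5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
print(cns_coloring(cycle5, 3))
```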


2014
Vol 2014
pp. 1-23
Author(s):
Martins Akugbe Arasomwan
Aderemi Oluyinka Adewumi

A new local search technique is proposed and used to improve the performance of particle swarm optimization algorithms by addressing the problem of premature convergence. In the proposed local search technique, a potential particle position in the solution search space is collectively constructed by a number of randomly selected particles in the swarm. The number of times the selection is made varies with the dimension of the optimization problem, and each selected particle donates the value at a randomly selected dimension of its personal best. After the potential particle position is constructed, a local search is performed in its neighbourhood and the result is compared with the current swarm global best position; it replaces the global best position if it is found to be better, and otherwise no replacement is made. Using some well-studied benchmark problems with low and high dimensions, numerical simulations were used to validate the performance of the improved algorithms. Comparisons were made with four PSO variants, two of which implement different local search techniques while the other two do not. Results show that the improved algorithms obtain better-quality solutions and demonstrate better convergence speed and precision, stability, robustness, and global-local search ability than the competing variants.
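A minimal sketch of the described local search follows. It interprets the construction as one donation per dimension from the personal best of a randomly chosen particle, followed by a bounded random perturbation around the candidate; the perturbation radius, trial count, and variable names are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def collective_local_search(pbest, gbest, f, radius=0.1, trials=10, rng=None):
    """pbest: (n_particles, dim) personal-best positions; gbest: current global best."""
    rng = rng or np.random.default_rng()
    n, dim = pbest.shape
    candidate = gbest.copy()
    for d in range(dim):                          # one donation per dimension
        donor = rng.integers(n)                   # randomly selected particle ...
        candidate[d] = pbest[donor, d]            # ... donates its personal-best value
    best_x, best_f = gbest, f(gbest)
    for _ in range(trials):                       # local search around the candidate
        probe = candidate + rng.uniform(-radius, radius, dim)
        if f(probe) < best_f:
            best_x, best_f = probe, f(probe)
    return best_x                                 # replaces gbest only if better

# Example on the sphere function.
f = lambda x: float(np.dot(x, x))
pbest = np.random.uniform(-1, 1, (30, 5))
gbest = min(pbest, key=f)
gbest = collective_local_search(pbest, gbest, f)
```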


Author(s):
Sanjoy Das
Bijaya K. Panigrahi

Real-world optimization problems are often too complex to be solved through analytical means. Evolutionary algorithms, a class of algorithms that borrow paradigms from nature, are particularly well suited to address such problems. These algorithms are stochastic optimization methods that have become immensely popular recently because they are derivative-free, are less prone to getting trapped in local minima (as they are population based), and have been shown to work well for many complex optimization problems. Although evolutionary algorithms have conventionally focused on optimizing single objective functions, most practical problems in engineering are inherently multi-objective in nature. Multi-objective evolutionary optimization is a relatively new and rapidly expanding area of research in evolutionary computation that looks at ways to address these problems. In this chapter, we provide an overview of some of the most significant issues in multi-objective optimization (Deb, 2001).
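Pareto dominance is the basic comparison underlying multi-objective evolutionary optimization: one solution dominates another if it is no worse in every objective and strictly better in at least one. A small illustration (assuming minimization):

```python
def dominates(a, b):
    """Return True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

print(pareto_front([(1, 5), (2, 2), (3, 1), (4, 4)]))   # (4, 4) is dominated by (2, 2)
```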


2016
Vol 11 (1)
pp. 3-12
Author(s):
Maolong Xi
Xiaojun Wu
Xinyi Sheng
Jun Sun
Wenbo Xu

Quantum-behaved particle swarm optimization, which was motivated by an analysis of particle swarm optimization and quantum systems, has shown performance comparable to other evolutionary algorithms in finding the optimal solutions of many optimization problems. To address the problem of premature convergence, a local search strategy is proposed to improve the performance of quantum-behaved particle swarm optimization. In the proposed local search strategy, a super particle is constructed as a collection of dimension information drawn from randomly selected particles in the swarm. The selection probability of each particle differs and is determined by its fitness value: for minimization problems, the smaller a particle's fitness value, the higher its selection probability and the more information it contributes to constructing the super particle. In addition, in order to investigate the influence of different local search spaces on algorithm performance, four methods of computing the local search radius are applied in the local search strategy, yielding four variants of local search quantum-behaved particle swarm optimization. Empirical studies on a suite of well-known benchmark functions are undertaken in order to make an overall performance comparison between the proposed methods and other quantum-behaved particle swarm optimization algorithms. The simulation results show that the proposed variants outperform the original quantum-behaved particle swarm optimization.
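A minimal sketch of the super-particle construction described above follows; the inverse-rank selection weights and the single fixed local-search radius are illustrative assumptions, whereas the paper itself studies four different ways of setting the radius.

```python
import numpy as np

def build_super_particle(positions, fitness, rng=None):
    """positions: (n, dim) particle positions; fitness: (n,) values to be minimized."""
    rng = rng or np.random.default_rng()
    n, dim = positions.shape
    ranks = np.argsort(np.argsort(fitness))          # rank 0 = smallest fitness (best)
    weights = (n - ranks).astype(float)              # better fitness -> larger weight
    prob = weights / weights.sum()
    super_particle = np.empty(dim)
    for d in range(dim):
        donor = rng.choice(n, p=prob)                # fitness-biased selection
        super_particle[d] = positions[donor, d]      # donated dimension information
    return super_particle

def local_search(super_particle, f, radius=0.05, trials=20, rng=None):
    """Search a small neighbourhood of the super particle (fixed radius assumed)."""
    rng = rng or np.random.default_rng()
    best_x, best_f = super_particle, f(super_particle)
    for _ in range(trials):
        probe = super_particle + rng.uniform(-radius, radius, super_particle.size)
        if f(probe) < best_f:
            best_x, best_f = probe, f(probe)
    return best_x
```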


2015
Vol 14 (04)
pp. 215-233
Author(s):
R. Gayatri
N. Baskar

Evolutionary computation is one of the important problem-solving methods frequently used by researchers. The choice of an algorithm to optimize a problem is often determined by the researcher's familiarity with and confidence in that technique. To overcome the limitations of individual algorithms and to achieve synergistic effects, fusion or hybridization of two or more algorithms is carried out. Hybrid algorithms have gained popularity because there is no evidence that a universal optimal strategy exists for solving optimization problems. In this work, a hybrid algorithm called the hybrid genetic simulated swarm (HGSS) algorithm is proposed to optimize the parameters of a multi-pass turning operation. The HGSS algorithm is a fusion of the genetic algorithm (GA), simulated annealing (SA) and particle swarm optimization (PSO). The objectives of this work are (i) to explore and exploit the problem search space through hybridization, and (ii) to show that a proficient hybridization of evolutionary algorithms (EAs) yields an efficient means of solving optimization problems. The individual algorithms GA, SA and PSO are also applied to optimize the parameters, and their results are compared with HGSS. The proposed HGSS is found to be more effective than the other algorithms.
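The paper's HGSS pseudocode is not reproduced here; the sketch below only illustrates, under stated assumptions, one generic way of fusing GA-style mutation, SA-style acceptance, and PSO-style velocity updates in a single loop. All parameter values and the cooling schedule are illustrative.

```python
import numpy as np

def hybrid_ga_sa_pso(f, dim=2, n=20, iters=200, w=0.7, c1=1.5, c2=1.5,
                     mut_rate=0.1, temp0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))
    v = np.zeros((n, dim))
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for t in range(iters):
        temp = temp0 * (1 - t / iters) + 1e-9                       # SA cooling schedule
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # PSO velocity update
        x = x + v
        mutate = rng.random((n, dim)) < mut_rate                    # GA-style mutation
        x = np.where(mutate, x + rng.normal(0, 0.5, (n, dim)), x)
        fx = np.array([f(p) for p in x])
        for i in range(n):                                          # SA-style acceptance
            delta = fx[i] - pbest_f[i]
            if delta < 0 or rng.random() < np.exp(-delta / temp):
                pbest[i], pbest_f[i] = x[i].copy(), fx[i]
        if pbest_f.min() < f(gbest):
            gbest = pbest[pbest_f.argmin()].copy()
    return gbest, f(gbest)

print(hybrid_ga_sa_pso(lambda p: float(np.dot(p, p))))
```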


Author(s):
Ravichander Janapati
Ch. Balaswamy
K. Soundararajan

Localization, that is, finding the exact position of a node, is a key research area in wireless sensor networks, and many algorithms have been proposed for it. Here we consider a cooperative localization algorithm with censoring schemes based on the Cramér-Rao bound (CRB). This censoring scheme can improve positioning accuracy and reduce computational complexity, traffic, and latency. Particle swarm optimization (PSO) is a population-based search algorithm built on swarm intelligence, mimicking the social behavior of birds, bees, or schools of fish. To improve algorithm efficiency and localization precision, this paper presents an objective function based on the normal distribution of the ranging error and a method of obtaining the search space of the particles. Distributed localization of wireless sensor networks is proposed using PSO with the best censoring technique based on the CRB. The proposed method shows better results in terms of position accuracy, latency, and complexity.
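A minimal sketch of a ranging-error objective of the kind described above: with zero-mean Gaussian ranging noise, maximizing the likelihood of the measured anchor distances is equivalent to minimizing the noise-weighted squared residuals. The anchor layout, noise level, and grid check are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def ranging_objective(pos, anchors, measured, sigma):
    """Negative log-likelihood (up to a constant) of the candidate node position."""
    est = np.linalg.norm(anchors - pos, axis=1)          # predicted anchor distances
    return float(np.sum((measured - est) ** 2) / (2 * sigma ** 2))

# Example: locate a node from three noisy anchor ranges.
rng = np.random.default_rng(1)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_pos = np.array([3.0, 4.0])
sigma = 0.1
measured = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, sigma, 3)
# Any minimizer (e.g. PSO) can now search the 2-D space for the position that
# minimizes ranging_objective; a brute-force grid check suffices to illustrate.
grid = [(x, y) for x in np.linspace(0, 10, 101) for y in np.linspace(0, 10, 101)]
best = min(grid, key=lambda p: ranging_objective(np.array(p), anchors, measured, sigma))
print(best)   # close to (3, 4)
```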


Author(s):
Umit Can
Bilal Alatas

Classical optimization algorithms are not efficient at solving complex search and optimization problems, and a number of heuristic optimization algorithms have therefore been proposed. In this paper, association rules are mined from numerical databases with the Gravitational Search Algorithm (GSA) for the first time. GSA is designed as a search method for quantitative association rules over the database, which is regarded as the search space. Furthermore, GSA eliminates the difficult task of determining minimum confidence and support values for every database. The fitness function used for GSA is also very flexible: depending on the problem of interest, parameters can be removed from or added to it. The ranges of the attributes are adjusted automatically while the rules are mined, so no pre-processing of the data is required. The attribute interaction problem is also eliminated by the designed GSA. GSA has been tested on four real databases and promising results have been obtained. GSA appears to be an effective search method for complex data mining tasks such as numerical sequential pattern mining, numerical classification rule mining, and clustering rule mining.
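The paper's exact rule encoding and fitness are not reproduced here; the sketch below only illustrates the kind of interval-based quantitative rule and support/confidence fitness that a GSA-style search could evaluate. The weighting of support and confidence is an illustrative assumption.

```python
import numpy as np

def rule_fitness(data, antecedent, consequent, w_s=0.4, w_c=0.6):
    """data: (n_rows, n_attrs); antecedent/consequent: {attr_index: (low, high)} intervals."""
    def covers(intervals):
        mask = np.ones(len(data), dtype=bool)
        for attr, (low, high) in intervals.items():
            mask &= (data[:, attr] >= low) & (data[:, attr] <= high)
        return mask
    a = covers(antecedent)
    both = a & covers(consequent)
    support = both.mean()                                     # fraction of rows matching the whole rule
    confidence = both.sum() / a.sum() if a.sum() else 0.0     # fraction of antecedent rows matching it
    return w_s * support + w_c * confidence

# Example: rule "attr0 in [0, 0.5] => attr1 in [0.5, 1]" on random data.
data = np.random.default_rng(0).random((1000, 2))
print(rule_fitness(data, {0: (0.0, 0.5)}, {1: (0.5, 1.0)}))
```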


2021
Vol 11 (3)
pp. 1286
Author(s):
Mohammad Dehghani
Zeinab Montazeri
Ali Dehghani
Om P. Malik
Ruben Morales-Menendez
...

Population-based optimization algorithms inspired by nature are among the most powerful tools for solving optimization problems. These algorithms find a solution by searching the search space stochastically, and their central design ideas are derived from natural phenomena, the behavior and living conditions of organisms, the laws of physics, and so on. A new population-based optimization algorithm called the Binary Spring Search Algorithm (BSSA) is introduced to solve optimization problems. BSSA is based on a simulation of Hooke's law for a traditional system of weights and springs, in which the population consists of weights connected by individual springs. The mathematical model of the proposed algorithm is presented and used to obtain solutions to optimization problems. The results were thoroughly validated on different unimodal and multimodal functions, and BSSA was compared with high-performance algorithms: the binary grasshopper optimization algorithm, binary dragonfly algorithm, binary bat algorithm, binary gravitational search algorithm, binary particle swarm optimization, and binary genetic algorithm. The results show the superiority of BSSA, and the results of the Friedman test corroborate that BSSA is more competitive.
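A hedged sketch of the underlying idea, not the authors' exact formulation: each agent is pulled toward better agents by Hooke's-law spring forces, and in a binary variant the resulting pull is mapped through a transfer function to a bit-flip probability. The spring constants and the tanh transfer used here are assumptions.

```python
import numpy as np

def binary_spring_step(pop, fitness, rng=None):
    """pop: (n, dim) matrix of 0/1 bits; fitness: (n,) values to be minimized."""
    rng = rng or np.random.default_rng()
    n, dim = pop.shape
    # spring constants: stiffer springs attached to fitter agents (assumption)
    k = fitness.max() - fitness + 1e-12
    k = k / k.sum()
    new_pop = pop.copy()
    for i in range(n):
        # net "force" on agent i: sum of spring pulls toward every other agent
        force = np.zeros(dim)
        for j in range(n):
            if j != i:
                force += k[j] * (pop[j] - pop[i])      # Hooke: force = k * displacement
        pull = np.tanh(force)                          # transfer function, values in (-1, 1)
        move = rng.random(dim) < np.abs(pull)          # stronger pull -> more likely to move
        target = (pull > 0).astype(int)                # move each bit toward the pulling agents
        new_pop[i] = np.where(move, target, pop[i])
    return new_pop

# Toy usage: minimize the number of ones in an 8-bit string.
rng = np.random.default_rng(0)
pop = rng.integers(0, 2, (10, 8))
fitness = pop.sum(axis=1).astype(float)
pop = binary_spring_step(pop, fitness, rng)
```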


2018
Vol 2018
pp. 1-15
Author(s):
Octavio Camarena
Erik Cuevas
Marco Pérez-Cisneros
Fernando Fausto
Adrián González
...

The Locust Search (LS) algorithm is a swarm-based optimization method inspired by the natural behavior of the desert locust. LS incorporates two distinctive nature-inspired search mechanisms, namely the solitary-phase and social-phase operators. These search schemes allow LS to overcome some of the difficulties that commonly affect similar methods, such as premature convergence and lack of diversity in the solutions. Recently, computer vision experiments in insect tracking have led to the development of more accurate locust motion models than those produced by simple behavioral observation. The most distinctive characteristic of these new models is the use of probabilities to emulate the locust decision process. In this paper, a modification of the original LS algorithm, referred to as LS-II, is proposed to better handle global optimization problems. In LS-II, the locust motion model of the original algorithm is modified to incorporate the main characteristics of the new biological formulations. As a result, LS-II improves the original algorithm's capacity for exploration and exploitation of the search space. To test its performance, the proposed LS-II method is compared against several state-of-the-art evolutionary methods on a set of benchmark functions and engineering problems. Experimental results demonstrate the superior performance of the proposed approach in terms of solution quality and robustness.
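The exact LS-II motion model is not reproduced here; the sketch below only conveys, under stated assumptions, the general idea of a probabilistic phase decision: the more crowded a locust's neighbourhood, the more likely it behaves socially (attraction toward a better neighbour), otherwise it explores on its own. The radius, step size, and crowding response are illustrative.

```python
import numpy as np

def locust_step(pop, fitness, radius=1.0, step=0.2, rng=None):
    """pop: (n, dim) real-valued positions; fitness: (n,) values to be minimized."""
    rng = rng or np.random.default_rng()
    n, dim = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        dist = np.linalg.norm(pop - pop[i], axis=1)
        neighbours = np.where((dist > 0) & (dist < radius))[0]
        p_social = 1.0 - np.exp(-len(neighbours))               # crowding raises the social probability
        if neighbours.size > 0 and rng.random() < p_social:
            best = neighbours[np.argmin(fitness[neighbours])]
            new_pop[i] = pop[i] + step * (pop[best] - pop[i])   # social phase: follow best neighbour
        else:
            new_pop[i] = pop[i] + rng.normal(0, step, dim)      # solitary phase: random exploration
    return new_pop
```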

