Discovery of Emergent Sorting Behavior using Swarm Intelligence and Grid-Enabled Genetic Algorithms

Author(s):  
Dimitris Kalles ◽  
Alexis Kaporis ◽  
Vassiliki Mperoukli ◽  
Anthony Chatzinouskas

In this chapter, the authors use simple local comparison-and-swap operators and demonstrate that their repeated application converges to sorted sequences across a range of variants, most of which are also genetically evolved. They experimentally validate quadratic run-time behavior for emergent sorting, suggesting that not knowing in advance which direction to sort, and allowing that direction to emerge, imposes an n/log n penalty over conventional techniques. The authors validate the emergent sorting algorithms by genetically searching for the most favorable parameter configuration on a grid infrastructure.
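The core idea can be illustrated with a toy sketch: repeatedly apply a local comparison-and-swap to random adjacent pairs until order emerges. This is only a minimal illustration of the local-operator principle, not the authors' actual operators or their evolved direction-of-sort mechanism:

```python
import random

def emergent_sort(seq, max_steps=1_000_000):
    """Toy emergent sorting: pick a random adjacent pair and swap it
    if out of order, until the whole sequence is sorted.  Each swap
    removes one inversion, so order emerges from purely local steps;
    expected work is quadratic in the sequence length."""
    a = list(seq)
    for _ in range(max_steps):
        if all(a[i] <= a[i + 1] for i in range(len(a) - 1)):
            return a                      # global order has emerged
        i = random.randrange(len(a) - 1)  # local comparison site
        if a[i] > a[i + 1]:
            a[i], a[i + 1] = a[i + 1], a[i]  # local swap
    return a

print(emergent_sort([5, 3, 1, 4, 2]))  # → [1, 2, 3, 4, 5]
```

No process ever sees more than two neighboring elements, which is what makes the sortedness an emergent, rather than programmed, property.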

Computing ◽  
1981 ◽  
Vol 26 (1) ◽  
pp. 1-7 ◽  
Author(s):  
L. Devroye ◽  
T. Klincsek

2017 ◽  
Vol 26 (06) ◽  
pp. 1750022 ◽  
Author(s):  
Tzanetos Alexandros ◽  
Dounias Georgios

In the last decade, a new variety of nature-inspired optimization algorithms has appeared. After the swarm-based models, researchers turned for inspiration to natural phenomena and the laws of science. In this way a new category of algorithms was born, equally effective as, or sometimes even superior to, known optimization algorithms such as genetic algorithms and swarm intelligence schemes. The present survey depicts the evolution of research on nature-inspired optimization algorithms related to physical phenomena and laws of science, and discusses the possibilities of using the presented approaches in a number of different applications. An attempt has been made to draw conclusions on which algorithm could be used in which problem areas, for those approaches where this information could be extracted from the papers studied. The paper also underlines the usage of this kind of nature-inspired algorithm in industrial research problems, owing to its better handling of optimization problems represented with nodes and edges.


2008 ◽  
Vol 19 (01) ◽  
pp. 1-13 ◽  
Author(s):  
REEMA MAHAJAN ◽  
DIETER KRANZLMÜLLER ◽  
JENS VOLKERT ◽  
ULRICH H. E. HANSMANN ◽  
SIEGFRIED HÖFINGER

Profiling tools such as gprof and ssrun are used to analyze the run-time performance of a scientific application. The profiling is done in serial and in parallel mode, using MPI as the communication interface. The application is a quantum chemistry program based on Hartree-Fock theory and Pulay's DIIS method. An extensive set of test cases is taken into account in order to reach uniform conclusions. A known problem with decreased parallel scalability can thus be narrowed down to a single subroutine responsible for the reduction in speed-up. The critical module is analyzed and a typical pitfall with triple matrix multiplications is identified. After the critical subroutine is overhauled, re-examination of the run-time behavior shows significantly improved performance and markedly improved parallel scalability. The lessons learned here may be of interest to others working in similar fields on similar problems.
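The abstract does not spell out the pitfall, but a classic one with triple matrix products is evaluation order: for A (m×k), B (k×n), C (n×p), computing (AB)C and A(BC) give the same result while their floating-point cost can differ by orders of magnitude. A sketch with illustrative shapes (the actual matrices in the paper are not specified here):

```python
def flops(m, k, n):
    """Scalar multiplications in an (m x k) @ (k x n) product."""
    return m * k * n

# hypothetical shapes chosen to make the effect visible
m, k, n, p = 1000, 1000, 1000, 10

# (A @ B) @ C: builds a large (m x n) intermediate first
cost_left = flops(m, k, n) + flops(m, n, p)
# A @ (B @ C): builds a small (k x p) intermediate first
cost_right = flops(k, n, p) + flops(m, k, p)

print(cost_left, cost_right)  # → 1010000000 20000000
```

Here the right-associated order is about 50× cheaper; a profiler pointing at one hot subroutine, as in this study, is often how such a misordering is found.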


Author(s):  
Michael Perscheid ◽  
Bastian Steinert ◽  
Robert Hirschfeld ◽  
Felix Geller ◽  
Michael Haupt

Robotica ◽  
2008 ◽  
Vol 26 (2) ◽  
pp. 205-217 ◽  
Author(s):  
Nosan Kwak ◽  
Gon-Woo Kim ◽  
Beom-Hee Lee

SUMMARY

The state-of-the-art FastSLAM algorithm has been shown to suffer from a particle depletion problem while performing simultaneous localization and mapping for mobile robots. As a result, it produces increasingly over-confident estimates of uncertainty as time progresses. This particle depletion problem is mainly due to the resampling process in FastSLAM, which tends to eliminate particles with low weights. The number of particles available for loop closure therefore decreases, which degrades the performance of FastSLAM. The resampling process has not been thoroughly analyzed even though it is the main cause of the particle depletion problem. In this paper, standard resampling algorithms (systematic, residual, and partial resampling) and a rank-based resampling adopting genetic algorithms are analyzed using computer simulations. Several performance measures, such as the effective sample size, the number of distinct particles, estimation errors, and complexity, are used for a thorough analysis of the resampling algorithms. Moreover, a new compensation technique is proposed in place of resampling to resolve the particle depletion problem in FastSLAM. In terms of estimation error, the compensation technique outperformed the resampling algorithms, though its run-time was longer. The most appropriate time to instigate compensation so as to reduce the run-time was also analyzed as the number of particles diminishes.
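Of the schemes compared, systematic resampling is the simplest to sketch. The version below is a generic textbook formulation, not the paper's implementation, and assumes the weights are already normalized; it also makes the depletion mechanism visible, since high-weight particles are duplicated while low-weight ones vanish:

```python
import random

def systematic_resample(weights):
    """Systematic resampling: one uniform draw, then n evenly spaced
    pointers swept through the cumulative weight distribution.
    Returns the indices of the surviving particles."""
    n = len(weights)
    cum, total = [], 0.0
    for w in weights:
        total += w
        cum.append(total)
    cum[-1] = 1.0                 # guard against float round-off
    start = random.random() / n   # single random offset
    indexes, i = [], 0
    for j in range(n):
        pos = start + j / n       # evenly spaced pointer
        while cum[i] < pos:
            i += 1
        indexes.append(i)
    return indexes

# one dominant particle gets duplicated; low-weight ones tend to vanish
print(systematic_resample([0.7, 0.1, 0.1, 0.1]))
```

Running this repeatedly shows index 0 surviving in multiple copies while the others are often dropped, which is exactly the loss of particle diversity the paper's compensation technique is designed to avoid.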

