A Complex-Valued Encoding Multichain Seeker Optimization Algorithm for Engineering Problems

2022 ◽  
Vol 2022 ◽  
pp. 1-35
Author(s):  
Shaomi Duan ◽  
Huilong Luo ◽  
Haipeng Liu

This article proposes a complex-valued encoding multichain seeker optimization algorithm (CMSOA) for engineering optimization problems. A complex-valued encoding strategy and a multichain strategy are introduced into the seeker optimization algorithm (SOA). These strategies enhance the diversity of individuals, strengthen the local search, help the algorithm avoid falling into local optima, and serve as influential global optimization strategies. Fifteen benchmark functions, four proportional-integral-derivative (PID) control parameter models, and six constrained engineering problems are chosen as tests. According to the experimental results, the CMSOA is applicable to the benchmark functions, to PID control parameter optimization, and to the optimization of constrained engineering problems. Compared with particle swarm optimization (PSO), simulated annealing based on a genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multiverse optimizer (MVO), and the seeker optimization algorithm (SOA), the CMSOA shows better optimization ability and robustness.
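
The mechanism sketched below is only an illustration of the complex-valued encoding idea described above: each gene is stored as a complex number so that both its modulus and its imaginary part contribute to the search, and several chains (sub-populations) evolve in parallel. The decoding rule, the function name `decode_complex`, and the bounds `lower`/`upper` are assumptions for this sketch, not the paper's actual formulation.

```python
import numpy as np

def decode_complex(genes, lower, upper):
    """Map complex-valued genes onto real decision variables.

    Hypothetical conversion for illustration only: the modulus of each
    complex gene is scaled into [lower, upper], and the sign of the
    imaginary part decides on which side of the interval's midpoint the
    decoded value falls, so that both parts carry search information.
    """
    rho = np.abs(genes)                              # modulus of each gene
    rho_max = rho.max() if rho.max() > 0 else 1.0
    scaled = lower + (upper - lower) * rho / rho_max
    # reflect about the interval midpoint when the imaginary part is negative
    return np.where(genes.imag >= 0, scaled, lower + upper - scaled)

# usage with a hypothetical 2-chain population of 3-dimensional individuals
chains = [np.random.randn(5, 3) + 1j * np.random.randn(5, 3) for _ in range(2)]
decoded = [decode_complex(c, lower=-10.0, upper=10.0) for c in chains]
```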

2021 ◽  
Vol 20 ◽  
pp. 173-195
Author(s):  
Shaomi Duan ◽  
Huilong Luo ◽  
Haipeng Liu

This article proposes a complex-valued encoding seeker optimization algorithm (CSOA) based on a multi-chain method for constrained engineering optimization problems. Complex-valued encoding and a multi-chain strategy are introduced into the seeker optimization algorithm (SOA). The complex-valued encoding method is an influential global optimization strategy, and the multi-chain method is an enhanced local search strategy. These strategies enhance the diversity of individuals and help the algorithm avoid falling into local optima. Fifteen benchmark functions, four PID control parameter models, and six constrained engineering problems were chosen as tests. According to the experimental results, the CSOA is applicable to the benchmark functions, to PID control parameter optimization, and to the optimization of constrained engineering problems. Compared with particle swarm optimization (PSO), simulated annealing based on a genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multi-verse optimizer (MVO), and the seeker optimization algorithm (SOA), the CSOA shows better optimization ability and robustness.


2022 ◽  
Vol 2022 ◽  
pp. 1-28
Author(s):  
Shaomi Duan ◽  
Huilong Luo ◽  
Haipeng Liu

To improve the seeker optimization algorithm (SOA), an elastic collision seeker optimization algorithm (ECSOA) is proposed. The ECSOA evolves some individuals through three collision modes: completely elastic, completely inelastic, and non-completely elastic. These strategies enhance the diversity of individuals and help the algorithm avoid falling into local optima. The ECSOA is compared with particle swarm optimization (PSO), simulated annealing combined with a genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multiverse optimizer (MVO), and the seeker optimization algorithm (SOA) on fifteen benchmark functions, four PID control parameter models, and six constrained engineering optimization problems. According to the experimental results, the ECSOA is applicable to the benchmark functions, to PID control parameter optimization, and to the optimization of constrained engineering problems, and its optimization ability and robustness are better than those of the compared algorithms.
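
As a rough illustration of the three collision modes, the sketch below treats two individuals as equal-mass bodies colliding in one dimension with restitution coefficient `e`; the function name `collide` and the equal-mass assumption are hypothetical, and the ECSOA's actual update equations may differ.

```python
import numpy as np

def collide(x1, x2, e):
    """Generate two new candidates from individuals x1 and x2 via a
    1-D equal-mass collision rule with restitution coefficient e.

    e = 1     -> completely elastic collision (the two positions swap)
    e = 0     -> completely inelastic collision (both move to the average)
    0 < e < 1 -> non-completely elastic collision (something in between)
    """
    y1 = ((1 - e) * x1 + (1 + e) * x2) / 2.0
    y2 = ((1 + e) * x1 + (1 - e) * x2) / 2.0
    return y1, y2

# usage: perturb two stagnating individuals with the three collision modes
x1, x2 = np.array([1.0, 2.0]), np.array([3.0, -1.0])
elastic = collide(x1, x2, e=1.0)            # swap
inelastic = collide(x1, x2, e=0.0)          # average
partial = collide(x1, x2, e=np.random.rand())
```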


2021 ◽  
Vol 11 (3) ◽  
pp. 1286 ◽  
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Ali Dehghani ◽  
Om P. Malik ◽  
Ruben Morales-Menendez ◽  
...  

Population-based optimization algorithms inspired by nature are among the most powerful tools for solving optimization problems. These algorithms provide a solution to a problem by searching the search space randomly, with central design ideas derived from various natural phenomena, the behavior and living conditions of living organisms, laws of physics, and so on. A new population-based optimization algorithm called the Binary Spring Search Algorithm (BSSA) is introduced to solve optimization problems. The BSSA simulates Hooke's law for a traditional system of weights and springs: the population comprises weights that are connected by unique springs. The mathematical modeling of the proposed algorithm is presented so that it can be used to obtain solutions to optimization problems. The results were thoroughly validated on different unimodal and multimodal functions; additionally, the BSSA was compared with high-performance algorithms: the binary grasshopper optimization algorithm, the binary dragonfly algorithm, the binary bat algorithm, the binary gravitational search algorithm, binary particle swarm optimization, and the binary genetic algorithm. The results show the superiority of the BSSA, and the results of the Friedman test corroborate that the BSSA is more competitive.
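
A minimal sketch of the idea is given below, under the assumption that agents are bit strings, pairwise Hooke's-law-like forces are weighted by fitness, and an S-shaped transfer function converts the accumulated force into bit-setting probabilities; the function `bssa_step` and all coefficients are illustrative rather than the published BSSA equations.

```python
import numpy as np

def bssa_step(bits, fitness, k=1.0, rng=np.random.default_rng(0)):
    """One illustrative binary spring-search-style step (minimisation).

    A force pulls each agent towards the others in proportion to their
    bitwise displacement, with stiffer springs attached to fitter agents;
    a sigmoid transfer function then gives the probability that each bit
    of the new position is set to 1.
    """
    n, d = bits.shape
    # stiffness: better (lower) fitness -> value closer to 1
    stiff = (fitness.max() - fitness) / (fitness.max() - fitness.min() + 1e-12)
    force = np.zeros((n, d))
    for i in range(n):
        for j in range(n):
            if i != j:
                force[i] += k * stiff[j] * (bits[j] - bits[i])   # F = k * x
    prob = 1.0 / (1.0 + np.exp(-force))       # S-shaped transfer function
    return (rng.random((n, d)) < prob).astype(int)
```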


2021 ◽  
Vol 11 (10) ◽  
pp. 4382
Author(s):  
Ali Sadeghi ◽  
Sajjad Amiri Doumari ◽  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Pavel Trojovský ◽  
...  

Optimization is the science of selecting a solution from among the available solutions while respecting the limitations of an optimization problem. Optimization algorithms have been introduced as efficient tools for solving optimization problems. These algorithms are designed based on various natural phenomena, the behavior and lifestyle of living beings, physical laws, rules of games, and so on. In this paper, a new optimization algorithm called the good and bad groups-based optimizer (GBGBO) is introduced to solve various optimization problems. In the GBGBO, population members are updated under the influence of two groups named the good group and the bad group. The good group consists of a certain number of population members with better fitness values than the other members, and the bad group consists of a number of population members with worse fitness values than the other members. The GBGBO is mathematically modeled, and its performance in solving optimization problems was tested on a set of twenty-three different objective functions. In addition, for further analysis, the results obtained from the proposed algorithm were compared with those of eight optimization algorithms: the genetic algorithm (GA), particle swarm optimization (PSO), the gravitational search algorithm (GSA), teaching-learning-based optimization (TLBO), the gray wolf optimizer (GWO), the whale optimization algorithm (WOA), the tunicate swarm algorithm (TSA), and the marine predators algorithm (MPA). The results show that the proposed GBGBO has a good ability to solve various optimization problems and is more competitive than similar algorithms.
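
The following sketch illustrates the good-group/bad-group update in its simplest form: each member is attracted towards the members of the good group and repelled from the members of the bad group. The function name `gbgbo_step`, the group sizes, and the random coefficients are assumptions; the authors' exact equations are not reproduced, and a greedy selection step (keeping a move only if it improves) would normally follow but is omitted for brevity.

```python
import numpy as np

def gbgbo_step(pop, fitness, n_good=3, n_bad=3, rng=None):
    """Illustrative good-and-bad-groups update for a minimisation problem."""
    rng = rng or np.random.default_rng()
    order = np.argsort(fitness)                  # ascending: best first
    good, bad = pop[order[:n_good]], pop[order[-n_bad:]]
    new_pop = pop.copy()
    for g in good:                               # attraction to good members
        new_pop += rng.random(pop.shape) * (g - pop) / n_good
    for b in bad:                                # repulsion from bad members
        new_pop -= rng.random(pop.shape) * (b - pop) / n_bad
    return new_pop
```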


Mathematics ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1190
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Štěpán Hubálovský

There are many optimization problems in different disciplines of science that must be solved using an appropriate method. Population-based optimization algorithms are one of the most efficient ways to solve various optimization problems: they can provide appropriate solutions based on a random search of the problem-solving space without the need for gradient or derivative information. In this paper, a new optimization algorithm called the Group Mean-Based Optimizer (GMBO) is presented; it can be applied to solve optimization problems in various fields of science. The main idea in designing the GMBO is to make more effective use of the information carried by different members of the population, based on two selected groups called the good group and the bad group. Two new composite members are obtained by averaging each of these groups, and they are used to update the population members. The various stages of the GMBO are described and mathematically modeled with the aim of being used to solve optimization problems. The performance of the GMBO in providing a suitable quasi-optimal solution is evaluated on a set of 23 standard objective functions comprising unimodal, high-dimensional multimodal, and fixed-dimensional multimodal functions. In addition, the optimization results obtained from the proposed GMBO were compared with those of eight other widely used optimization algorithms: the Marine Predators Algorithm (MPA), the Tunicate Swarm Algorithm (TSA), the Whale Optimization Algorithm (WOA), the Grey Wolf Optimizer (GWO), Teaching-Learning-Based Optimization (TLBO), the Gravitational Search Algorithm (GSA), Particle Swarm Optimization (PSO), and the Genetic Algorithm (GA). The optimization results indicated acceptable performance of the proposed GMBO, and, based on the analysis and comparison of the results, the GMBO proved superior and much more competitive than the other eight algorithms.
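
A compact sketch of the group-mean idea follows: the best and worst `n_group` members are averaged into two composite members, every individual moves towards the good composite and away from the bad one, and a move is kept only if it improves the objective. Names such as `gmbo_step` and the coefficients are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def gmbo_step(pop, fitness, objective, n_group=5, rng=np.random.default_rng(1)):
    """Sketch of one group-mean-based update step (minimisation)."""
    order = np.argsort(fitness)                      # ascending: best first
    good_mean = pop[order[:n_group]].mean(axis=0)    # composite "good" member
    bad_mean = pop[order[-n_group:]].mean(axis=0)    # composite "bad" member
    r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
    trial = pop + r1 * (good_mean - pop) - r2 * (bad_mean - pop)
    trial_fit = np.apply_along_axis(objective, 1, trial)
    improved = trial_fit < fitness                   # keep only improving moves
    pop[improved], fitness[improved] = trial[improved], trial_fit[improved]
    return pop, fitness

# usage on the sphere function, a standard unimodal benchmark
sphere = lambda x: np.sum(x ** 2)
pop = np.random.uniform(-5, 5, (20, 10))
fit = np.apply_along_axis(sphere, 1, pop)
pop, fit = gmbo_step(pop, fit, sphere)
```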


2020 ◽  
Vol 10 (18) ◽  
pp. 6173 ◽  
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Gaurav Dhiman ◽  
O. P. Malik ◽  
Ruben Morales-Menendez ◽  
...  

At present, optimization algorithms are used extensively. One particular type includes random-based heuristic population optimization algorithms, which may be created by modeling scientific phenomena such as physical processes. The present article proposes a novel optimization algorithm based on Hooke's law, called the spring search algorithm (SSA), which aims to solve single-objective constrained optimization problems. In the SSA, search agents are weights joined by springs which, as Hooke's law states, exert a force proportional to their length. The mathematics behind the algorithm is presented in the text. In order to test its functionality, the SSA is executed on 38 established benchmark test functions and weighed against eight other optimization algorithms: a genetic algorithm (GA), a gravitational search algorithm (GSA), a grasshopper optimization algorithm (GOA), particle swarm optimization (PSO), teaching-learning-based optimization (TLBO), a grey wolf optimizer (GWO), a spotted hyena optimizer (SHO), and an emperor penguin optimizer (EPO). To test its usability, the SSA is also employed on five engineering optimization problems. The SSA delivered better results than the other algorithms on the unimodal objective functions, the multimodal objective functions, the CEC 2015 functions, and the engineering optimization problems.
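
A minimal continuous-space sketch of the spring idea is given below, assuming every pair of agents is joined by a spring whose force is proportional to its length and whose stiffness grows with the partner's fitness quality; the function `ssa_step` and its normalisation are illustrative choices, not the published SSA equations.

```python
import numpy as np

def ssa_step(pop, fitness, rng=np.random.default_rng(2)):
    """Illustrative Hooke's-law update for a minimisation problem.

    Springs attached to better agents are stiffer, so agents are pulled
    more strongly towards good regions of the search space.
    """
    n, d = pop.shape
    # normalise fitness so that better (smaller) values give larger stiffness
    k = (fitness.max() - fitness) / (fitness.max() - fitness.min() + 1e-12)
    force = np.zeros((n, d))
    for i in range(n):
        for j in range(n):
            if i != j:
                force[i] += k[j] * (pop[j] - pop[i])   # F = k * displacement
    return pop + rng.random((n, d)) * force            # displacement update
```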


Sensors ◽  
2021 ◽  
Vol 21 (15) ◽  
pp. 5214
Author(s):  
Mohammad Dehghani ◽  
Štěpán Hubálovský ◽  
Pavel Trojovský

Numerous optimization problems arising in different branches of science and the real world must be solved using appropriate techniques. Population-based optimization algorithms are among the most important and practical techniques for solving such problems. In this paper, a new optimization algorithm called the Cat and Mouse-Based Optimizer (CMBO) is presented that mimics the natural behavior between cats and mice. The proposed CMBO simulates the movement of cats towards mice as well as the escape of mice towards havens. The mathematical modeling and formulation of the CMBO for implementation on optimization problems are presented. The performance of the CMBO is evaluated on a standard set of objective functions of three different types: unimodal, high-dimensional multimodal, and fixed-dimensional multimodal. The results show that the proposed CMBO has a good ability to solve various optimization problems. Moreover, the optimization results obtained from the CMBO are compared with the performance of nine other well-known algorithms: the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), Teaching-Learning-Based Optimization (TLBO), the Grey Wolf Optimizer (GWO), the Whale Optimization Algorithm (WOA), the Marine Predators Algorithm (MPA), the Tunicate Swarm Algorithm (TSA), and the Teamwork Optimization Algorithm (TOA). The performance analysis shows that the CMBO is much more competitive than the compared algorithms, providing quasi-optimal solutions that are closer to the global optimum.
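
A rough sketch of one cat-and-mouse step is shown below, assuming the sorted population is split into a better half (mice) and a worse half (cats), cats chase randomly chosen mice, and mice flee towards havens assembled from random population members; the function `cmbo_step` and its coefficients are illustrative only.

```python
import numpy as np

def cmbo_step(pop, fitness, rng=np.random.default_rng(3)):
    """Illustrative cat-and-mouse update for a minimisation problem."""
    order = np.argsort(fitness)                 # ascending: best first
    pop = pop[order]
    half = len(pop) // 2
    mice, cats = pop[:half].copy(), pop[half:].copy()
    # cats move towards a randomly selected mouse
    chased = mice[rng.integers(0, half, size=len(cats))]
    cats += rng.random(cats.shape) * (chased - cats)
    # mice escape towards havens built from random population members
    havens = pop[rng.integers(0, len(pop), size=half)]
    mice += rng.random(mice.shape) * (havens - mice)
    return np.vstack([mice, cats])
```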


2019 ◽  
Vol 2 (3) ◽  
pp. 508-517
Author(s):  
FerdaNur Arıcı ◽  
Ersin Kaya

Optimization is the process of searching for the most suitable solution to a problem within an acceptable time interval. The algorithms that solve optimization problems are called optimization algorithms. In the literature, there are many optimization algorithms with different characteristics, and they can exhibit different behaviors depending on the size, characteristics, and complexity of the optimization problem. In this study, six well-known population-based optimization algorithms were used: the artificial algae algorithm (AAA), the artificial bee colony algorithm (ABC), the differential evolution algorithm (DE), the genetic algorithm (GA), the gravitational search algorithm (GSA), and particle swarm optimization (PSO). These six algorithms were run on the CEC'17 test functions. Based on the experimental results, the algorithms were compared and their performances were evaluated.


Author(s):  
Umit Can ◽  
Bilal Alatas

Classical optimization algorithms are not efficient at solving complex search and optimization problems; therefore, heuristic optimization algorithms have been proposed. In this paper, the exploration of association rules within numerical databases with the Gravitational Search Algorithm (GSA) is performed for the first time. The GSA is designed as a search method for quantitative association rules over databases that can be regarded as the search space. Furthermore, the difficult task of determining minimum confidence and support values for every database is eliminated by the GSA. In addition, the fitness function used by the GSA is very flexible: depending on the problem of interest, parameters can be removed from or added to it. The range values of the attributes are adjusted automatically while the rules are being mined, so no pre-processing of the data is required. The attribute interaction problem is also eliminated with the designed GSA. The GSA has been tested on four real databases, and promising results have been obtained. The GSA appears to be an effective search method for complex numerical sequential pattern mining, numerical classification rule mining, and clustering rule mining tasks of data mining.
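
To make the flexible-fitness idea concrete, the sketch below scores a hypothetical quantitative rule encoded as per-attribute intervals plus antecedent/consequent roles, mixing support and confidence with adjustable weights; the encoding, the function `rule_fitness`, and the weights `w_support`/`w_confidence` are assumptions and need not match the authors' design.

```python
import numpy as np

def rule_fitness(rule, data, w_support=0.8, w_confidence=0.2):
    """Score a quantitative association rule over a numerical dataset.

    rule["low"], rule["high"]: per-attribute interval bounds
    rule["role"]: 0 = antecedent, 1 = consequent, 2 = unused attribute
    data: records x attributes array of numerical values
    """
    low, high, role = rule["low"], rule["high"], rule["role"]
    inside = (data >= low) & (data <= high)           # record x attribute mask
    ante = inside[:, role == 0].all(axis=1)           # antecedent satisfied
    both = ante & inside[:, role == 1].all(axis=1)    # antecedent and consequent
    support = both.mean()
    confidence = both.sum() / ante.sum() if ante.sum() else 0.0
    return w_support * support + w_confidence * confidence
```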


2013 ◽  
Vol 427-429 ◽  
pp. 1934-1938
Author(s):  
Zhong Rong Zhang ◽  
Jin Peng Liu ◽  
Ke De Fei ◽  
Zhao Shan Niu

The aim is to improve the convergence of the algorithm and to increase population diversity. Particles of the swarm that have fallen into a local optimum are adjusted adaptively, by judging the spatial concentration of the swarm and its fitness variance, so that the global optimum can be reached. At the same time, the global factors are adjusted dynamically according to the current particle fitness. Four typical function optimization problems are used in the simulation experiments. The results show that the improved particle swarm optimization algorithm is convergent, robust, and accurate.
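
The fitness-variance criterion mentioned above can be sketched as follows, using the normalised fitness variance commonly applied to detect premature convergence in PSO and re-initialising part of the swarm when it falls below a threshold; the functions `fitness_variance` and `reseed_if_stuck` and the reseeding fraction are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def fitness_variance(fitness):
    """Normalised fitness variance: sum_i ((f_i - f_avg) / f)^2.

    The normalising factor f is the largest deviation if it exceeds 1,
    otherwise 1; a small value means the swarm has clustered.
    """
    f_avg = fitness.mean()
    dev = np.abs(fitness - f_avg)
    f = dev.max() if dev.max() > 1.0 else 1.0
    return np.sum(((fitness - f_avg) / f) ** 2)

def reseed_if_stuck(positions, fitness, bounds, threshold=1e-3,
                    rng=np.random.default_rng(4)):
    """Re-initialise the worst quarter of the swarm when variance is small."""
    if fitness_variance(fitness) < threshold:
        n_reseed = max(1, len(positions) // 4)
        worst = np.argsort(fitness)[-n_reseed:]
        low, high = bounds
        positions[worst] = rng.uniform(low, high, positions[worst].shape)
    return positions
```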

