Cat and Mouse Based Optimizer: A New Nature-Inspired Optimization Algorithm

Sensors ◽  
2021 ◽  
Vol 21 (15) ◽  
pp. 5214
Author(s):  
Mohammad Dehghani ◽  
Štěpán Hubálovský ◽  
Pavel Trojovský

Numerous optimization problems arising in different branches of science and in the real world must be solved using appropriate techniques. Population-based optimization algorithms are some of the most important and practical techniques for solving optimization problems. In this paper, a new optimization algorithm called the Cat and Mouse-Based Optimizer (CMBO) is presented that mimics the natural behavior between cats and mice. In the proposed CMBO, the movement of cats towards mice as well as the escape of mice towards havens are simulated. Mathematical modeling and formulation of the proposed CMBO for implementation on optimization problems are presented. The performance of the CMBO is evaluated on a standard set of objective functions of three different types: unimodal, high-dimensional multimodal, and fixed-dimensional multimodal. The results of optimizing these objective functions show that the proposed CMBO has a good ability to solve various optimization problems. Moreover, the optimization results obtained from the CMBO are compared with the performance of nine other well-known algorithms, including Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Gravitational Search Algorithm (GSA), Teaching-Learning-Based Optimization (TLBO), Grey Wolf Optimizer (GWO), Whale Optimization Algorithm (WOA), Marine Predators Algorithm (MPA), Tunicate Swarm Algorithm (TSA), and Teamwork Optimization Algorithm (TOA). The performance analysis of the proposed CMBO against the compared algorithms shows that the CMBO is much more competitive than the other algorithms, providing more suitable quasi-optimal solutions that are closer to the global optimum.
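
The abstract describes two simulated phases: cats moving toward mice, and mice escaping toward havens built from the positions of other population members. The sketch below is a hedged illustration of how such a two-phase population update could be organized in code; the function name, the haven construction, and the update equations are assumptions for illustration, not the published CMBO formulation.

```python
import numpy as np

def cmbo_sketch(objective, dim, pop_size=20, iters=100, lb=-10.0, ub=10.0, seed=0):
    """Illustrative two-phase (cat/mouse) population update (not the published CMBO equations)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, (pop_size, dim))
    fit = np.apply_along_axis(objective, 1, pop)

    for _ in range(iters):
        order = np.argsort(fit)
        pop, fit = pop[order], fit[order]
        half = pop_size // 2
        mice, cats = pop[:half], pop[half:]           # better half plays mice, worse half cats

        # Phase 1: cats move toward randomly chosen mice (search around good regions).
        for i in range(cats.shape[0]):
            target = mice[rng.integers(half)]
            cats[i] += rng.random(dim) * (target - cats[i])

        # Phase 2: mice escape toward random "havens" built from other population members.
        for i in range(half):
            haven = pop[rng.integers(pop_size)] * rng.random(dim)
            mice[i] += rng.random(dim) * (haven - mice[i])

        pop = np.clip(np.vstack((mice, cats)), lb, ub)
        fit = np.apply_along_axis(objective, 1, pop)

    best = int(np.argmin(fit))
    return pop[best], fit[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))          # simple unimodal test function
    print(cmbo_sketch(sphere, dim=5))
```

In a full implementation, the published update equations, selection rules, and stopping criteria would replace the placeholder moves shown here.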

Mathematics ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1190
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Štěpán Hubálovský

There are many optimization problems in the different disciplines of science that must be solved using an appropriate method. Population-based optimization algorithms are one of the most efficient ways to solve various optimization problems. Population-based optimization algorithms are able to provide appropriate solutions to optimization problems based on a random search of the problem-solving space, without the need for gradient or derivative information. In this paper, a new optimization algorithm called the Group Mean-Based Optimizer (GMBO) is presented; it can be applied to solve optimization problems in various fields of science. The main idea in designing the GMBO is to use the information of different members of the algorithm population more effectively, based on two selected groups, termed the good group and the bad group. Two new composite members are obtained by averaging each of these groups, and they are used to update the population members. The various stages of the GMBO are described and mathematically modeled with the aim of being used to solve optimization problems. The performance of the GMBO in providing a suitable quasi-optimal solution is evaluated on a set of 23 standard objective functions of three types: unimodal, high-dimensional multimodal, and fixed-dimensional multimodal. In addition, the optimization results obtained from the proposed GMBO were compared with eight other widely used optimization algorithms, including the Marine Predators Algorithm (MPA), the Tunicate Swarm Algorithm (TSA), the Whale Optimization Algorithm (WOA), the Grey Wolf Optimizer (GWO), Teaching–Learning-Based Optimization (TLBO), the Gravitational Search Algorithm (GSA), Particle Swarm Optimization (PSO), and the Genetic Algorithm (GA). The optimization results indicated the acceptable performance of the proposed GMBO, and, based on the analysis and comparison of the results, it was determined that the GMBO is superior to and much more competitive than the other eight algorithms.
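
The central operator described above, averaging a good group and a bad group into two composite members that steer every population member, can be sketched as follows. The group size, the push-toward-good and pull-away-from-bad form of the update, and all names are illustrative assumptions, not the published GMBO equations.

```python
import numpy as np

def gmbo_style_update(pop, fit, group_size=5, rng=None):
    """One illustrative update step using good/bad group means (not the published GMBO formulas)."""
    rng = rng or np.random.default_rng()
    order = np.argsort(fit)                               # ascending fitness (minimization)
    good_mean = pop[order[:group_size]].mean(axis=0)      # composite member of the good group
    bad_mean = pop[order[-group_size:]].mean(axis=0)      # composite member of the bad group

    r1 = rng.random(pop.shape)
    r2 = rng.random(pop.shape)
    # Move toward the good-group composite and away from the bad-group composite.
    return pop + r1 * (good_mean - pop) - r2 * (bad_mean - pop)
```

A complete optimizer would embed this step in an iteration loop with bound handling and greedy acceptance of improved positions.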


2021 ◽  
Vol 11 (10) ◽  
pp. 4382
Author(s):  
Ali Sadeghi ◽  
Sajjad Amiri Doumari ◽  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Pavel Trojovský ◽  
...  

Optimization is the science of finding the best solution among the available solutions while considering an optimization problem’s limitations. Optimization algorithms have been introduced as efficient tools for solving optimization problems. These algorithms are designed based on various natural phenomena, behaviors, the lifestyles of living beings, physical laws, rules of games, etc. In this paper, a new optimization algorithm called the good and bad groups-based optimizer (GBGBO) is introduced to solve various optimization problems. In GBGBO, population members are updated under the influence of two groups named the good group and the bad group. The good group consists of a certain number of population members with better fitness values than the other members, and the bad group consists of a number of population members with worse fitness values than the other members of the population. GBGBO is mathematically modeled, and its performance in solving optimization problems was tested on a set of twenty-three different objective functions. In addition, for further analysis, the results obtained from the proposed algorithm were compared with eight optimization algorithms: genetic algorithm (GA), particle swarm optimization (PSO), gravitational search algorithm (GSA), teaching–learning-based optimization (TLBO), grey wolf optimizer (GWO), whale optimization algorithm (WOA), tunicate swarm algorithm (TSA), and marine predators algorithm (MPA). The results show that the proposed GBGBO algorithm has a good ability to solve various optimization problems and is more competitive than other similar algorithms.
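
Unlike the group means used in the GMBO abstract above, the GBGBO description only states that members are updated under the influence of the two groups. One hedged way to realize that influence, with a randomly chosen good member attracting and a randomly chosen bad member repelling each individual, is sketched below; the selection rule and coefficients are assumptions, not the published GBGBO update.

```python
import numpy as np

def gbgbo_style_step(pop, fit, n_good=5, n_bad=5, rng=None):
    """Illustrative update influenced by good and bad groups (not the published GBGBO equations)."""
    rng = rng or np.random.default_rng()
    order = np.argsort(fit)
    good, bad = pop[order[:n_good]], pop[order[-n_bad:]]
    new_pop = pop.copy()
    for i in range(pop.shape[0]):
        g = good[rng.integers(n_good)]   # a randomly chosen good member attracts
        b = bad[rng.integers(n_bad)]     # a randomly chosen bad member repels
        new_pop[i] += rng.random(pop.shape[1]) * (g - pop[i]) \
                      - rng.random(pop.shape[1]) * (b - pop[i])
    return new_pop
```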


2020 ◽  
Vol 10 (21) ◽  
pp. 7683 ◽  
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Ali Dehghani ◽  
Haidar Samet ◽  
Carlos Sotelo ◽  
...  

In recent decades, many optimization algorithms have been proposed by researchers to solve optimization problems in various branches of science. Optimization algorithms are designed based on various phenomena in nature, the laws of physics, the rules of individual and group games, and the behaviors of animals, plants, and other living things. Optimization algorithms succeed on some objective functions and fail on others. Improving the optimization process and adding modification phases to optimization algorithms can lead to more acceptable and appropriate solutions. In this paper, a new method called the Dehghani method (DM) is introduced to improve optimization algorithms. DM acts on the location of the best member of the population using information about the locations of the population members. In fact, DM shows that all members of a population, even the worst one, can contribute to the development of the population. DM has been mathematically modeled, and its effect has been investigated on several optimization algorithms, including genetic algorithm (GA), particle swarm optimization (PSO), gravitational search algorithm (GSA), teaching-learning-based optimization (TLBO), and grey wolf optimizer (GWO). In order to evaluate the ability of the proposed method to improve the performance of optimization algorithms, the mentioned algorithms have been implemented, in both their original and DM-improved versions, on a set of twenty-three standard objective functions. The simulation results show that the optimization algorithms modified with DM provide more acceptable and competitive performance than the original versions in solving optimization problems.
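
A hedged reading of the idea above, a modification phase in which every member's position (including the worst) contributes a small adjustment to the best member, accepted only if it improves the fitness, is sketched below. The averaging rule and acceptance criterion are assumptions, not the published DM formulation.

```python
import numpy as np

def dm_style_modification(pop, fit, objective, rng=None):
    """Illustrative modification phase that nudges the best member using information from
    all other members, keeping the move only if it improves fitness (not the published DM)."""
    rng = rng or np.random.default_rng()
    best = int(np.argmin(fit))
    candidate = pop[best].copy()
    for j in range(pop.shape[0]):
        if j != best:
            # Every member, even the worst, contributes a small pull on the best member.
            candidate += rng.random(pop.shape[1]) * (pop[j] - pop[best]) / pop.shape[0]
    f = objective(candidate)
    if f < fit[best]:                                  # greedy acceptance of the modified best
        pop[best], fit[best] = candidate, f
    return pop, fit
```

Plugged in after the main update step of a host algorithm such as GA or PSO, this phase leaves the algorithm unchanged except for the occasional improvement of its best member.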


Author(s):  
Randa Jalaa Yahya ◽  
Nizar Hadi Abbas

A new hybrid nature-inspired algorithm called HSSGWOA, which combines the salp swarm algorithm (SSA) and the grey wolf optimizer (GWO), is presented. The main idea is to combine the salp swarm algorithm's exploitation ability with the grey wolf optimizer's exploration ability to benefit from the strengths of both variants. The proposed algorithm is used to tune the parameters of the integral sliding mode controller (ISMC), which is designed to improve the dynamic performance of a two-link flexible-joint manipulator. The efficiency and capability of the proposed hybrid algorithm are evaluated on selected test functions and compared with other algorithms, namely SSA, GWO, differential evolution (DE), gravitational search algorithm (GSA), particle swarm optimization (PSO), and whale optimization algorithm (WOA). The ISMC parameters were also tuned using the SSA alone, which was then compared with the HSSGWOA algorithm. The simulation results show the capabilities of the proposed algorithm, which gives an enhancement of 57.46% compared to the standard algorithm for one of the links, and 55.86% for the other.
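
The abstract does not give the hybridization rule, so the sketch below is only a hedged guess at how GWO-style exploration moves and SSA-style follower (exploitation) moves might be mixed in one step; the 50/50 switching probability, the coefficient schedule, and the function name are assumptions, not the published HSSGWOA update.

```python
import numpy as np

def hybrid_ssa_gwo_step(pop, fit, t, max_iter, rng=None):
    """One illustrative hybrid step: GWO-style leader-guided moves for exploration,
    SSA-style follower moves for exploitation (not the published HSSGWOA equations)."""
    rng = rng or np.random.default_rng()
    order = np.argsort(fit)
    leaders = pop[order[:3]]                         # alpha, beta, delta wolves
    a = 2.0 * (1.0 - t / max_iter)                   # GWO coefficient shrinking over time
    new_pop = pop.copy()
    for i in range(pop.shape[0]):
        if rng.random() < 0.5:
            # GWO-style: average of moves toward the three leading wolves.
            A = a * (2.0 * rng.random((3, pop.shape[1])) - 1.0)
            C = 2.0 * rng.random((3, pop.shape[1]))
            new_pop[i] = np.mean(leaders - A * np.abs(C * leaders - pop[i]), axis=0)
        else:
            # SSA-style follower: move halfway toward the previous member of the chain.
            new_pop[i] = 0.5 * (pop[i] + pop[i - 1])
    return new_pop
```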


2019 ◽  
Vol 2 (3) ◽  
pp. 508-517
Author(s):  
Ferda Nur Arıcı ◽  
Ersin Kaya

Optimization is the process of searching for the most suitable solution to a problem within an acceptable time interval. The algorithms that solve optimization problems are called optimization algorithms. In the literature, there are many optimization algorithms with different characteristics. Optimization algorithms can exhibit different behaviors depending on the size, characteristics, and complexity of the optimization problem. In this study, six well-known population-based optimization algorithms (artificial algae algorithm - AAA, artificial bee colony algorithm - ABC, differential evolution algorithm - DE, genetic algorithm - GA, gravitational search algorithm - GSA, and particle swarm optimization - PSO) were used. These six algorithms were run on the CEC’17 test functions. Based on the experimental results, the algorithms were compared and their performances were evaluated.


2021 ◽  
Vol 11 (3) ◽  
pp. 1286 ◽  
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Ali Dehghani ◽  
Om P. Malik ◽  
Ruben Morales-Menendez ◽  
...  

Population-based optimization algorithms inspired by nature are among the most powerful tools for solving optimization problems. These algorithms provide a solution to a problem by randomly searching the search space. Their central design idea is derived from various natural phenomena, the behavior and living conditions of living organisms, laws of physics, etc. A new population-based optimization algorithm called the Binary Spring Search Algorithm (BSSA) is introduced to solve optimization problems. The BSSA is based on a simulation of Hooke’s law (from physics) for a traditional system of weights and springs. In this proposal, the population comprises weights that are connected by unique springs. The mathematical modeling of the proposed algorithm is presented so that it can be used to obtain solutions to optimization problems. The results were thoroughly validated on different unimodal and multimodal functions; additionally, the BSSA was compared with high-performance algorithms: binary grasshopper optimization algorithm, binary dragonfly algorithm, binary bat algorithm, binary gravitational search algorithm, binary particle swarm optimization, and binary genetic algorithm. The results show the superiority of the BSSA. The results of the Friedman test corroborate that the BSSA is more competitive.
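
A hedged illustration of the spring analogy for the binary case is given below: pairwise Hooke's-law forces, weighted by the partner's relative fitness, produce a real-valued displacement that an S-shaped transfer function maps back to {0, 1}. The stiffness weighting, the sigmoid slope, and the function name are assumptions, not the published BSSA equations.

```python
import numpy as np

def bssa_style_step(pop_bin, fit, rng=None):
    """Illustrative binary spring-search step (not the published BSSA): pairwise spring
    forces move each weight, and a sigmoid transfer function re-binarizes the positions."""
    rng = rng or np.random.default_rng()
    n, d = pop_bin.shape
    k = fit.max() - fit + 1e-12                      # stiffness: better partners pull harder
    k = k / k.sum()
    force = np.zeros((n, d))
    for i in range(n):
        for j in range(n):
            if i != j:
                force[i] += k[j] * (pop_bin[j] - pop_bin[i])   # Hooke's law: F = k * displacement
    velocity = rng.random((n, d)) * force
    prob = 1.0 / (1.0 + np.exp(-10.0 * velocity))    # S-shaped transfer function
    return (rng.random((n, d)) < prob).astype(int)
```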


2013 ◽  
Vol 2013 ◽  
pp. 1-29 ◽  
Author(s):  
Shouheng Tuo ◽  
Longquan Yong ◽  
Tao Zhou

The harmony search (HS) algorithm is an emerging population-based metaheuristic algorithm inspired by the music improvisation process. The HS method has developed rapidly and been applied widely during the past decade. In this paper, an improved global harmony search algorithm, named harmony search based on teaching-learning (HSTL), is presented for high-dimensional complex optimization problems. In the HSTL algorithm, four strategies (harmony memory consideration, teaching-learning strategy, local pitch adjusting, and random mutation) are employed to maintain a proper balance between convergence and population diversity, and a dynamic strategy is adopted to change the parameters. The proposed HSTL algorithm is investigated and compared with three other state-of-the-art HS optimization algorithms. Furthermore, to demonstrate robustness and convergence, the success rate and convergence are also analyzed. The experimental results on 31 complex benchmark functions demonstrate that the HSTL method has strong convergence and robustness and a better balance between space exploration and local exploitation on high-dimensional complex optimization problems.
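
The four strategies named above can be combined in an improvisation step along the following lines; the parameter values, the teaching-learning pull toward the best harmony, and the pitch-adjustment bandwidth are illustrative assumptions, not the published HSTL operators.

```python
import numpy as np

def hstl_style_improvise(memory, fit, lb, ub, hmcr=0.9, par=0.3, tlr=0.3, rng=None):
    """Improvise one new harmony mixing memory consideration, a teaching-learning style
    adjustment, pitch adjusting, and random mutation (not the published HSTL operators)."""
    rng = rng or np.random.default_rng()
    hms, dim = memory.shape
    teacher = memory[np.argmin(fit)]                  # best harmony plays the teacher role
    mean = memory.mean(axis=0)
    new = np.empty(dim)
    for j in range(dim):
        if rng.random() < hmcr:
            new[j] = memory[rng.integers(hms), j]     # harmony memory consideration
            if rng.random() < tlr:
                new[j] += rng.random() * (teacher[j] - mean[j])    # teaching-learning strategy
            elif rng.random() < par:
                new[j] += (rng.random() - 0.5) * 0.1 * (ub - lb)   # local pitch adjusting
        else:
            new[j] = rng.uniform(lb, ub)              # random mutation
    return np.clip(new, lb, ub)
```

In a full HS loop, the new harmony would replace the worst member of the memory whenever it has better fitness, with the control parameters changed dynamically over iterations as the abstract notes.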


Electronics ◽  
2021 ◽  
Vol 10 (23) ◽  
pp. 2975
Author(s):  
Mohammad H. Nadimi-Shahraki ◽  
Shokooh Taghian ◽  
Seyedali Mirjalili ◽  
Laith Abualigah ◽  
Mohamed Abd Elaziz ◽  
...  

The optimal power flow (OPF) is a vital tool for optimizing the control parameters of a power system by considering the desired objective functions subject to system constraints. Metaheuristic algorithms have proven to be well-suited for solving complex optimization problems. The whale optimization algorithm (WOA) is one of the well-regarded metaheuristics that is widely used to solve different optimization problems. Despite the use of the WOA in different fields of application, such as the OPF, its effectiveness decreases as the dimension of the test system increases. Therefore, in this paper, an effective whale optimization algorithm for solving optimal power flow problems (EWOA-OPF) is proposed. The main goal of this enhancement is to improve the exploration ability and maintain a proper balance between the exploration and exploitation of the canonical WOA. In the proposed algorithm, the movement strategy of whales is enhanced by introducing two new movement strategies: (1) encircling the prey using Levy motion and (2) searching for prey using Brownian motion, which cooperate with the canonical bubble-net attacking strategy. To validate the proposed EWOA-OPF algorithm, a comparison with six well-known optimization algorithms is established for solving the OPF problem. All algorithms are used to optimize single- and multi-objective functions of the OPF under the system constraints. Standard IEEE 6-bus, IEEE 14-bus, IEEE 30-bus, and IEEE 118-bus test systems are used to evaluate the proposed EWOA-OPF and the comparative algorithms for solving the OPF problem across diverse power system sizes. The comparison of results proves that the EWOA-OPF is able to solve single- and multi-objective OPF problems with better solutions than the other comparative algorithms.
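
The two added movement strategies boil down to drawing Levy-distributed steps for encircling and Gaussian (Brownian) steps for searching. The sketch below shows one common way to generate such steps and plug them into a WOA-style move; Mantegna's algorithm for the Levy steps, the branch probabilities, and the scaling constants are assumptions, not the published EWOA-OPF update.

```python
import numpy as np
from math import gamma, sin, pi

def levy_steps(dim, beta=1.5, rng=None):
    """Levy-distributed step lengths via Mantegna's algorithm (a common choice; the paper's
    exact formulation may differ)."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def ewoa_style_move(x, best, rand_member, rng=None):
    """One illustrative whale move mixing Levy-driven encircling, Brownian-driven searching,
    and the canonical spiral bubble-net attack (not the published EWOA-OPF equations)."""
    rng = rng or np.random.default_rng()
    p = rng.random()
    if p < 1.0 / 3.0:
        return best + 0.05 * levy_steps(x.size, rng=rng) * (best - x)        # Levy encircling
    elif p < 2.0 / 3.0:
        return x + rng.normal(0.0, 1.0, x.size) * (rand_member - x)          # Brownian searching
    spiral = rng.uniform(-1.0, 1.0)
    return np.abs(best - x) * np.exp(spiral) * np.cos(2 * np.pi * spiral) + best  # bubble-net spiral
```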


2017 ◽  
Vol 12 (1) ◽  
pp. 32 ◽  
Author(s):  
Amjad A. Hudaib ◽  
Hussam N. Fakhouri

Bio- and nature-inspired algorithms and meta-heuristics provide solutions to optimization and premature convergence problems. Their effect is significant and wide-ranging, and they are integrated into many scientific fields, which justifies the development of the many applications that rely on optimization algorithms to find the best solution in the shortest possible time. It is therefore necessary to further study and develop new swarm intelligence optimization algorithms. This paper proposes a novel optimization algorithm called the supernova optimizer (SO), inspired by the supernova phenomenon in nature. SO mimics this natural phenomenon, aiming to improve the three main features of optimization: exploration, exploitation, and local minima avoidance. The proposed meta-heuristic optimizer has been tested over 20 well-known benchmark functions, and the results have been verified by a comparative study with the state-of-the-art optimization algorithms Grey Wolf Optimizer (GWO), Sine Cosine Algorithm (SCA), Multi-Verse Optimizer (MVO), Moth-Flame Optimization (MFO), Whale Optimization Algorithm (WOA), Polar Particle Swarm Optimizer (PLOARPSO), and Particle Swarm Optimizer (PSO). The results show that SO provides very competitive and effective results, outperforming the compared state-of-the-art algorithms on most of the tested benchmark functions.


Author(s):  
Ying-Ying Koay ◽  
Jian-Ding Tan ◽  
Chin-Wai Lim ◽  
Siaw-Paw Koh ◽  
Sieh-Kiong Tiong ◽  
...  

Optimization algorithms have become one of the most studied branches of artificial intelligence and soft computing. Many powerful optimization algorithms with global search ability can be found in the literature. The Gravitational Search Algorithm (GSA) is one of the relatively new population-based optimization algorithms. In this research, an Adaptive Gravitational Search Algorithm (AGSA) is proposed. The AGSA is enhanced with an adaptive-search-step local search mechanism. The adaptive search step begins the search with a relatively large step size and automatically fine-tunes the step size as iterations go on. This enhancement grants the algorithm a more powerful exploitation ability, which in turn yields solutions with higher accuracy. The proposed AGSA was tested on a test suite with several well-established optimization test functions. The results showed that the proposed AGSA outperformed other algorithms, such as the conventional GSA and the Genetic Algorithm, in benchmarks of speed and accuracy. It can thus be concluded that the proposed AGSA performs well in solving local and global optimization problems. Applications of the AGSA to solve practical engineering optimization problems can be considered in the future.
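
The adaptive step described above, start with a relatively large step and automatically shrink it as the search progresses, can be sketched as a simple local-search routine; the shrink rule (reduce the step only when a trial fails to improve) and all parameter values are assumptions rather than the published AGSA mechanism.

```python
import numpy as np

def adaptive_step_local_search(objective, x, lb, ub, iters=50, shrink=0.9, rng=None):
    """Illustrative adaptive-step local search: start with a large step and fine-tune it
    whenever no improvement is found (not the published AGSA mechanism)."""
    rng = rng or np.random.default_rng()
    step = 0.1 * (ub - lb)                            # relatively large initial step size
    best, best_f = x.copy(), objective(x)
    for _ in range(iters):
        candidate = np.clip(best + rng.uniform(-step, step, best.size), lb, ub)
        f = objective(candidate)
        if f < best_f:
            best, best_f = candidate, f               # keep improving moves at the current step
        else:
            step *= shrink                            # fine-tune: reduce the step size
    return best, best_f
```

In the AGSA, such a routine would refine the best agent found by the gravitational update in each iteration, trading a little extra objective evaluation for higher final accuracy.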

