An Improved Gray Wolf Optimization Algorithm Based on Tent Chaotic Initialization and Differential Evolution Strategy for Space Robot Base Attitude Undisturbed Planning

Author(s):  
Xin Ye ◽  
Jiacai Hong ◽  
Zhenghong Dong
Author(s):  
Zhou Zhou ◽  
Fangmin Li ◽  
Shuiqiao Yang

The resource optimization algorithm based on clustering and an improved differential evolution strategy, as a new global optimization algorithm, has wide applications in language translation, language processing, document understanding, cloud computing, and edge computing due to its high efficiency. With the development of deep learning technology and the rise of big data, resource optimization algorithms encounter a series of challenges, such as workload imbalance and low resource utilization. To address these problems, this study proposes a novel resource optimization algorithm based on clustering and an improved differential evolution strategy, the Multi-objective Task Scheduling Strategy (MTSS). Three indexes of virtual machines, namely task completion time, execution cost, and workload, are selected to build the fitness function of the MTSS algorithm. At the same time, a preprocessing stage clusters resources and tasks according to their characteristics to reduce the scale of resource-task matching. Moreover, to resolve workload imbalance among different resource sets, local resource tasks are reallocated using the Q-value method in the MTSS strategy, balancing the workload of global resources and improving the resource utilization rate. Experiments are carried out to evaluate the effectiveness of the proposed algorithm. Results show that the proposed algorithm outperforms other algorithms in terms of task completion time, execution cost, and workload balancing.
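The abstract names three fitness indexes but not their exact combination. A minimal sketch of such a multi-objective fitness function is shown below, assuming a simple weighted sum of makespan, execution cost, and load imbalance over a task-to-VM assignment; the weights, units, and formulas are illustrative guesses, not the paper's model.

```python
# Hypothetical weighted fitness for a task-to-VM assignment, combining the
# three indexes named in the abstract: completion time, execution cost, and
# VM workload. All weights and formulas here are illustrative assumptions.

def makespan(assignment, task_len, vm_speed):
    """Completion time: the longest per-VM finish time."""
    load = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return max(load)

def exec_cost(assignment, task_len, vm_speed, vm_price):
    """Execution cost: per-VM busy time multiplied by a unit price."""
    return sum((task_len[t] / vm_speed[vm]) * vm_price[vm]
               for t, vm in enumerate(assignment))

def load_imbalance(assignment, task_len, vm_speed):
    """Workload index: spread between the busiest and idlest VM."""
    load = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return max(load) - min(load)

def fitness(assignment, task_len, vm_speed, vm_price, w=(0.4, 0.3, 0.3)):
    """Lower is better: weighted sum of the three objectives."""
    return (w[0] * makespan(assignment, task_len, vm_speed)
            + w[1] * exec_cost(assignment, task_len, vm_speed, vm_price)
            + w[2] * load_imbalance(assignment, task_len, vm_speed))

# Example: 4 tasks on 2 VMs, one balanced and one skewed assignment.
task_len = [10.0, 20.0, 30.0, 40.0]   # task lengths (e.g. MI)
vm_speed = [10.0, 20.0]               # VM speeds (e.g. MIPS)
vm_price = [1.0, 2.5]                 # unit price per busy time unit
balanced = [0, 0, 1, 1]               # tasks 0,1 -> VM0; tasks 2,3 -> VM1
skewed = [1, 1, 1, 1]                 # everything on VM1
print(fitness(balanced, task_len, vm_speed, vm_price))  # lower (better)
print(fitness(skewed, task_len, vm_speed, vm_price))    # higher (worse)
```

A scheduler would then search the assignment space (here, by clustering first and reallocating with the Q-value method) for the assignment minimizing this fitness.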


2021 ◽  
Vol 11 (10) ◽  
pp. 4382
Author(s):  
Ali Sadeghi ◽  
Sajjad Amiri Doumari ◽  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Pavel Trojovský ◽  
...  

Optimization is the science of selecting the best solution among the feasible solutions of a problem, subject to its constraints. Optimization algorithms have been introduced as efficient tools for solving optimization problems. These algorithms are designed based on various natural phenomena, the behavior and lifestyles of living beings, physical laws, rules of games, and so on. In this paper, a new optimization algorithm called the good and bad groups-based optimizer (GBGBO) is introduced to solve various optimization problems. In GBGBO, population members update under the influence of two groups, named the good group and the bad group. The good group consists of a certain number of population members with better fitness values than the other members, and the bad group consists of a number of population members with worse fitness values than the other members. GBGBO is mathematically modeled, and its performance in solving optimization problems was tested on a set of twenty-three different objective functions. In addition, for further analysis, the results obtained from the proposed algorithm were compared with eight optimization algorithms: the genetic algorithm (GA), particle swarm optimization (PSO), the gravitational search algorithm (GSA), teaching-learning-based optimization (TLBO), the gray wolf optimizer (GWO), the whale optimization algorithm (WOA), the tunicate swarm algorithm (TSA), and the marine predators algorithm (MPA). The results show that the proposed GBGBO algorithm has a good ability to solve various optimization problems and is more competitive than other similar algorithms.
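The good/bad-group update idea can be sketched as follows: each member is pulled toward the mean of the best-ranked members and pushed away from the mean of the worst-ranked ones. The group sizes, coefficients, and greedy acceptance rule below are illustrative assumptions, not the paper's exact equations.

```python
import random

def sphere(x):
    """Test objective (sphere function): minimum 0 at the origin."""
    return sum(v * v for v in x)

def gbgbo_step(pop, objective, n_good=3, n_bad=3):
    """One sketched update: attract to the good group, repel from the bad."""
    ranked = sorted(pop, key=objective)
    good = ranked[:n_good]            # members with the best fitness
    bad = ranked[-n_bad:]             # members with the worst fitness
    dim = len(pop[0])
    good_mean = [sum(m[d] for m in good) / n_good for d in range(dim)]
    bad_mean = [sum(m[d] for m in bad) / n_bad for d in range(dim)]
    new_pop = []
    for x in pop:
        r1, r2 = random.random(), random.random()
        cand = [x[d] + r1 * (good_mean[d] - x[d]) - r2 * (bad_mean[d] - x[d])
                for d in range(dim)]
        # Greedy selection: keep the move only if it improves fitness.
        new_pop.append(cand if objective(cand) < objective(x) else x)
    return new_pop

random.seed(0)
pop = [[random.uniform(-10, 10) for _ in range(5)] for _ in range(20)]
before = min(sphere(x) for x in pop)
for _ in range(50):
    pop = gbgbo_step(pop, sphere)
after = min(sphere(x) for x in pop)
print(before, after)   # best fitness never worsens under greedy selection
```

Because of the greedy acceptance, the best fitness is monotonically non-increasing; on a benchmark set such as the twenty-three functions mentioned above, each function would replace `sphere`.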


2015 ◽  
Vol 3 (4) ◽  
pp. 365-373 ◽  
Author(s):  
Dabin Zhang ◽  
Jia Ye ◽  
Zhigang Zhou ◽  
Yuqi Luan

Abstract: To overcome the low convergence precision of the fruit fly optimization algorithm (FOA) and its tendency to fall into local extrema, this paper incorporates the idea of differential evolution into FOA and proposes a fruit fly optimization algorithm based on differential evolution (FOADE). After each iteration of FOA, FOADE applies the mutation, crossover, and selection operations of differential evolution, which allows the search to escape local extrema and continue optimizing. Compared with FOA, the experimental results show that FOADE has better global search ability, faster convergence, and more precise convergence.
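The appended DE step can be sketched as the classic DE/rand/1/bin round applied to the current fruit-fly population. The FOA smell-search phase itself is omitted here, and the F and CR values are common DE defaults rather than the paper's settings.

```python
import random

def sphere(x):
    """Test objective: minimum 0 at the origin."""
    return sum(v * v for v in x)

def de_step(pop, objective, F=0.5, CR=0.9):
    """One DE round: rand/1 mutation, binomial crossover, greedy selection."""
    dim = len(pop[0])
    new_pop = []
    for i, x in enumerate(pop):
        # Mutation: combine three distinct members other than x.
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
        # Binomial crossover with one guaranteed mutant dimension.
        jrand = random.randrange(dim)
        trial = [mutant[d] if (random.random() < CR or d == jrand) else x[d]
                 for d in range(dim)]
        # Selection: the trial replaces x only if it is no worse.
        new_pop.append(trial if objective(trial) <= objective(x) else x)
    return new_pop

random.seed(1)
pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(15)]
before = min(sphere(x) for x in pop)
for _ in range(40):
    pop = de_step(pop, sphere)   # in FOADE, this follows each FOA iteration
after = min(sphere(x) for x in pop)
print(before, after)   # selection guarantees the best fitness never worsens
```

In FOADE proper, `de_step` would run on the fruit-fly positions after each smell/vision search iteration, so a swarm trapped near a local extremum can still be perturbed out by the mutation and crossover operators.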

