On a Control Parameter Free Optimization Algorithm
Author(s): Lars Nolle

2021 ◽ Vol 40 (1) ◽ pp. 363-379
Author(s): Yanju Guo ◽ Huan Shen ◽ Lei Chen ◽ Yu Liu ◽ Zhilong Kang

The Whale Optimization Algorithm (WOA) is a relatively novel meta-heuristic algorithm. WOA performs efficiently compared with other well-established optimization algorithms, but it still suffers from premature convergence and easily falls into local optima on complex multimodal functions. This paper therefore presents an improved WOA, proposing a random hopping update strategy and a random control parameter strategy to strengthen WOA's exploration and exploitation abilities. The algorithm is tested on 24 well-known benchmark functions, comprising 10 unimodal and 14 multimodal functions. The experimental results show that the convergence accuracy of the proposed algorithm is better than that of the original algorithm on 21 functions, and better than that of the other 5 compared algorithms on 23 functions.
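For context, the baseline WOA position update that this paper modifies can be sketched as follows. This is a minimal illustrative implementation of the standard WOA (Mirjalili and Lewis's encircling, exploration, and spiral moves, with spiral constant b = 1); the paper's random hopping update and random control parameter strategies are not detailed in the abstract and are not reproduced here.

```python
import math
import random

def woa_minimize(f, dim, bounds, pop_size=20, iters=200, seed=0):
    """Minimize f over a box using the standard WOA position update.

    Baseline sketch only -- the improved strategies proposed in the
    paper are not included.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)[:]
    best_val = f(best)
    for t in range(iters):
        a = 2.0 * (1 - t / iters)        # control parameter, decreases 2 -> 0
        for i, x in enumerate(pop):
            r1, r2 = rng.random(), rng.random()
            A = 2.0 * a * r1 - a         # coefficient (scalar form)
            C = 2.0 * r2
            p = rng.random()
            l = rng.uniform(-1.0, 1.0)
            if p < 0.5:
                # |A| < 1: shrink toward the best whale; otherwise explore
                # around a randomly chosen whale.
                ref = best if abs(A) < 1 else pop[rng.randrange(pop_size)]
                new = [ref[j] - A * abs(C * ref[j] - x[j]) for j in range(dim)]
            else:
                # Spiral bubble-net move around the best whale.
                new = [abs(best[j] - x[j]) * math.exp(l) * math.cos(2 * math.pi * l)
                       + best[j] for j in range(dim)]
            pop[i] = [min(max(v, lo), hi) for v in new]
            v = f(pop[i])
            if v < best_val:
                best, best_val = pop[i][:], v
    return best, best_val
```

The deterministic shrinking of the control parameter `a` is precisely the kind of schedule that a random control parameter strategy would replace.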


2022 ◽ Vol 2022 ◽ pp. 1-35
Author(s): Shaomi Duan ◽ Huilong Luo ◽ Haipeng Liu

This article proposes a complex-valued encoding multichain seeker optimization algorithm (CMSOA) for engineering optimization problems. A complex-valued encoding strategy and a multichain strategy are introduced into the seeker optimization algorithm (SOA). These strategies increase individual diversity, enhance local search, help avoid premature convergence to local optima, and act as effective global optimization strategies. Fifteen benchmark functions, four proportional-integral-derivative (PID) control parameter models, and six constrained engineering problems are chosen as test cases. The experimental results show that the CMSOA is effective on the benchmark functions, on PID control parameter optimization, and on constrained engineering problems. Compared with particle swarm optimization (PSO), simulated annealing based on genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multiverse optimizer (MVO), and the original SOA, the CMSOA shows better optimization ability and robustness.
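The complex-valued encoding idea can be illustrated with a small sketch: each gene is a complex number, and only part of its information (here, the modulus) determines the real decision variable, leaving extra degrees of freedom that support diversity. The decoding rule below (linear scaling of the modulus into the variable bounds, with an assumed cap `rho_max`) is a hypothetical illustration; the paper's exact decoding is not given in the abstract.

```python
import cmath

def decode_gene(z, lo, hi, rho_max=10.0):
    """Decode one complex-valued gene into a real decision variable.

    Illustrative assumption: the modulus of z is clipped to rho_max and
    scaled linearly into [lo, hi]. The CMSOA paper's rule may differ.
    """
    rho = min(abs(z), rho_max)           # modulus carries the magnitude
    return lo + (hi - lo) * rho / rho_max

# Two distinct genes with the same modulus decode to the same variable,
# so the search retains phase information -- one source of diversity.
x1 = decode_gene(cmath.rect(2.0, 0.3), -5.0, 5.0)
x2 = decode_gene(cmath.rect(2.0, 2.0), -5.0, 5.0)
```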


2017 ◽ Vol 25 (1) ◽ pp. 113-141
Author(s): Antoine S. Dymond ◽ Schalk Kok ◽ P. Stephan Heyns

Control parameter studies assist practitioners in selecting optimization algorithm parameter values that are appropriate for the problem at hand. Parameter values are well suited to a problem if they result in a search that is effective given that problem’s objective function(s), constraints, and termination criteria. Given these considerations, a many-objective tuning algorithm named MOTA is presented. MOTA is specialized for tuning a stochastic optimization algorithm according to multiple performance measures, each over a range of objective function evaluation budgets. MOTA’s specialization consists of four aspects: (1) a tuning problem formulation that consists of both a speed objective and a speed decision variable; (2) a control parameter tuple assessment procedure that utilizes information from a single assessment run’s history to gauge that tuple’s performance at multiple evaluation budgets; (3) a preemptively terminating resampling strategy for handling the noise present when tuning stochastic algorithms; and (4) the use of bi-objective decomposition to assist in many-objective optimization. MOTA combines these aspects with differential evolution operators to search for effective control parameter values. Numerical experiments consisting of tuning NSGA-II and MOEA/D demonstrate that MOTA is effective at many-objective tuning.
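Aspect (2) above, reusing a single run's history to score a parameter tuple at several budgets, can be sketched as follows. The assumption here (not stated in the abstract) is that the performance measure at a budget is simply the best objective value reached within that many evaluations.

```python
def best_so_far_at_budgets(history, budgets):
    """Gauge a control-parameter tuple at multiple evaluation budgets
    from one assessment run.

    history: objective value recorded at each function evaluation.
    budgets: evaluation budgets (1-based counts) to report at.
    Returns the best value found within each budget -- an assumed
    performance measure, sketching MOTA's single-run assessment idea.
    """
    best_so_far, best = [], float("inf")
    for v in history:
        best = min(best, v)
        best_so_far.append(best)
    return [best_so_far[min(b, len(best_so_far)) - 1] for b in budgets]
```

One run of the tuned algorithm thus yields a whole performance-versus-budget profile instead of a single end-of-run score, which is what makes tuning over a range of budgets affordable.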

