A self-adaptive Harris Hawks optimization algorithm with opposition-based learning and chaotic local search strategy for global optimization and feature selection

Author(s):  
Abdelazim G. Hussien ◽  
Mohamed Amin
2019 ◽  
Vol 121 (2) ◽  
pp. 569-592
Author(s):  
Tinghuai Ma ◽  
Honghao Zhou ◽  
Dongdong Jia ◽  
Abdullah Al-Dhelaan ◽  
Mohammed Al-Dhelaan ◽  
...  

Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-17
Author(s):  
Lisheng Wei ◽  
Ning Wang ◽  
Huacai Lu

To improve the iris classification rate, this paper proposes a novel biogeography-based optimization algorithm (NBBO) based on local search and nonuniform variation. First, the linear migration model is replaced by a hyperbolic cotangent model, which is closer to natural migration behavior. A local search strategy is then added to the migration operation of the traditional BBO algorithm to enhance its global search ability. Next, nonuniform variation is introduced to strengthen the algorithm in later iterations. During the training stage, the algorithm builds a stronger iris classifier by boosting weaker similarity classifiers. On this basis, the convergence condition of NBBO is derived using Markov chain analysis. Finally, simulation results demonstrate the effectiveness and efficiency of the proposed iris classification method.
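The abstract does not give the exact hyperbolic cotangent migration equation, but the nonuniform variation it mentions follows a standard pattern. A minimal sketch of Michalewicz-style nonuniform mutation, assuming real-coded individuals bounded by `[lb, ub]` (the function and parameter names are illustrative, not taken from the paper):

```python
import random

def nonuniform_mutate(x, lb, ub, t, t_max, b=5.0):
    """Nonuniform mutation: the perturbation magnitude shrinks as
    iteration t approaches t_max, giving coarse exploration early
    and fine local refinement in later iterations."""
    def delta(y):
        # Fraction of the remaining range y, biased toward 0 as t grows.
        r = random.random()
        return y * (1.0 - r ** ((1.0 - t / t_max) ** b))
    # Perturb up or down with equal probability, staying inside [lb, ub].
    if random.random() < 0.5:
        return x + delta(ub - x)
    return x - delta(x - lb)
```

At `t = t_max` the perturbation collapses to zero, which is what gives the operator its fine-grained behavior late in the run.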


2021 ◽  
Vol 11 (11) ◽  
pp. 4837
Author(s):  
Mohamed Abdel-Basset ◽  
Reda Mohamed ◽  
Mohamed Abouhawwash ◽  
Victor Chang ◽  
S. S. Askar

This paper studies the performance of the generalized normal distribution optimization algorithm (GNDO) for tackling the permutation flow shop scheduling problem (PFSSP). Because PFSSP is a discrete problem and GNDO generates continuous values, the largest ranked value rule is used to convert those continuous values into discrete ones, making GNDO applicable to this discrete problem. The discrete GNDO is then integrated with a local search strategy to improve the quality of the best-so-far solution, yielding a hybrid version abbreviated HGNDO. Furthermore, a swap mutation operator is applied to the best-so-far solution to avoid stagnation in local optima and accelerate convergence; adding this improvement to HGNDO produces a hybrid-improved GNDO (HIGNDO). Finally, the local search strategy itself is improved using the scramble mutation operator to exploit each trial as fully as possible; integrating this improved local search with HGNDO yields a stronger algorithm abbreviated IHGNDO. The proposed algorithms are extensively compared with a number of well-established optimization algorithms using various statistical analyses to estimate the optimal makespan for 41 well-known benchmark instances within reasonable time. The findings show the benefits and speed advantages of both IHGNDO and HIGNDO, as well as HGNDO, over all the compared algorithms.
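The largest ranked value (LRV) rule and the two mutation operators mentioned above can be sketched as follows. This is a minimal illustration; the function names and tie-breaking details are assumptions, not taken from the paper:

```python
import random

def largest_ranked_value(values):
    # LRV rule: the position holding the largest continuous value is
    # scheduled first, the next-largest second, and so on, turning a
    # real-valued vector into a job permutation.
    return sorted(range(len(values)), key=lambda i: values[i], reverse=True)

def swap_mutation(perm, rng=random):
    # Exchange two randomly chosen jobs in the permutation.
    p = list(perm)
    i, j = rng.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def scramble_mutation(perm, rng=random):
    # Shuffle the jobs inside a randomly chosen contiguous segment.
    p = list(perm)
    i, j = sorted(rng.sample(range(len(p)), 2))
    segment = p[i:j + 1]
    rng.shuffle(segment)
    p[i:j + 1] = segment
    return p
```

For example, `largest_ranked_value([0.3, 0.9, 0.1])` returns the permutation `[1, 0, 2]`: job 1 carries the largest value, so it is scheduled first.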


2021 ◽  
Vol 103 ◽  
pp. 107146
Author(s):  
Wen Long ◽  
Jianjun Jiao ◽  
Ximing Liang ◽  
Tiebin Wu ◽  
Ming Xu ◽  
...  
