Predicting Heuristic Search Performance with PageRank Centrality in Local Optima Networks

Author(s): Sebastian Herrmann, Franz Rothlauf

Author(s): Francisco Chicano, Fabio Daolio, Gabriela Ochoa, Sébastien Vérel, Marco Tomassini, ...

2020, Vol. 28 (4), pp. 621-641
Author(s): Sarah L. Thomson, Gabriela Ochoa, Sébastien Verel, Nadarajen Veerapen

Connection patterns within Local Optima Networks (LONs) can inform heuristic design for optimisation. LON research has predominantly required complete enumeration of a fitness landscape, restricting analysis to problems far smaller than real-world instances; LON sampling algorithms are therefore important. In this article, we study LON construction algorithms for the Quadratic Assignment Problem (QAP) and use machine learning on estimated LON features to predict the search performance of competitive heuristics from the QAP domain. The results show that, with random forest regression, LON construction algorithms produce fitness-landscape features that can explain almost all of the variance in search performance. We find that sampled LONs relate more closely to search behaviour than enumerated LONs do, and the importance of the fitness levels of sampled LONs in these predictions becomes clear. Features from LONs produced by different algorithms are combined in predictions for the first time, with promising results for this “super-sampling”: a model predicting tabu search success explained 99% of the variance. Arguments are made for the use-case of each LON algorithm and for combining the exploitative process of one with the exploratory optimisation of the other.
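
As a rough illustration of the prediction pipeline described in this abstract, the sketch below fits a random forest regressor to LON features and reports how much variance in heuristic performance they explain. The file name, feature names, and target column are hypothetical placeholders (the study's actual feature set and data are not reproduced here), and scikit-learn is assumed for the regression.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per QAP instance, columns are sampled-LON
# features plus a measured performance value for a heuristic (e.g. tabu search).
data = pd.read_csv("lon_features.csv")                      # placeholder file name
feature_cols = ["num_local_optima", "num_edges",             # placeholder features
                "mean_sampled_fitness", "sink_fitness"]
X = data[feature_cols]
y = data["tabu_search_performance"]                          # placeholder target

# Random forest regression with cross-validation; the R^2 score indicates how
# much of the variance in search performance the LON features can explain.
model = RandomForestRegressor(n_estimators=500, random_state=0)
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print(f"Mean cross-validated R^2: {scores.mean():.3f}")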


Author(s): Sarah L. Thomson, Sébastien Verel, Gabriela Ochoa, Nadarajen Veerapen, Paul McMenemy

2020, Vol. 13 (6), pp. 168-178
Author(s): Pyae Cho, Thi Nyunt
Differential Evolution (DE) has become an advanced, robust, and proficient alternative technique for clustering on account of its population-based, stochastic, heuristic search behavior. Better balancing the exploitation and exploration power of the DE algorithm is important because this balance influences the algorithm's performance. In addition, seeding the initial population with superior solutions raises the probability of finding better solutions and increases the rate of convergence. In this paper, an enhanced DE algorithm is introduced for clustering that offers better cluster solutions with faster convergence. The proposed algorithm performs a modified mutation strategy to improve DE's search behavior and exploits Quasi-Opposition-based Learning (QBL) to choose fitter initial solutions. The mutation strategy, which uses the best solution as the target and applies three difference vectors, helps avoid local-optima traps and slow convergence. The QBL-based initialization method also contributes to increasing the quality of the clustering results and the convergence rate. The experimental analysis was conducted on seven real datasets from the UCI repository to evaluate the performance of the proposed clustering algorithm. The obtained results showed that the proposed algorithm achieves more compact clusters and more stable solutions than the competing conventional DE variants. Moreover, the performance of the proposed algorithm was compared with existing state-of-the-art DE-based clustering techniques. The corresponding results also indicated that the proposed algorithm is comparable to other DE-based clustering approaches in terms of objective-function values. Therefore, the proposed algorithm can be regarded as an efficient clustering tool.
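
As a minimal sketch of the two ingredients named in this abstract, the code below shows a quasi-opposition-based population initialization and a DE mutation that uses the best individual as the target with three difference vectors (a DE/best/3-style operator). The exact formulation, parameter values, and the crossover and selection steps used in the paper may differ, so treat this as an assumption-laden approximation rather than the authors' implementation.

import numpy as np

def quasi_opposition_init(pop_size, dim, low, high, rng):
    # Quasi-Opposition-based Learning (QBL) initialization, sketched: for each
    # random point x, the quasi-opposite point lies uniformly between the
    # interval centre and the opposite point (low + high - x). The caller would
    # keep the fittest pop_size rows of the combined set as the initial population.
    x = rng.uniform(low, high, size=(pop_size, dim))
    centre = (low + high) / 2.0
    opposite = low + high - x
    quasi = rng.uniform(np.minimum(centre, opposite), np.maximum(centre, opposite))
    return np.vstack([x, quasi])

def mutate_best_with_three_differentials(pop, best_idx, F, rng):
    # Assumed DE/best/3-style mutation: the best solution is the target, and
    # three scaled difference vectors are built from six distinct members.
    pop_size = len(pop)
    donors = np.empty_like(pop)
    for i in range(pop_size):
        choices = [j for j in range(pop_size) if j != i]
        r1, r2, r3, r4, r5, r6 = rng.choice(choices, size=6, replace=False)
        donors[i] = (pop[best_idx]
                     + F * (pop[r1] - pop[r2])
                     + F * (pop[r3] - pop[r4])
                     + F * (pop[r5] - pop[r6]))
    return donors

# Example usage with hypothetical bounds for a cluster-centre encoding:
rng = np.random.default_rng(0)
pop = quasi_opposition_init(pop_size=20, dim=6, low=0.0, high=1.0, rng=rng)[:20]
donors = mutate_best_with_three_differentials(pop, best_idx=0, F=0.5, rng=rng)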

