A New Initialization Algorithm for Bees Algorithm

Author(s):  
Wasim A. Hussein ◽  
Shahnorbanun Sahran ◽  
Siti Norul Huda Sheikh Abdullah
2014 ◽  
Vol 23 ◽  
pp. 104-121

2021 ◽  
Author(s):  
Iman Shafieenejad ◽  
Elham Dehghan Rouzi ◽  
Jamshid Sardari ◽  
Mohammad Siami Araghi ◽  
Amirhosein Esmaeili ◽  
...  

Author(s):  
Kaushik Kumar ◽  
Divya Zindani ◽  
J. Paulo Davim

2020 ◽  
Vol 32 (12) ◽  
pp. 2557-2600
Author(s):  
Ruizhi Chen ◽  
Ling Li

Spiking neural networks (SNNs), which transmit spikes in an event-driven manner, consume ultra-low power on neuromorphic chips. However, training deep SNNs is still challenging compared to training convolutional neural networks (CNNs): SNN training algorithms have not yet achieved the same performance as CNNs. In this letter, we aim to understand the intrinsic limitations of SNN training in order to design better algorithms. First, the pros and cons of typical SNN training algorithms are analyzed. We then find that the spatiotemporal backpropagation (STBP) algorithm has potential for training deep SNNs owing to its simplicity and fast convergence. Next, the main bottlenecks of the STBP algorithm are analyzed, and three conditions for training deep SNNs with STBP are derived. By analyzing the connection between CNNs and SNNs, we propose a weight initialization algorithm that satisfies these three conditions. Moreover, we propose an error minimization method and a modified loss function to further improve training performance. Experimental results show that the proposed method achieves 91.53% accuracy on the CIFAR10 data set, a 1% accuracy increase over the STBP algorithm, and reduces training on the MNIST data set to 15 epochs (over 13 times faster than the STBP algorithm). The proposed method also decreases classification latency by over 25 times compared to CNN-SNN conversion algorithms. In addition, the proposed method works robustly for very deep SNNs, whereas the STBP algorithm fails on a 19-layer SNN.
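The idea of adapting a CNN-style weight initialization to spiking neurons can be illustrated with a small sketch. This assumes leaky integrate-and-fire (LIF) neurons and a He-style variance scaling adjusted by the firing threshold; the function names `lif_layer` and `init_weights`, the threshold scaling, and all parameter values are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def lif_layer(spikes_in, W, v, v_th=1.0, tau=0.9):
    """One timestep of a leaky integrate-and-fire layer.
    spikes_in: (n_in,) binary spike vector; W: (n_out, n_in) weights;
    v: (n_out,) membrane potentials carried across timesteps."""
    v = tau * v + W @ spikes_in              # leak, then integrate synaptic input
    spikes_out = (v >= v_th).astype(float)   # fire where the potential crosses threshold
    v = v * (1.0 - spikes_out)               # hard reset for neurons that fired
    return spikes_out, v

def init_weights(n_out, n_in, v_th=1.0, rng=None):
    """Threshold-scaled He-style initialization (an illustrative assumption,
    not the paper's derived conditions): keeps the expected input drive on the
    order of v_th so early layers neither fall silent nor saturate."""
    if rng is None:
        rng = np.random.default_rng(0)
    std = v_th * np.sqrt(2.0 / n_in)
    return rng.normal(0.0, std, size=(n_out, n_in))
```

Scaling the initialization by the firing threshold is the key intuition: if the weights are too small relative to `v_th`, spikes die out in deep layers and gradients vanish; if too large, every neuron fires every timestep and the network carries no information.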


2015 ◽  
Vol 128 (5) ◽  
pp. 13-18
Author(s):  
Duc Hoang

Author(s):  
ZD Zhou ◽  
YQ Xie ◽  
DT Pham ◽  
S Kamsani ◽  
M Castellani

The aim of multimodal optimisation is to find the significant optima of a multimodal objective function, including its global optimum. Many real-world applications are multimodal optimisation problems that require multiple optimal solutions. The Bees Algorithm is a global optimisation procedure inspired by the foraging behaviour of honeybees. In this paper, several procedures are introduced to enhance the algorithm’s capability to find multiple optima in multimodal optimisation problems. In the proposed Bees Algorithm for multimodal optimisation, a dynamic colony size is permitted, automatically adapting the search effort to different objective functions. A local search approach, called the balanced search technique, is also proposed to speed up the algorithm. In addition, two procedures, radius estimation and optima elitism, are added to enhance the Bees Algorithm’s ability to locate unevenly distributed optima and to eliminate insignificant local optima, respectively. The performance of the modified Bees Algorithm is evaluated on well-known benchmark problems, and the results are compared with those obtained by several other state-of-the-art algorithms. The results indicate that the proposed algorithm inherits excellent properties from the standard Bees Algorithm and, thanks to the introduced modifications, is notably efficient at solving multimodal optimisation problems.
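For reference, the standard Bees Algorithm that the paper extends can be sketched roughly as follows. Parameter names follow the usual BA notation (n scouts, m selected sites, e elite sites, nep/nsp recruited foragers, ngh neighbourhood size); the multimodal extensions described above (dynamic colony size, balanced search, radius estimation, optima elitism) are deliberately not implemented here:

```python
import numpy as np

def bees_algorithm(f, bounds, n=30, m=5, e=2, nep=7, nsp=3,
                   ngh=0.1, iters=100, seed=0):
    """Minimal standard Bees Algorithm (minimisation), as a baseline sketch.
    f: objective; bounds: list of (lo, hi) per dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    scouts = rng.uniform(lo, hi, size=(n, dim))  # initial random scout bees
    for _ in range(iters):
        # rank sites by fitness; best sites attract forager bees
        scouts = scouts[np.argsort([f(x) for x in scouts])]
        new = []
        for i in range(m):
            recruits = nep if i < e else nsp  # elite sites get more foragers
            # local search: sample foragers in a box around the site
            patch = scouts[i] + rng.uniform(-ngh, ngh,
                                            size=(recruits, dim)) * (hi - lo)
            patch = np.clip(patch, lo, hi)
            cand = np.vstack([scouts[i:i + 1], patch])
            new.append(cand[np.argmin([f(x) for x in cand])])  # keep site best
        # remaining bees scout randomly for new food sources
        new.extend(rng.uniform(lo, hi, size=(n - m, dim)))
        scouts = np.array(new)
    best = min(scouts, key=f)
    return best, f(best)
```

Because each selected site always keeps its own best point, improvements are never lost, while the fresh random scouts preserve global exploration; the paper's modifications target exactly the weaknesses of this baseline on multimodal landscapes (a fixed colony size and a single retained optimum per run).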

