A mixed-coding adaptive differential evolution for optimising the architecture and parameters of feedforward neural networks

2019 ◽ Vol 29 (4) ◽ pp. 262 ◽ Author(s): Hong Li ◽ Li Zhang

2021 ◽ Vol 12 (3) ◽ pp. 149-171 ◽ Author(s): Rabab Bousmaha ◽ Reda Mohamed Hamou ◽ Abdelmalek Amine

Training a feedforward neural network (FFNN) is a complex supervised learning task: the trainer must find the set of weights that minimizes classification error. This paper presents a new training method, BAT-SDE, which hybridizes the bat algorithm with self-adaptive differential evolution to train feedforward neural networks. The hybridization lets BAT-SDE search the solution space more effectively, which is especially valuable when that space is large. The performance of the proposed approach was compared with eight evolutionary techniques and with standard backpropagation using momentum and an adaptive learning rate. The comparison was benchmarked on seven biomedical datasets and one large credit card fraud detection dataset. The results of the comparative study show that BAT-SDE outperformed the other training methods on most datasets and can serve as an alternative to them.
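The abstract does not spell out the BAT-SDE update rules, but the core idea of evolutionary FFNN training is easy to illustrate: flatten all weights and biases into one real-valued vector and let a population-based optimizer minimize the classification error directly. The minimal sketch below uses plain DE/rand/1/bin in NumPy; the function names (`de_train`, `error_rate`) and all hyperparameter values are illustrative assumptions, and the bat-algorithm moves and the self-adaptation of F and CR that define BAT-SDE are omitted.

```python
# Sketch only: generic DE training of an FFNN, not the authors' BAT-SDE.
import numpy as np

def init_network(sizes, rng):
    # One flat vector holding every weight and bias of the FFNN.
    n = sum((a + 1) * b for a, b in zip(sizes[:-1], sizes[1:]))
    return rng.uniform(-1.0, 1.0, n)

def forward(vec, sizes, X):
    # Decode the flat vector layer by layer and run a tanh forward pass.
    a, i = X, 0
    for ins, outs in zip(sizes[:-1], sizes[1:]):
        W = vec[i:i + ins * outs].reshape(ins, outs); i += ins * outs
        b = vec[i:i + outs]; i += outs
        a = np.tanh(a @ W + b)
    return a

def error_rate(vec, sizes, X, y):
    # Fitness = raw classification error (no gradients needed).
    preds = forward(vec, sizes, X).argmax(axis=1)
    return np.mean(preds != y)

def de_train(sizes, X, y, pop_size=30, gens=200, F=0.5, CR=0.9, seed=0):
    # Classic DE/rand/1/bin. BAT-SDE additionally self-adapts F and CR
    # and hybridizes with bat-algorithm moves, which are omitted here.
    rng = np.random.default_rng(seed)
    pop = [init_network(sizes, rng) for _ in range(pop_size)]
    fit = [error_rate(p, sizes, X, y) for p in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])      # mutation
            mask = rng.random(len(mutant)) < CR          # binomial crossover
            trial = np.where(mask, mutant, pop[i])
            f = error_rate(trial, sizes, X, y)
            if f <= fit[i]:                              # greedy selection
                pop[i], fit[i] = trial, f
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Toy usage on synthetic data (2 features, 2 classes, one hidden layer of 8).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
best, err = de_train([2, 8, 2], X, y)
print(f"training error rate: {err:.3f}")
```

Because the fitness is the raw error rate rather than a differentiable loss, this style of trainer needs no gradients, which is what makes metaheuristic hybrids such as BAT-SDE applicable in the first place.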


Mathematics ◽ 2020 ◽ Vol 8 (1) ◽ pp. 69 ◽ Author(s): Marco Baioletti ◽ Gabriele Di Bari ◽ Alfredo Milani ◽ Valentina Poggioni

In this paper, a neural network optimizer based on self-adaptive differential evolution, called DENN, is presented. The optimizer applies mutation and crossover operators in a new way, taking the structure of the network into account through a per-layer strategy. Moreover, a new crossover operator called interm is proposed, and a new self-adaptive version of DE, MAB-ShaDE, is suggested to reduce the number of parameters. The framework was tested on several well-known classification problems, and a comparative study of the various combinations of self-adaptive methods, mutation operators, and crossover operators available in the literature was performed. Experimental results show that DENN achieves good accuracy, better than or at least comparable to that obtained by backpropagation.
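As a rough illustration of the per-layer strategy, the sketch below (an assumption-laden reconstruction, not the DENN code) keeps each individual's genome as a list of per-layer weight and bias arrays and applies DE/rand/1 mutation with binomial crossover within each layer separately, so donor weights only ever recombine with weights of the same layer. The interm crossover and the MAB-ShaDE self-adaptation proposed in the paper are not reproduced here, and all names (`make_individual`, `per_layer_variation`) are illustrative.

```python
# Sketch only: per-layer DE variation for a layered genome, not the DENN code.
import numpy as np

def make_individual(sizes, rng):
    # Genome = list of (W, b) arrays, one pair per layer; the layer
    # structure is preserved instead of flattening into one long vector.
    return [(rng.standard_normal((a, b)) * 0.5, np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

def per_layer_variation(pop, i, F, CR, rng):
    # DE/rand/1 with binomial crossover, applied layer by layer.
    a, b, c = rng.choice([j for j in range(len(pop)) if j != i],
                         3, replace=False)
    trial = []
    for (Wt, bt), (Wa, ba), (Wb, bb), (Wc, bc) in zip(pop[i], pop[a],
                                                      pop[b], pop[c]):
        Wm = Wa + F * (Wb - Wc)              # layer-wise mutation
        bm = ba + F * (bb - bc)
        Wmask = rng.random(Wt.shape) < CR    # layer-wise binomial crossover
        bmask = rng.random(bt.shape) < CR
        trial.append((np.where(Wmask, Wm, Wt), np.where(bmask, bm, bt)))
    return trial

# Usage: build a small population and generate one trial individual.
rng = np.random.default_rng(0)
pop = [make_individual([4, 16, 3], rng) for _ in range(20)]
trial = per_layer_variation(pop, 0, F=0.5, CR=0.9, rng=rng)
```

Keeping the genome layer-shaped is what allows operators to respect the network's architecture: each layer is mutated and recombined against the corresponding layer of the donor individuals rather than against arbitrary coordinates of a flat vector.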

