Energy-Efficient Gabor Kernels in Neural Networks with Genetic Algorithm Training Method

Electronics ◽  
2019 ◽  
Vol 8 (1) ◽  
pp. 105 ◽  
Author(s):  
Fanjie Meng ◽  
Xinqing Wang ◽  
Faming Shao ◽  
Dong Wang ◽  
Xia Hua

Deep-learning convolutional neural networks (CNNs) have proven successful in various cognitive applications thanks to their multilayer structure. However, their high computational energy and time requirements hinder practical deployment; hence, there is growing interest in highly energy-efficient, fast-learning neural networks. In this work, we address the computing-resource-saving problem by developing a deep model, termed the Gabor convolutional neural network (Gabor CNN), which incorporates highly expression-efficient Gabor kernels into CNNs. To effectively imitate the structural characteristics of traditional weight kernels, we improve on traditional Gabor filters, giving them stronger frequency and orientation representations. In addition, we propose a procedure to train Gabor CNNs, termed the fast training method (FTM). In FTM, we design a new training method based on the multipopulation genetic algorithm (MPGA) and an evaluation structure to optimize the improved Gabor kernels, while training the remaining Gabor CNN parameters with back-propagation. Training the improved Gabor kernels with MPGA requires fewer samples and iterations and is therefore much more energy-efficient. Simple tasks, such as character recognition on the Mixed National Institute of Standards and Technology database (MNIST), traffic sign recognition on the German Traffic Sign Recognition Benchmark (GTSRB), and face detection on the Olivetti Research Laboratory database (ORL), are implemented using the LeNet architecture. Experimental results for the Gabor CNN and MPGA training method show a 17–19% reduction in computational energy and time and an 18–21% reduction in storage requirements, with a less than 1% decrease in accuracy. By incorporating highly expression-efficient Gabor kernels into CNNs, we eliminated a significant fraction of the computation-hungry components of the training process.
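The abstract does not specify the exact parameterization of the improved Gabor kernels, but the standard real-valued Gabor filter it builds on can be sketched as follows. This is a minimal illustration, assuming the conventional formulation (Gaussian envelope times a cosine carrier); the function name, parameters, and filter-bank sizes here are illustrative, not the authors' implementation.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, gamma=0.5, psi=0.0):
    """Sample a real-valued Gabor filter on a size x size grid.

    sigma: width of the Gaussian envelope, theta: orientation,
    lam: wavelength of the cosine carrier, gamma: spatial aspect ratio,
    psi: phase offset.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate the coordinate frame to the filter orientation.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / lam + psi)
    return envelope * carrier

# A small bank of oriented kernels, as a convolutional layer might use:
# four orientations evenly spaced over [0, pi).
bank = np.stack([gabor_kernel(5, 2.0, t, 4.0)
                 for t in np.linspace(0, np.pi, 4, endpoint=False)])
```

Because each kernel is fully determined by a handful of parameters (sigma, theta, lambda, gamma, psi), a population-based optimizer such as an MPGA only has to search this low-dimensional parameter space rather than every individual weight, which is the source of the sample- and iteration-efficiency the paper reports.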

2020 ◽  
Author(s):  
Lucas De Oliveira ◽  
Guilherme Mota ◽  
Vitor Vidal

The convolutional neural network is an important deep learning architecture for computer vision. Together with its variants, it has brought image analysis applications to a new performance level. Despite its undoubted quality, however, the performance evaluations presented in the literature are mostly restricted to accuracy measurements. Given the stochastic nature of neural network training and the impact of architecture configuration, research is still needed to determine whether such architectures have reached the optimal configuration for their target problems. Statistical significance is a powerful tool for a more rigorous experimental evaluation of stochastic processes. This paper performs a thorough evaluation of the influence of kernel size on convolutional neural networks in the context of traffic sign recognition. Experiments for distinct kernel sizes were performed on a widely accepted database, the German Traffic Sign Recognition Benchmark.
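The abstract does not name the significance test used, but comparing per-run accuracies of two kernel-size configurations typically reduces to a two-sample test such as Welch's t-test. The sketch below is a minimal, hypothetical example: the accuracy values are invented for illustration and are not results from the paper.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with
    possibly unequal variances (e.g., accuracies across random seeds)."""
    va, vb = variance(a), variance(b)   # sample variances
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / math.sqrt(va / na + vb / nb)

# Hypothetical per-seed test accuracies on GTSRB for two kernel sizes.
acc_3x3 = [0.972, 0.969, 0.974, 0.971, 0.970]
acc_5x5 = [0.968, 0.965, 0.967, 0.966, 0.969]
t = welch_t(acc_3x3, acc_5x5)
```

A large |t| (compared against the t distribution with Welch-Satterthwaite degrees of freedom) indicates that the accuracy gap between kernel sizes is unlikely to be an artifact of training stochasticity, which is exactly the question the paper raises about accuracy-only comparisons.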


2018 ◽  
Vol 55 (12) ◽  
pp. 121009 ◽  
Author(s):  
马永杰 Ma Yongjie ◽  
李雪燕 Li Xueyan ◽  
宋晓凤 Song Xiaofeng

2020 ◽  
Vol 100 ◽  
pp. 107160 ◽  
Author(s):  
Shichao Zhou ◽  
Chenwei Deng ◽  
Zhengquan Piao ◽  
Baojun Zhao
