Hybrid Neural Network
Recently Published Documents


TOTAL DOCUMENTS: 778 (five years: 247)

H-INDEX: 37 (five years: 6)

BIOCELL, 2022, Vol. 46 (4), pp. 1105-1121
Author(s): Cangzhi Jia, Dong Jin, Xin Wang, Qi Zhao

2022, Vol. 70 (3), pp. 4829-4845
Author(s): Mohammad Hadwan, Basheer M. Al-Maqaleh, Fuad N. Al-Badani, Rehan Ullah Khan, Mohammed A. Al-Hagery

2021, Vol. 2021, pp. 1-12
Author(s): Qiang Duan, Jianhua Fan, Xianglin Wei, Chao Wang, Xiang Jiao, ...

Recognizing signals is critical for understanding the increasingly crowded wireless spectrum in noncooperative communications. Traditional threshold- or pattern-recognition-based solutions are labor-intensive and error-prone, so practitioners have started to apply deep learning to automatic modulation classification (AMC). However, the recognition accuracy and robustness of recently proposed neural-network-based approaches remain unsatisfactory, especially at low signal-to-noise ratio (SNR). Against this backdrop, this paper presents a hybrid neural network model, called MCBL, which combines a convolutional neural network, bidirectional long short-term memory, and an attention mechanism to exploit their respective abilities to extract the spatial, temporal, and salient features embedded in the signal samples. After formulating the AMC problem, the three modules of the hybrid neural network are detailed. To evaluate the proposal, 10 state-of-the-art neural networks (including two recent models) are chosen as benchmarks for comparison experiments on an open radio frequency (RF) dataset. Results show that the recognition accuracy of MCBL reaches 93%, the highest among the tested DNN models, while its computational efficiency and robustness exceed those of existing proposals.
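
A minimal PyTorch sketch of the kind of CNN + bidirectional LSTM + attention hybrid the abstract describes. The MCBLSketch name, layer sizes, and the assumed input shape (batch, 2, 128) for I/Q samples are illustrative assumptions, not the published MCBL architecture.

```python
import torch
import torch.nn as nn

class MCBLSketch(nn.Module):
    """Hypothetical CNN + BiLSTM + attention model for modulation classification."""
    def __init__(self, num_classes=11):
        super().__init__()
        # Convolutional front end extracts spatial features from raw I/Q samples.
        self.cnn = nn.Sequential(
            nn.Conv1d(2, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Bidirectional LSTM models temporal dependencies across the sample sequence.
        self.bilstm = nn.LSTM(input_size=64, hidden_size=64,
                              batch_first=True, bidirectional=True)
        # Additive attention weights the time steps before pooling.
        self.attn = nn.Linear(128, 1)
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):                               # x: (batch, 2, 128)
        feats = self.cnn(x)                             # (batch, 64, 128)
        seq, _ = self.bilstm(feats.transpose(1, 2))     # (batch, 128, 128)
        weights = torch.softmax(self.attn(seq), dim=1)  # attention over time steps
        pooled = (weights * seq).sum(dim=1)             # (batch, 128)
        return self.classifier(pooled)

logits = MCBLSketch()(torch.randn(4, 2, 128))  # -> (4, 11) class scores
```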


2021, Vol. 2021, pp. 1-12
Author(s): Yuanyao Lu, Qi Xiao, Haiyang Jiang

In recent years, deep learning has been applied to English lip-reading. However, Chinese lip-reading research started later, lacks suitable datasets, and its recognition accuracy is not yet ideal. This paper therefore proposes a new hybrid neural network model to build a Chinese lip-reading system, integrating attention mechanisms into both the CNN and the RNN. Specifically, a convolutional block attention module (CBAM) is added to the ResNet50 network, enhancing its ability to capture the small differences among the mouth patterns of similarly pronounced Chinese words and improving feature extraction during convolution. A temporal attention mechanism is added to the GRU network to help extract features across consecutive lip-motion images; because the moments before and after influence the current moment in lip reading, more weight is assigned to the key frames, making the extracted features more representative. The model is validated through experiments on a self-built dataset, which show that using CBAM in the Chinese lip-reading model allows accurate recognition of the Chinese numbers 0-9 and some frequently used Chinese words. Compared with other lip-reading systems, the proposed system achieves better performance and higher recognition accuracy.
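
A minimal PyTorch sketch of the described pipeline: a ResNet50 backbone with a per-frame channel/spatial attention block (CBAM-style), followed by a GRU with temporal attention over the frame sequence. The LipReaderSketch name, the simplified CBAM, and all sizes are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

class SimpleCBAM(nn.Module):
    """Channel attention followed by spatial attention (simplified CBAM)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())
        self.spatial = nn.Conv2d(1, 1, kernel_size=7, padding=3)

    def forward(self, x):                                  # x: (B, C, H, W)
        c = self.channel_mlp(x.mean(dim=(2, 3)))           # channel weights (B, C)
        x = x * c[:, :, None, None]                        # channel reweighting
        s = torch.sigmoid(self.spatial(x.mean(dim=1, keepdim=True)))
        return x * s                                       # spatial reweighting

class LipReaderSketch(nn.Module):
    """Hypothetical ResNet50+CBAM frame encoder with GRU + temporal attention."""
    def __init__(self, num_classes=10):
        super().__init__()
        backbone = resnet50(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-2])  # (B, 2048, 7, 7)
        self.cbam = SimpleCBAM(2048)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.gru = nn.GRU(2048, 256, batch_first=True)
        self.time_attn = nn.Linear(256, 1)   # assigns larger weights to key frames
        self.classifier = nn.Linear(256, num_classes)

    def forward(self, frames):                # frames: (B, T, 3, 224, 224)
        b, t = frames.shape[:2]
        f = self.features(frames.flatten(0, 1))            # (B*T, 2048, 7, 7)
        f = self.pool(self.cbam(f)).flatten(1).view(b, t, -1)
        seq, _ = self.gru(f)                                # (B, T, 256)
        w = torch.softmax(self.time_attn(seq), dim=1)       # attention over frames
        return self.classifier((w * seq).sum(dim=1))

scores = LipReaderSketch()(torch.randn(2, 16, 3, 224, 224))  # -> (2, 10)
```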

