Reading Selectively via Binary Input Gated Recurrent Unit

Author(s):  
Zhe Li ◽  
Peisong Wang ◽  
Hanqing Lu ◽  
Jian Cheng

Recurrent Neural Networks (RNNs) have shown great promise in sequence modeling tasks. The Gated Recurrent Unit (GRU) is one of the most widely used recurrent structures, striking a good trade-off between performance and computational cost. However, its practical implementation based on soft gates only partially achieves the goal of controlling information flow, and we can hardly explain what the network has learned internally. Inspired by human reading, we introduce the Binary Input Gated Recurrent Unit (BIGRU), a GRU-based model that uses a binary input gate in place of GRU's reset gate. By doing so, our model can read selectively during inference. In our experiments, we show that BIGRU mainly ignores conjunctions, adverbs, and articles that make little difference to document understanding, which helps us further understand how the network works. In addition, owing to reduced interference from redundant information, our model outperforms the baseline GRU on all testing tasks.
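The soft-versus-binary gating contrast can be sketched with scalar states, assuming the standard GRU equations; the weight names and the hard 0.5 threshold below are illustrative, not taken from the paper, and a trainable version would need something like a straight-through estimator:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(h, x, w):
    # Standard GRU update with scalar state and hand-set weights `w`
    # (a dict of floats; the names are illustrative).
    z = sigmoid(w["wz"] * x + w["uz"] * h)            # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h)            # reset gate
    h_tilde = math.tanh(w["wh"] * x + w["uh"] * (r * h))
    return (1.0 - z) * h + z * h_tilde

def bigru_step(h, x, w):
    # BIGRU-style variant: the reset gate is replaced by a *binary* input
    # gate, binarized here with a hard threshold for illustration.
    g = 1.0 if sigmoid(w["wg"] * x + w["ug"] * h) > 0.5 else 0.0
    z = sigmoid(w["wz"] * x + w["uz"] * h)
    h_tilde = math.tanh(w["wh"] * (g * x) + w["uh"] * h)
    return (1.0 - z) * h + z * h_tilde
```

When the binary gate outputs 0, the candidate state receives no contribution from the current token, which is the "skip this word" behavior the abstract describes.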

Processes ◽  
2020 ◽  
Vol 8 (9) ◽  
pp. 1155
Author(s):  
Yi-Wei Lu ◽  
Chia-Yu Hsu ◽  
Kuang-Chieh Huang

With the development of smart manufacturing, large numbers of sensors have been deployed to record the variables associated with production equipment and to detect abnormal conditions. This study focuses on the prediction of Remaining Useful Life (RUL). RUL prediction is part of predictive maintenance, which uses the degradation trend of a machine to predict when it will malfunction. Accurate RUL prediction not only reduces the consumption of manpower and materials, but also reduces the need for future maintenance. This study focuses on detecting faults as early as possible, before the machine needs to be replaced or repaired, to ensure the reliability of the system. Because it is difficult to extract meaningful features directly from sensor data, this study proposes a model based on an Autoencoder Gated Recurrent Unit (AE-GRU), in which the Autoencoder (AE) extracts the important features from the raw data and the Gated Recurrent Unit (GRU) selects the information from the sequences to forecast RUL. To evaluate the performance of the proposed AE-GRU model, an aircraft turbofan engine degradation simulation dataset provided by NASA was used and different recurrent neural networks were compared. The results demonstrate that the AE-GRU outperforms other recurrent neural networks, such as Long Short-Term Memory (LSTM) and plain GRU.
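The RUL target that such models regress is commonly constructed as a piecewise-linear label; a minimal sketch, assuming the capped-linear scheme widely used with the NASA turbofan (C-MAPSS) data — the cap of 125 cycles is a common convention in the literature, not a value from this paper:

```python
def rul_labels(total_cycles, max_rul=125):
    # Piecewise-linear RUL target: capped at `max_rul` early in life,
    # then decreasing linearly to 0 at the failure cycle.
    return [min(max_rul, total_cycles - t) for t in range(1, total_cycles + 1)]
```

For a run-to-failure trajectory of 130 cycles, the first labels sit at the 125-cycle cap, and the label reaches 0 at the final cycle.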


2020 ◽  
Vol 10 (4) ◽  
pp. 1273 ◽  
Author(s):  
Özlem BATUR DİNLER ◽  
Nizamettin AYDIN

Speech segment detection based on gated recurrent unit (GRU) recurrent neural networks for the Kurdish language was investigated in the present study. The novelties of the current research are the utilization of a GRU in Kurdish speech segment detection, the creation of a unique database for the Kurdish language, and the optimization of processing parameters for Kurdish speech segmentation. This study is the first attempt to find the optimal feature parameters of the model and to form a large Kurdish vocabulary dataset for speech segment detection based on consonant, vowel, and silence (C/V/S) discrimination. For this purpose, four window sizes and three window types with three hybrid feature vector techniques were used to describe the phoneme boundaries. Identification of the phoneme boundaries using a GRU recurrent neural network was performed with six different classification algorithms for the C/V/S discrimination. We demonstrate that the GRU model achieves outstanding speech segmentation performance for characterizing Kurdish acoustic signals. The experimental findings of the present study show the importance of effectively utilizing hybrid features, window sizes, window types, and classification models for segment detection in Kurdish speech.
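The window-size comparison above rests on splitting the signal into overlapping analysis frames; a minimal sketch of that framing step (the 25 ms window / 10 ms hop in the usage note are conventional defaults, not the study's settings):

```python
def frame_boundaries(n_samples, win_len, hop):
    # Return (start, end) sample indices for each full analysis window.
    # `win_len` and `hop` are in samples; partial trailing windows are dropped.
    frames = []
    start = 0
    while start + win_len <= n_samples:
        frames.append((start, start + win_len))
        start += hop
    return frames
```

For 16 kHz audio with a 25 ms window (400 samples) and a 10 ms hop (160 samples), one second of signal yields 98 full frames, each of which would then be described by a feature vector and classified as C, V, or S.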


Author(s):  
Yujia Qin ◽  
Fanchao Qi ◽  
Sicong Ouyang ◽  
Zhiyuan Liu ◽  
Cheng Yang ◽  
...  

2021 ◽  
Vol 50 (2) ◽  
pp. 20200339
Author(s):  
张少宇 Shaoyu Zhang ◽  
伍春晖 Chunhui Wu ◽  
熊文渊 Wenyuan Xiong

2018 ◽  
Vol 2018 ◽  
pp. 1-7 ◽  
Author(s):  
Xuanxin Liu ◽  
Fu Xu ◽  
Yu Sun ◽  
Haiyan Zhang ◽  
Zhibo Chen

Traditional image-centered methods of plant identification can be confounded by varying views, uneven illumination, and growth cycles. To tolerate the significant intraclass variance, convolutional recurrent neural networks (C-RNNs) are proposed for observation-centered plant identification that mimics human behavior. The C-RNN model is composed of two components: a convolutional neural network (CNN) backbone used as a feature extractor for images, and recurrent neural network (RNN) units built to synthesize multiview features from each image for the final prediction. Extensive experiments are conducted to explore the best combination of CNN and RNN. All models are trained end-to-end with 1 to 3 plant images of the same observation by truncated backpropagation through time. The experiments demonstrate that the combination of MobileNet and the Gated Recurrent Unit (GRU) offers the best trade-off between classification accuracy and computational overhead on the Flavia dataset. On the holdout test set, the mean 10-fold accuracy with 1, 2, and 3 input leaves reached 99.53%, 100.00%, and 100.00%, respectively. On the BJFU100 dataset, the C-RNN model achieves a classification rate of 99.65% with two-stage end-to-end training. The observation-centered method based on C-RNNs shows potential to further improve plant identification accuracy.
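The RNN side of this pipeline folds a variable number of per-view CNN feature vectors into one observation-level summary; a toy sketch of that recurrent fusion with hand-set scalar weights (not a trained GRU, and the gating form is illustrative only):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def fuse_views(view_feats, w=1.0, u=-0.5):
    # Recurrently fold per-view feature vectors into one running summary.
    # A per-dimension gate decides how much each new view overwrites the
    # state, so 1, 2, or 3 images all yield a single fixed-size vector.
    h = [0.0] * len(view_feats[0])
    for x in view_feats:
        for i in range(len(h)):
            z = sigmoid(w * x[i] + u * h[i])      # per-dimension update gate
            h[i] = (1.0 - z) * h[i] + z * math.tanh(x[i])
    return h
```

A classifier head would then map the fused vector to species logits; feeding the views in observation order is what lets a second or third leaf image refine the prediction.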


Author(s):  
Poorna Chandra Vemula* ◽  
Santhosh Reddy Chilaka ◽  
Mullapudi Raghu Vamsi ◽  
Jonnalagadda Praveen Reddy ◽  
...  

This paper analyzes the impact of continuously changing sentiment on the apparently volatile stock market. When an investor decides to buy or sell a stock, the decision depends heavily on expected rises or falls in the stock's price. In this paper, we examine the possibility of using sentiment polarities (positive versus negative) and emotions (joy, sadness, etc.) extracted from financial news or tweets to help predict stock price movements. This study uses a model-independent approach to uncover the hidden features of stock market data using different deep learning techniques such as Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM), and the Gated Recurrent Unit (GRU).
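The polarity features fed into such sequence models are often simple lexicon scores per news item or tweet; a minimal sketch, where the word lists are illustrative stand-ins rather than the paper's lexicon:

```python
def polarity_score(tokens, lexicon=None):
    # Average polarity of the lexicon words found in a tokenized headline
    # or tweet: +1 for positive terms, -1 for negative, 0.0 if none match.
    lexicon = lexicon or {"gain": 1, "rise": 1, "profit": 1,
                          "loss": -1, "fall": -1, "risk": -1}
    hits = [lexicon[t] for t in tokens if t in lexicon]
    return sum(hits) / len(hits) if hits else 0.0
```

A daily sequence of such scores (optionally alongside per-emotion counts) can then be concatenated with price features and fed to the RNN, LSTM, or GRU.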

