Gated recurrent unit based recurrent neural network for remaining useful life prediction of nonlinear deterioration process

2019 ◽  
Vol 185 ◽  
pp. 372-382 ◽  
Author(s):  
Jinglong Chen ◽  
Hongjie Jing ◽  
Yuanhong Chang ◽  
Qian Liu

2017 ◽  
Vol 240 ◽  
pp. 98-109 ◽  
Author(s):  
Liang Guo ◽  
Naipeng Li ◽  
Feng Jia ◽  
Yaguo Lei ◽  
Jing Lin

Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7761
Author(s):  
Tuan-Khai Nguyen ◽  
Zahoor Ahmad ◽  
Jong-Myon Kim

In this study, a scheme for remaining useful lifetime (RUL) prognosis from raw acoustic emission (AE) data is presented to predict a concrete structure's failure before it occurs, thereby possibly prolonging its service life and minimizing the risk of accidental damage. The deterioration process is portrayed by a health indicator (HI), which is constructed automatically from raw AE data by a stacked autoencoder deep neural network (SAE-DNN) that is pretrained and then fine-tuned. To enable the deep neural network to construct more accurate health indicator lines, a hit removal process based on a one-class support vector machine (OC-SVM), which has not been investigated in previous studies, is proposed to retain only the hits that contribute most to the portrayal of deterioration. The filtered set of hits is then used as the training labels for the deep neural network. Once the health indicator lines have been constructed, the health indicators are forwarded to a long short-term memory recurrent neural network (LSTM-RNN) for training and validation of the remaining useful life prediction, as this architecture can capture long-term dependencies even with a limited amount of data. Our prediction results show a significant improvement over a similar scheme without the hit removal process, as well as over other methods such as the gated recurrent unit recurrent neural network (GRU-RNN) and the simple recurrent neural network.
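The hit removal step described above can be sketched with scikit-learn's `OneClassSVM`: the model is fit on AE hit features, and only hits classified as inliers (consistent with the dominant deterioration trend) are kept for HI construction. This is a minimal illustrative sketch, not the authors' implementation; the feature set (amplitude, energy, duration) and all data here are synthetic stand-ins.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Hypothetical AE hit features (amplitude, energy, duration), synthetic stand-ins:
# a dominant population of informative hits plus a small cluster of spurious hits.
hits = rng.normal(0.0, 1.0, size=(200, 3))
spurious = rng.normal(5.0, 0.5, size=(10, 3))
X = np.vstack([hits, spurious])

# One-class SVM learns the boundary of the dominant hit population;
# nu bounds the fraction of training points allowed outside the boundary.
oc_svm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X)

# predict() returns +1 for inliers and -1 for outliers; keep only inliers.
mask = oc_svm.predict(X) == 1
filtered_hits = X[mask]
print(f"kept {filtered_hits.shape[0]} of {X.shape[0]} hits")
```

In the scheme above, `filtered_hits` would then serve as the cleaned hit set from which the SAE-DNN builds the health indicator line.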


2019 ◽  
Vol 1 (1) ◽  
pp. 19-27 ◽  
Author(s):  
Yimeng Zhai ◽  
Aidong Deng ◽  
Jing Li ◽  
Qiang Cheng ◽  
Wei Ren
