A Predictive Maintenance Model Using Recurrent Neural Networks

Author(s): Alberto Rivas, Jesús M. Fraile, Pablo Chamoso, Alfonso González-Briones, Inés Sittón, et al.

2022 ◽ Vol 62 ◽ pp. 450-462
Author(s): Tiago Zonta, Cristiano André da Costa, Felipe A. Zeiser, Gabriel de Oliveira Ramos, Rafael Kunst, et al.

IEEE Access ◽ 2019 ◽ Vol 7 ◽ pp. 178891-178902
Author(s): Michal Markiewicz, Maciej Wielgosz, Mikolaj Bochenski, Waldemar Tabaczynski, Tomasz Konieczny, et al.

Author(s): Petia Koprinkova-Hristova, Mincho Hadjiski, Lyubka Doukovska, Simeon Beloreshki

2020
Author(s): Dean Sumner, Jiazhen He, Amol Thakkar, Ola Engkvist, Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models compared to non-augmented baselines. Here, we propose a novel data augmentation method we call “Levenshtein augmentation”, which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation demonstrated increased performance over both non-augmented data and conventional SMILES-randomization augmentation when used to train baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as “attentional gain”: an enhancement in the underlying network's ability to recognize molecular motifs.
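The abstract describes the technique only at a high level. As a rough illustration (not the authors' implementation), the sketch below combines RDKit's random SMILES output with a plain edit-distance function to pick, for each reactant-product pair, the randomized reactant SMILES closest to the product string. The function names and the toy molecules are ours, and the paper's actual pairing criterion based on local sub-sequence similarity may differ in detail.

```python
# Illustrative sketch of Levenshtein-guided SMILES augmentation.
# Assumes RDKit is installed; helper names and example data are hypothetical.
from rdkit import Chem


def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]


def randomized_smiles(smiles: str, n: int = 20) -> list:
    """Generate n random (non-canonical) SMILES renderings of a molecule."""
    mol = Chem.MolFromSmiles(smiles)
    return [Chem.MolToSmiles(mol, canonical=False, doRandom=True)
            for _ in range(n)]


def levenshtein_augmented_pair(reactant: str, product: str, n: int = 20):
    """Pick the reactant SMILES variant closest (by edit distance) to the
    product SMILES, so the training pair shares local sub-sequences."""
    candidates = randomized_smiles(reactant, n)
    best = min(candidates, key=lambda s: levenshtein(s, product))
    return best, product


# Toy esterification-style pair (hypothetical data, not from the paper).
reactant, product = levenshtein_augmented_pair("CC(=O)O", "CC(=O)OC")
print(reactant, ">>", product)
```

Under this reading, conventional SMILES randomization would keep a random variant regardless of the product, whereas the Levenshtein-guided selection biases training pairs toward reactant-product strings that already overlap, which is one plausible route to the attentional gain the abstract reports.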

