Recurrent neural networks for complicated seismic dynamic response prediction of a slope system

2021, pp. 106198
Author(s): Yu Huang, Xu Han, Liuyuan Zhao
Sensors, 2021, Vol. 21 (20), pp. 6719
Author(s): Omar Rodríguez-Abreo, Francisco Antonio Castillo Velásquez, Jonny Paul Zavala de Paz, José Luis Martínez Godoy, Crescencio Garcia Guendulain

In the present work, a neural-network-based dynamic response prediction system is presented that estimates the response of multiple systems remotely, without sensors. For this, a set of neural networks and the step response of a stable system are used. Six basic characteristics of the dynamic response were extracted and used to calculate a transfer function equivalent to the dynamic model. A database of 1,500,000 data points was created to train the network system on the basic characteristics of the dynamic response and the transfer function that produces it. The contribution of this work lies in the use of neural network systems to estimate the behavior of any stable system, which has multiple advantages over typical linear regression techniques: although the training process is offline, the estimation can be performed in real time. The results show an average mean-squared error (MSE) of 2% for the set of networks. In addition, the system was tested on physical systems to assess its performance with practical examples, achieving a precise estimation of the output with an error of less than 1% for simulated systems and high performance on real signals with the typical noise introduced by the acquisition system.
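The abstract does not enumerate the six characteristics. A minimal Python sketch, assuming six common step-response features (steady-state gain, delay time, rise time, peak time, percent overshoot, and settling time) and a unit-step input, might extract them like this; the function names and the exact feature set are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: extracting six step-response characteristics.
# The feature set (gain, delay, rise, peak time, overshoot, settling)
# is an assumption; the paper does not enumerate its six features.
import numpy as np
from scipy import signal

def step_response_features(t, y):
    """Extract common step-response characteristics from a sampled signal."""
    y_ss = np.mean(y[-max(1, len(y) // 20):])        # steady-state value (last 5% of samples)
    gain = y_ss                                      # unit-step input assumed
    delay = t[np.argmax(y >= 0.10 * y_ss)]           # time to reach 10% of final value
    rise = t[np.argmax(y >= 0.90 * y_ss)] - delay    # 10%-90% rise time
    i_peak = int(np.argmax(y))
    peak_time = t[i_peak]
    overshoot = max(0.0, (y[i_peak] - y_ss) / y_ss * 100.0)  # percent overshoot
    outside = np.where(np.abs(y - y_ss) > 0.02 * abs(y_ss))[0]  # outside 2% band
    settling = t[outside[-1]] if outside.size else t[0]
    return np.array([gain, delay, rise, peak_time, overshoot, settling])

# Example: features of an underdamped second-order system G(s) = 4 / (s^2 + s + 4)
sys = signal.TransferFunction([4.0], [1.0, 1.0, 4.0])
t, y = signal.step(sys, N=2000)
print(step_response_features(t, y))
```

In a pipeline like the one described, such feature vectors would serve as network inputs, with the transfer-function coefficients as targets.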


2020
Author(s): Dean Sumner, Jiazhen He, Amol Thakkar, Ola Engkvist, Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models compared to non-augmented baselines. Here, we propose a novel data augmentation method we call "Levenshtein augmentation", which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation demonstrated increased performance over non-augmented data, and over data augmented with conventional SMILES randomization, when used for training baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as "attentional gain": an enhancement in the pattern recognition capabilities of the underlying network with respect to molecular motifs.
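The abstract does not give implementation details. One plausible reading, sketched below in Python, is to generate several randomized reactant SMILES (here via RDKit's non-canonical output) and keep the candidate with the smallest Levenshtein edit distance to the product SMILES; this selection rule, the function names, and the example reaction are assumptions for illustration only.

```python
# Hypothetical sketch of "Levenshtein augmentation": among several
# randomized reactant SMILES, keep the one closest (by edit distance)
# to the product SMILES. The selection rule is an assumption; the
# abstract describes only "local sub-sequence similarity".
from rdkit import Chem

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def levenshtein_pair(reactant_smiles: str, product_smiles: str, n_samples: int = 50) -> str:
    """Return the randomized reactant SMILES most similar to the product SMILES."""
    mol = Chem.MolFromSmiles(reactant_smiles)
    candidates = {Chem.MolToSmiles(mol, canonical=False, doRandom=True)
                  for _ in range(n_samples)}
    return min(candidates, key=lambda s: levenshtein(s, product_smiles))

# Example: an esterification written as a single reactants->product pair
print(levenshtein_pair("CC(=O)O.OCC", "CC(=O)OCC"))
```

Aligning reactant and product strings this way would make shared sub-sequences line up token-for-token, which is consistent with the reported "attentional gain" in sequence models trained on such pairs.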


Author(s): Faisal Ladhak, Ankur Gandhe, Markus Dreyer, Lambert Mathias, Ariya Rastrow, ...
