Active flutter suppression for a three-surface transport aircraft by recurrent neural networks

Author(s):  
Mattia Mattaboni ◽  
Giuseppe Quaranta ◽  
Paolo Mantegazza

2000 ◽ Vol 23 (6) ◽ pp. 1030-1036
Author(s):  
Franco Bernelli-Zazzera ◽  
Paolo Mantegazza ◽  
Giovanni Mazzoni ◽  
Matteo Rendina

2016 ◽ Vol 2016 ◽ pp. 1-9
Author(s):  
Haojie Liu ◽  
Yonghui Zhao ◽  
Haiyan Hu

The paper presents a digital adaptive controller based on recurrent neural networks for the active flutter suppression of a wing structure over a wide transonic range. The basic idea behind the controller is as follows. First, the parameters of the recurrent neural networks, such as the number of neurons and the learning rate, are determined so as to suppress flutter under a specific flight condition in the transonic regime. Then, the controller adapts itself to a new flight condition by updating the synaptic weights of the networks online via the real-time recurrent learning algorithm. Hence, the controller is able to suppress the aeroelastic instability of the wing structure over a range of transonic flight conditions. To demonstrate the effectiveness and robustness of the controller, the aeroservoelastic model of a typical fighter wing with a tip missile was established and a single-input/single-output controller was synthesized. Open- and closed-loop aeroservoelastic simulations demonstrated the efficacy of the adaptive controller with respect to changes of flight parameters in the transonic regime.
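The abstract gives no implementation detail for the online update; the following is a minimal sketch, in Python with NumPy, of how the real-time recurrent learning (RTRL) algorithm it mentions can adjust the synaptic weights of a small single-layer recurrent network at every time step. The architecture, the scalar output, and all names (RTRLController, step, Wx, Wh, Wo) are illustrative assumptions, not the authors' design.

import numpy as np

class RTRLController:
    # Single-layer recurrent net trained online with real-time
    # recurrent learning (RTRL); shapes and names are illustrative.

    def __init__(self, n_in, n_hidden, lr=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(0.0, 0.1, (n_hidden, n_in))      # input weights
        self.Wh = rng.normal(0.0, 0.1, (n_hidden, n_hidden))  # recurrent weights
        self.Wo = rng.normal(0.0, 0.1, (1, n_hidden))         # output weights
        self.lr = lr
        self.h = np.zeros(n_hidden)
        # Sensitivities dh/dWh and dh/dWx, carried forward in time.
        self.P_Wh = np.zeros((n_hidden, n_hidden, n_hidden))
        self.P_Wx = np.zeros((n_hidden, n_hidden, n_in))

    def step(self, x, target):
        # One sampling instant: forward pass, then an online gradient
        # step on the instantaneous squared error (y - target)^2 / 2.
        h_prev = self.h
        h = np.tanh(self.Wh @ h_prev + self.Wx @ x)
        y = (self.Wo @ h).item()          # scalar actuator command
        d = 1.0 - h ** 2                  # derivative of tanh
        n = h.size

        # RTRL recursion:
        # dh_k/dW[i,j] = d_k * (delta_ki * u_j + sum_l Wh[k,l] * dh_l/dW[i,j])
        P_Wh = np.einsum('kl,lij->kij', self.Wh, self.P_Wh)
        P_Wh[np.arange(n), np.arange(n), :] += h_prev
        self.P_Wh = d[:, None, None] * P_Wh
        P_Wx = np.einsum('kl,lij->kij', self.Wh, self.P_Wx)
        P_Wx[np.arange(n), np.arange(n), :] += x
        self.P_Wx = d[:, None, None] * P_Wx

        # Chain rule through the current hidden state.
        e = y - target
        dL_dh = e * self.Wo.ravel()
        self.Wo -= self.lr * e * h[None, :]
        self.Wh -= self.lr * np.einsum('k,kij->ij', dL_dh, self.P_Wh)
        self.Wx -= self.lr * np.einsum('k,kij->ij', dL_dh, self.P_Wx)

        self.h = h
        return y

In a closed aeroservoelastic loop, x would collect sensor measurements (for example, wing accelerations), y would command a control surface, and target would come from whatever reference scheme drives the adaptation. RTRL fits this setting because, unlike backpropagation through time, it needs no stored trajectory and can therefore update the weights online at every sampling instant.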


2020
Author(s):  
Dean Sumner ◽  
Jiazhen He ◽  
Amol Thakkar ◽  
Ola Engkvist ◽  
Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models compared to non-augmented baselines. Here, we propose a novel data augmentation method, "Levenshtein augmentation", which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation yielded increased performance over both non-augmented data and conventional SMILES randomization when used to train the baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as "attentional gain": an enhancement of the underlying network's ability to recognize molecular motifs.
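The abstract names the pairing criterion but not the procedure; below is a minimal sketch, in Python, of one plausible reading: randomized reactant SMILES are ranked by Levenshtein (edit) distance to the product SMILES, and the closest ones are kept as training pairs. It assumes RDKit is available for SMILES randomization; the function names, sample count, and selection rule are illustrative, not the authors' method.

from rdkit import Chem

def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance, O(len(a) * len(b)).
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,               # deletion
                            curr[j - 1] + 1,           # insertion
                            prev[j - 1] + (ca != cb))) # substitution
        prev = curr
    return prev[-1]

def levenshtein_augment(reactant_smiles, product_smiles, n_samples=50, k=5):
    # Draw randomized (but chemically equivalent) SMILES strings for
    # the reactant and keep the k strings closest to the product SMILES.
    mol = Chem.MolFromSmiles(reactant_smiles)
    candidates = {Chem.MolToSmiles(mol, doRandom=True) for _ in range(n_samples)}
    ranked = sorted(candidates, key=lambda s: levenshtein(s, product_smiles))
    return [(s, product_smiles) for s in ranked[:k]]

Keeping the k lowest-distance randomizations is one way to bias training pairs toward reactant strings that locally resemble their products, which is the sub-sequence similarity idea the abstract describes.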

