Long-Range MIMO Channel Prediction Using Recurrent Neural Networks

Author(s): Wei Jiang, Mathias Strufe, Hans Dieter Schotten

2020, Vol 1 (1)
Author(s): Joel Poncha Lemayian, Jehad M. Hamamreh

Author(s): Annie K Lamar

We investigate the generation of metrically accurate Homeric poetry using recurrent neural networks (RNNs). We assess two models: a basic encoder-decoder RNN and the hierarchical recurrent encoder-decoder (HRED) model. We assess the quality of the generated lines of poetry using quantitative metrical analysis and expert evaluation. This evaluation reveals that while the basic encoder-decoder is able to capture complex poetic meter, it underperforms in terms of semantic coherence. The HRED model, by contrast, produces more semantically coherent lines of poetry but is unable to capture the meter. Our research highlights the importance of expert evaluation and suggests that future research should focus on encoder-decoder models that balance various types of input, both immediate and long-range.
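
As an illustration of the first of the two models, here is a minimal sketch of a basic encoder-decoder RNN that maps one line of verse to the next. It assumes PyTorch with GRU units; the dimensions, vocabulary handling, and the train_step helper are illustrative choices, not the authors' implementation.

```python
# Minimal encoder-decoder RNN sketch (sizes and names are assumptions).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                  # src: (batch, src_len) token ids
        _, hidden = self.rnn(self.embed(src))
        return hidden                        # (1, batch, hid_dim) summary of the context line

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tgt, hidden):          # tgt: (batch, tgt_len) shifted-right targets
        output, hidden = self.rnn(self.embed(tgt), hidden)
        return self.out(output), hidden      # per-step vocabulary logits

def train_step(encoder, decoder, optimizer, src, tgt_in, tgt_out, pad_idx=0):
    """Encode line n, decode line n+1, minimize token cross-entropy."""
    optimizer.zero_grad()
    logits, _ = decoder(tgt_in, encoder(src))
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), tgt_out.reshape(-1),
        ignore_index=pad_idx)
    loss.backward()
    optimizer.step()
    return loss.item()
```

The HRED variant adds a further context-level RNN that consumes the per-line encoder states across many preceding lines, which matches the trade-off reported above: more long-range semantic coherence, weaker line-level meter.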

2020
Author(s): Dean Sumner, Jiazhen He, Amol Thakkar, Ola Engkvist, Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models over non-augmented baselines. Here, we propose a novel data augmentation method, which we call "Levenshtein augmentation", that considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation increased performance over both non-augmented data and conventional SMILES-randomization augmentation when used to train the baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as attentional gain: an enhancement in the pattern-recognition capabilities of the underlying network for molecular motifs.
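
The abstract does not spell out the pairing procedure, but the core idea, keeping the reactant SMILES rewriting that is most string-similar to the product so that shared motifs align as shared sub-sequences, can be sketched as follows. This is a minimal illustration assuming RDKit for SMILES randomization; the function names, the candidate count n, and the example reaction are invented for illustration and may differ from the authors' method.

```python
# Sketch of Levenshtein-style augmentation for reaction pairs (illustrative).
from rdkit import Chem

def levenshtein(a, b):
    """Plain dynamic-programming Levenshtein edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def best_pair(reactant_smiles, product_smiles, n=10):
    """Among n randomized rewritings of the reactant, keep the one with the
    smallest edit distance to the product SMILES."""
    mol = Chem.MolFromSmiles(reactant_smiles)
    variants = [Chem.MolToSmiles(mol, doRandom=True) for _ in range(n)]
    return min(variants, key=lambda s: levenshtein(s, product_smiles))

# Hypothetical esterification pair: the selected reactant form lines up
# its acetyl motif with the product's, easing sequence-to-sequence learning.
print(best_pair("CC(=O)O", "CC(=O)OCC"), "->", "CC(=O)OCC")
```

The design intuition is that a string-aligned training pair lets attention heads latch onto conserved sub-sequences, which is one plausible reading of the "attentional gain" effect described above.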
