Convolution Recurrent Neural Networks Based Dynamic Transboundary Air Pollution Prediction

Author(s):  
Peijiang Zhao ◽  
Koji Zettsu

Author(s):  
Fong Iat Hang ◽  
Simon Fong

Air pollution poses a serious threat to human health, and its prediction is receiving growing attention: accurate forecasts help people plan their outdoor activities and protect their health. In this article, long short-term memory (LSTM) recurrent neural networks were used to predict the future concentrations of air pollutants in Macau, using both meteorological data and air-pollutant concentration data. In Macau, some air quality monitoring stations (AQMSs) have little observed data overall, and some lack observations for certain types of air pollutants. Therefore, transfer learning with pre-trained neural networks was used to help AQMSs with little observed data obtain networks with high prediction accuracy. In most cases, LSTM RNNs initialized with transfer learning achieved higher prediction accuracy and required less training time than randomly initialized recurrent neural networks.
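The abstract does not give implementation details, so the following is only a rough illustration of the transfer-learning idea it describes (pre-train on a data-rich station, then warm-start training for a data-poor one). A simple linear autoregressive model stands in for the LSTM, and the synthetic series, lag length, learning rate, and epoch count are all assumptions made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_series(n, coefs, noise=0.05):
    # Synthetic pollutant-like series following an AR(3) process.
    x = np.zeros(n)
    for t in range(3, n):
        x[t] = coefs @ x[t - 3:t] + noise * rng.standard_normal()
    return x

def windows(x, lag=3):
    # Sliding windows: predict x[t] from the previous `lag` values.
    X = np.stack([x[i:i + lag] for i in range(len(x) - lag)])
    return X, x[lag:]

def finetune(X, y, w0, lr=1.0, epochs=10):
    # Plain gradient descent on mean squared error, starting from w0.
    w = w0.copy()
    for _ in range(epochs):
        w -= lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

def mse(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

# Data-rich "source" station vs. a similar but data-poor "target" station.
Xs, ys = windows(make_series(2000, np.array([0.2, 0.3, 0.4])))
Xt, yt = windows(make_series(60, np.array([0.25, 0.3, 0.35])))

w_src = np.linalg.lstsq(Xs, ys, rcond=None)[0]    # pre-train on source data
w_transfer = finetune(Xt, yt, w0=w_src)           # warm start (transfer)
w_scratch = finetune(Xt, yt, w0=np.zeros(3))      # train from scratch

print(mse(Xt, yt, w_transfer), mse(Xt, yt, w_scratch))
```

With so few target samples, the warm-started model typically reaches lower error in the same number of updates, which mirrors the accuracy and training-time advantage the abstract reports.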


2020 ◽  
Author(s):  
Dean Sumner ◽  
Jiazhen He ◽  
Amol Thakkar ◽  
Ola Engkvist ◽  
Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models over non-augmented baselines. Here, we propose a novel data augmentation method, "Levenshtein augmentation", which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation increased performance over both non-augmented data and conventional SMILES-randomization augmentation when used for training the baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as "attentional gain": an enhancement in the underlying network's ability to recognize molecular motifs.

