Learning Frame-Level Recurrent Neural Networks Representations for Query-by-Example Spoken Term Detection on Mobile Devices

Author(s):  
Ziwei Zhu ◽  
Zhiyong Wu ◽  
Runnan Li ◽  
Yishuang Ning ◽  
Helen Meng
Author(s):  
Juan Manuel Rodriguez ◽  
Alejandro Zunino ◽  
Antonela Tommasel ◽  
Cristian Mateos

Nowadays, mobile devices are ubiquitous in modern life, as they allow users to perform virtually any task, from checking e-mails to playing video games. However, many of these operations are conditioned by the state of the mobile device. Therefore, knowing the current state of mobile devices and predicting their future states is a crucial issue in several domains, such as context-aware applications or ad-hoc networking. Several authors have proposed different machine learning methods for predicting some aspect of a mobile device's future state. This chapter aims at predicting a mobile device's battery charge, whether it is plugged into A/C power, and its screen and WiFi states. To fulfil this goal, the current state of a mobile device can be regarded as the consequence of the previous sequence of states, meaning that future states can be predicted from known previous ones. This chapter focuses on using recurrent neural networks for predicting future states.
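The abstract above does not specify the network architecture, so the following is only an illustrative sketch of the general idea: a single recurrent cell consumes a sequence of past device-state vectors (battery level, plugged-in flag, screen state, WiFi state) and a linear readout emits an estimate of the next state. All weights, sizes, and the feature layout here are assumptions for illustration, not the authors' model.

```python
import numpy as np

# Hypothetical feature layout: [battery_level, plugged_in, screen_on, wifi_on]
rng = np.random.default_rng(0)
n_features, n_hidden = 4, 8

W_xh = rng.normal(scale=0.1, size=(n_hidden, n_features))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))    # hidden -> hidden
W_hy = rng.normal(scale=0.1, size=(n_features, n_hidden))  # hidden -> output

def predict_next_state(states):
    """Run a vanilla RNN over past states; return a next-state estimate."""
    h = np.zeros(n_hidden)
    for x in states:
        h = np.tanh(W_xh @ x + W_hh @ h)   # recurrent update
    return W_hy @ h                        # linear readout of the last hidden state

history = np.array([
    [0.90, 1.0, 1.0, 1.0],   # charging, screen on, WiFi on
    [0.95, 1.0, 0.0, 1.0],   # screen turned off
    [1.00, 0.0, 0.0, 0.0],   # unplugged, WiFi off
])
pred = predict_next_state(history)
print(pred.shape)  # one estimate per state feature
```

In practice the weights would be trained on logged state sequences (e.g. by backpropagation through time); the untrained weights here only demonstrate the data flow.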




2020 ◽  
Author(s):  
Dean Sumner ◽  
Jiazhen He ◽  
Amol Thakkar ◽  
Ola Engkvist ◽  
Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models compared to non-augmented baselines. Here, we propose a novel data augmentation method we call "Levenshtein augmentation", which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: transformer and sequence-to-sequence recurrent neural networks with attention. Levenshtein augmentation demonstrated increased performance over both non-augmented data and data augmented with conventional SMILES randomization when used for training the baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as "attentional gain", an enhancement in the pattern recognition capabilities of the underlying network with respect to molecular motifs.
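The abstract does not detail how training pairs are constructed, but the augmentation is named after the Levenshtein (edit) distance between reactant and product SMILES strings. The sketch below shows only that underlying primitive, a standard dynamic-programming edit distance; the example SMILES pair is a toy illustration, not data from the paper.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution (0 if equal)
        prev = curr
    return prev[-1]

# Toy reactant/product SMILES pair (illustrative only):
# ethanol -> ethyl acetate requires inserting the ester fragment.
reactant, product = "CCO", "CCOC(C)=O"
print(levenshtein(reactant, product))  # 6
```

A pairing scheme based on this distance would, for instance, prefer randomized reactant SMILES whose sub-sequences stay close (in edit distance) to the product SMILES, keeping locally similar fragments aligned across the training pair.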


Author(s):  
Faisal Ladhak ◽  
Ankur Gandhe ◽  
Markus Dreyer ◽  
Lambert Mathias ◽  
Ariya Rastrow ◽  
...  
