Unsupervised Transfer Learning for Spoken Language Understanding in Intelligent Agents

Author(s): Aditya Siddhant, Anuj Goyal, Angeliki Metallinou

User interaction with voice-powered agents generates large amounts of unlabeled utterances. In this paper, we explore techniques to efficiently transfer knowledge from these unlabeled utterances to improve model performance on Spoken Language Understanding (SLU) tasks. We use Embeddings from Language Models (ELMo) to take advantage of unlabeled data by learning contextualized word representations. Additionally, we propose ELMo-Light (ELMoL), a faster and simpler unsupervised pre-training method for SLU. Our findings suggest that unsupervised pre-training on a large corpus of unlabeled utterances leads to significantly better SLU performance than training from scratch, and that it can even outperform conventional supervised transfer. We further show that the gains from unsupervised transfer can be improved by combining it with supervised transfer. The improvements are more pronounced in low-resource settings: with only 1000 labeled in-domain samples, our techniques match the performance of training from scratch on 10-15x more labeled in-domain data.
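As a rough illustration of the first technique, the sketch below uses pretrained ELMo representations (via the public AllenNLP library) as frozen contextualized features feeding a simple intent classifier. This is a minimal sketch, not the authors' code: the small-ELMo checkpoint URLs point to the public AllenNLP model, and the BiLSTM encoder, intent head, and label count are illustrative assumptions (ELMoL itself is not reproduced here).

import torch
import torch.nn as nn
from allennlp.modules.elmo import Elmo, batch_to_ids

# Public "small" ELMo checkpoint from AllenNLP (output dimension 256).
OPTIONS = ("https://allennlp.s3.amazonaws.com/elmo/2x1024_128_2048cnn_1xhighway/"
           "elmo_2x1024_128_2048cnn_1xhighway_options.json")
WEIGHTS = ("https://allennlp.s3.amazonaws.com/elmo/2x1024_128_2048cnn_1xhighway/"
           "elmo_2x1024_128_2048cnn_1xhighway_weights.hdf5")

class ElmoIntentClassifier(nn.Module):
    """Hypothetical SLU intent classifier on top of pretrained ELMo features."""

    def __init__(self, num_intents: int, elmo_dim: int = 256):
        super().__init__()
        # One mixed ELMo layer; its weights come from language-model
        # pre-training on unlabeled text, i.e. the unsupervised transfer
        # step the abstract describes.
        self.elmo = Elmo(OPTIONS, WEIGHTS, num_output_representations=1,
                         dropout=0.5)
        self.encoder = nn.LSTM(elmo_dim, 128, batch_first=True,
                               bidirectional=True)
        self.intent_head = nn.Linear(2 * 128, num_intents)

    def forward(self, sentences):
        # sentences: list of tokenized utterances, e.g. [["play", "jazz"]].
        char_ids = batch_to_ids(sentences)                 # (batch, seq, 50)
        elmo_out = self.elmo(char_ids)
        embeddings = elmo_out["elmo_representations"][0]   # (batch, seq, 256)
        _, (h_n, _) = self.encoder(embeddings)
        # Concatenate final forward/backward LSTM states as the utterance vector.
        utterance = torch.cat([h_n[0], h_n[1]], dim=-1)    # (batch, 256)
        return self.intent_head(utterance)

# Illustrative usage with a made-up 7-intent label set.
model = ElmoIntentClassifier(num_intents=7)
logits = model([["play", "some", "jazz"], ["set", "a", "timer"]])

Keeping the ELMo weights frozen and training only the encoder and head is one common way to exploit the unlabeled-data pre-training in a low-resource labeled setting; whether to fine-tune the embeddings is a design choice the sketch leaves open.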
