Natural Language Generation as Incremental Planning Under Uncertainty: Adaptive Information Presentation for Statistical Dialogue Systems

2014 ◽  
Vol 22 (5) ◽  
pp. 979-994 ◽  
Author(s):  
Verena Rieser ◽  
Oliver Lemon ◽  
Simon Keizer

Author(s):
Fei Mi ◽  
Minlie Huang ◽  
Jiyong Zhang ◽  
Boi Faltings

Natural language generation (NLG) is an essential component of task-oriented dialogue systems. Despite the recent success of neural approaches to NLG, they are typically developed for particular domains with rich annotated training examples. In this paper, we study NLG in a low-resource setting, generating sentences for new scenarios from only a handful of training examples. We formulate the problem from a meta-learning perspective and propose a generalized optimization-based approach (Meta-NLG) built on the well-known model-agnostic meta-learning (MAML) algorithm. Meta-NLG defines a set of meta-tasks and directly incorporates the objective of adapting to new low-resource NLG tasks into the meta-learning optimization process. Extensive experiments are conducted on a large multi-domain dataset (MultiWOZ) with diverse linguistic variations. We show that Meta-NLG significantly outperforms other training procedures in various low-resource configurations. We analyze the results and demonstrate that Meta-NLG adapts quickly and effectively to low-resource situations.
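The MAML-style inner/outer optimization that Meta-NLG builds on can be sketched on a toy problem. The code below is a hedged illustration, not the paper's implementation: it runs first-order MAML on hypothetical linear-regression "tasks" (standing in for low-resource NLG domains); all names, data, and hyperparameters are invented for the example.

```python
import numpy as np

def loss_grad(w, X, y):
    # Squared-error loss of the linear model X @ w, and its gradient in w.
    err = X @ w - y
    return float(np.mean(err ** 2)), 2 * X.T @ err / len(y)

def maml_step(w, tasks, inner_lr=0.05, meta_lr=0.01):
    # One meta-update: adapt to each task with a single inner gradient step
    # on its support set, then move the shared initialization toward
    # parameters that adapt well (first-order MAML approximation).
    meta_grad = np.zeros_like(w)
    for (X_tr, y_tr), (X_val, y_val) in tasks:
        _, g = loss_grad(w, X_tr, y_tr)
        w_task = w - inner_lr * g               # task-specific adaptation
        _, g_val = loss_grad(w_task, X_val, y_val)
        meta_grad += g_val                      # outer-loop signal
    return w - meta_lr * meta_grad / len(tasks)

rng = np.random.default_rng(0)

def make_task(slope):
    # A toy "domain": noiseless linear data with a task-specific slope,
    # split into a small support (train) and query (validation) set.
    X, Xv = rng.normal(size=(8, 1)), rng.normal(size=(8, 1))
    return (X, slope * X[:, 0]), (Xv, slope * Xv[:, 0])

tasks = [make_task(s) for s in (0.5, 1.0, 1.5)]
w = np.zeros(1)
for _ in range(200):
    w = maml_step(w, tasks)
# w ends up near the "center" of the task family, so a single inner
# gradient step is enough to fit any new, similar low-resource task.
```

The design point mirrors the abstract: the adaptation step is inside the training objective, so the learned initialization is judged by post-adaptation (query-set) loss rather than by its direct fit to any one task.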


2018 ◽  
Author(s):  
Bo-Hsiang Tseng ◽  
Florian Kreyssig ◽  
Paweł Budzianowski ◽  
Iñigo Casanueva ◽  
Yen-Chen Wu ◽  
...  

2015 ◽  
Author(s):  
Tsung-Hsien Wen ◽  
Milica Gasic ◽  
Nikola Mrkšić ◽  
Pei-Hao Su ◽  
David Vandyke ◽  
...  

2018 ◽  
Author(s):  
Sourab Mangrulkar ◽  
Suhani Shrivastava ◽  
Veena Thenkanidiyoor ◽  
Dileep Aroor Dinesh
