slot filling
Recently Published Documents


TOTAL DOCUMENTS: 158 (FIVE YEARS: 98)
H-INDEX: 10 (FIVE YEARS: 4)

Author(s): Saurav Manchanda, Mohit Sharma, George Karypis
2021, Vol 11 (22), pp. 10675
Author(s): Yinpei Dai, Yichi Zhang, Hong Liu, Zhijian Ou, Yi Huang, ...

Slot filling is a crucial component of task-oriented dialog systems: it parses user utterances into semantic concepts called slots. An ontology is defined by the collection of slots and the values that each slot can take. The most common practice, treating slot filling as a sequence labeling task, suffers from two main drawbacks. First, the ontology is usually pre-defined and fixed, so the model cannot detect new labels for unseen slots. Second, the one-hot encoding of slot labels ignores the correlations between semantically similar slots, which makes it difficult to share knowledge learned across different domains. To address these problems, we propose a new model called the elastic conditional random field (eCRF), in which each slot is represented by the embedding of its natural-language description and modeled by a CRF layer. The eCRF can detect new slot values whenever a language description is available for the slot. Our experiments show that eCRFs outperform existing models on both in-domain and cross-domain tasks, especially in predicting unseen slots and values.
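The core idea, scoring each token against the encoding of every slot's natural-language description so that a new slot only requires a new description, can be sketched as follows. This is a minimal PyTorch illustration, not the authors' exact architecture: the class name SlotDescEmissions, the linear projection, and the dot-product scoring are assumptions, and a generic pretrained encoder is presumed to produce the input states.

# Minimal sketch (assumed architecture): emission scores are dot products
# between token encodings and slot-description encodings, so an unseen slot
# can be handled simply by encoding a description for it.
import torch
import torch.nn as nn

class SlotDescEmissions(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, token_states: torch.Tensor, desc_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden) encoder states for the utterance
        # desc_states:  (num_slots, hidden) pooled encodings of slot descriptions
        # Returns emission scores of shape (batch, seq_len, num_slots).
        return torch.einsum("bth,sh->bts", self.proj(token_states), desc_states)

These emission scores would then feed a standard linear-chain CRF (e.g., the pytorch-crf package) for training and Viterbi decoding, which is where the transition structure the abstract describes would live.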


2021, pp. 1-12
Author(s): Pengfei Wei, Bi Zeng, Wenxiong Liao

Intent detection and slot filling are two central tasks in a spoken language understanding (SLU) system. To model the two tasks jointly, many joint models based on deep neural networks have been proposed recently and have achieved excellent results. Graph neural networks, meanwhile, have achieved strong results in computer vision. We therefore combine these two lines of work and propose a new joint model with a wheel-graph attention network (Wheel-GAT), which directly models the interrelated connections between single-intent detection and slot filling. To construct a graph structure over an utterance, we create intent nodes, slot nodes, and directed edges. Intent nodes provide utterance-level semantic information for slot filling, while slot nodes provide local keyword information for intent detection. The two tasks thus promote each other and are trained jointly end to end. Experiments show that our proposed approach outperforms multiple baselines on the ATIS and SNIPS datasets. We also demonstrate that using the bidirectional encoder representations from transformers (BERT) model further boosts SLU performance.
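A toy version of the hub-and-spoke message passing, with the intent node as the hub attending over per-token slot nodes and then broadcasting back, might look like the sketch below. The single attention head, the residual updates, and the class name WheelAttentionLayer are illustrative assumptions; the paper's wheel graph also connects adjacent slot nodes, which this sketch omits.

import torch
import torch.nn as nn
import torch.nn.functional as F

class WheelAttentionLayer(nn.Module):
    # Illustrative hub-and-spoke attention, not the paper's exact layer.
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, intent_node: torch.Tensor, slot_nodes: torch.Tensor):
        # intent_node: (batch, dim); slot_nodes: (batch, seq_len, dim)
        # Spokes -> hub: the intent node attends over all slot (token) nodes.
        q = self.q(intent_node).unsqueeze(1)                       # (B, 1, D)
        k, v = self.k(slot_nodes), self.v(slot_nodes)              # (B, T, D)
        attn = F.softmax(q @ k.transpose(1, 2) / k.size(-1) ** 0.5, dim=-1)
        intent_out = intent_node + (attn @ v).squeeze(1)           # residual update
        # Hub -> spokes: each slot node receives the updated intent state,
        # giving slot filling access to utterance-level semantics.
        slot_out = slot_nodes + intent_out.unsqueeze(1)
        return intent_out, slot_out

Stacking a few such layers would let intent and slot representations refine each other before the final classification heads, which is the mutual-promotion effect the abstract describes.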


2021, Vol 2021, pp. 1-9
Author(s): Hui Yanli

To address the poor recognition of rare slot values in spoken language, which hurts the accuracy of spoken language understanding, a deep-learning-based spoken language understanding method is designed. Local features of the semantic text are extracted and classified so that the classification results match the dialogue task, and an intent recognition algorithm assigns each datum a corresponding intent label to complete the semantic slot filling task. An attention mechanism is applied to recognizing rare slot-value information: attention weights over the hidden states and the corresponding slot features are computed, and the updated slot value is used to represent the tracking state. An auxiliary gate unit is constructed between the upper and lower slots of the historical dialogue, and word vectors are trained with deep learning to complete the spoken language understanding task. Simulation results show that the proposed method supports multiple rounds of man-machine spoken dialogue and, compared with spoken language understanding methods based on recurrent networks, context information, and label decomposition, achieves higher accuracy and F1 scores, giving it greater practical value.
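One plausible reading of the attention-plus-gate update, sketched under explicit assumptions since the abstract does not pin down the exact wiring: the slot embedding queries the encoder's hidden states, and a sigmoid gate (standing in for the "auxiliary gate unit") mixes the attended summary with the previous turn's slot state. The class name GatedSlotTracker and all dimensions are hypothetical.

# Schematic slot-value tracker; the gate's placement is an assumption,
# not the paper's verified design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedSlotTracker(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(dim, dim)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, hidden_states: torch.Tensor, slot_emb: torch.Tensor,
                prev_slot_state: torch.Tensor) -> torch.Tensor:
        # hidden_states: (B, T, D); slot_emb, prev_slot_state: (B, D)
        # Slot-conditioned attention over the utterance's hidden states.
        scores = torch.bmm(hidden_states, self.attn(slot_emb).unsqueeze(2))  # (B, T, 1)
        weights = F.softmax(scores, dim=1)
        summary = (weights * hidden_states).sum(dim=1)                       # (B, D)
        # Gate mixes the new evidence with the previous turn's slot state.
        g = torch.sigmoid(self.gate(torch.cat([summary, prev_slot_state], dim=-1)))
        return g * summary + (1 - g) * prev_slot_state  # updated tracking state

The gating keeps rare slot values from being overwritten by a single noisy turn, which matches the abstract's motivation for tracking state across the dialogue history.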


Author(s): Shaoyong Qu, Weifeng Liu, Jianning Li, Zhangming Peng

2021
Author(s): Mai Hoang Dao, Thinh Hung Truong, Dat Quoc Nguyen

2021
Author(s): Soyeon Caren Han, Siqu Long, Huichun Li, Henry Weld, Josiah Poon
