Neural Graph Matching Networks for Chinese Short Text Matching

Author(s): Lu Chen, Yanbin Zhao, Boer Lyu, Lesheng Jin, Zhi Chen, ...

Author(s): Michelle Guo, Edward Chou, De-An Huang, Shuran Song, Serena Yeung, ...

2015 · Vol 24 (6) · pp. 849-866
Author(s): Fuat Basık, Buğra Gedik, Hakan Ferhatosmanoğlu, Mert Emin Kalender

Author(s): Xiang Ling, Lingfei Wu, Saizhuo Wang, Tengfei Ma, Fangli Xu, ...

2020
Author(s): Binxuan Huang, Han Wang, Tong Wang, Yue Liu, Yang Liu

2021 · pp. 1-13
Author(s): Jiawen Shi, Hong Li, Chiyu Wang, Zhicheng Pang, Jiale Zhou

Short text matching is one of the fundamental technologies in natural language processing. Most previous text matching networks were originally designed for English text; the common approach to applying them to Chinese is to segment each sentence into words and then take those words as input. However, this method often introduces word segmentation errors. Chinese short text matching therefore faces the challenges of constructing effective features and of understanding the semantic relationship between two sentences. In this work, we propose a novel lexicon-based pseudo-siamese model (CL2N) that can fully mine the information expressed in Chinese text. Instead of using a character sequence or a single word sequence, CL2N augments the text representation with multi-granularity information from both characters and lexicons. Additionally, it integrates sentence-level features through single-sentence features as well as interactive features. Experiments on two Chinese text matching datasets show that our model outperforms state-of-the-art short text matching models and that the proposed method alleviates the error propagation problem of Chinese word segmentation. In particular, incorporating single-sentence and interactive features allows the network to capture contextual semantics and co-attentive lexical information, which contributes to our best result.
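The multi-granularity idea above can be illustrated with a toy sketch: score a sentence pair once at character granularity (immune to segmentation errors) and once at lexicon granularity (via a simple forward-maximum-matching segmenter), then combine the two. This is a minimal stand-in for the abstract's character/lexicon channels, not the authors' CL2N; the helper names, the 4-character maximum word length, and the bag-of-features cosine similarity are all assumptions for illustration.

```python
from collections import Counter
import math

def char_ngrams(text, n=1):
    # Character-level features: each character (or n-gram) is a unit,
    # so no word segmentation is needed at this granularity.
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def lexicon_words(text, lexicon):
    # Lexicon-level features via greedy forward maximum matching:
    # try the longest dictionary word first, fall back to a single character.
    words, i = [], 0
    while i < len(text):
        for j in range(min(len(text), i + 4), i, -1):  # assumed max word length: 4
            if text[i:j] in lexicon or j == i + 1:
                words.append(text[i:j])
                i = j
                break
    return words

def cosine(a: Counter, b: Counter) -> float:
    num = sum(a[k] * b[k] for k in a)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def match_score(s1, s2, lexicon, alpha=0.5):
    # Combine character-granularity and lexicon-granularity similarity;
    # alpha weights the two channels (an illustrative choice, not learned).
    char_sim = cosine(Counter(char_ngrams(s1)), Counter(char_ngrams(s2)))
    word_sim = cosine(Counter(lexicon_words(s1, lexicon)), Counter(lexicon_words(s2, lexicon)))
    return alpha * char_sim + (1 - alpha) * word_sim
```

Because the character channel never segments, a segmentation error in the lexicon channel can lower but never zero out the overall score, which mirrors why mixing granularities softens error propagation.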


Mathematics · 2021 · Vol 9 (10) · pp. 1129
Author(s): Shihong Chen, Tianjiao Xu

QA matching is an important task in natural language processing, but current research on text matching focuses more on short texts than on long ones. Compared with short text matching, long text matching is rich in information, but distracting information is also frequent. This paper extracted question-and-answer pairs about psychological counseling to study long-text QA matching based on deep learning. We adjusted DSSM (Deep Structured Semantic Model) to make it suitable for the QA-matching task. Moreover, to better extract long-text features, we also improved DSSM by enriching the text representation layer with a bidirectional neural network and an attention mechanism. The experimental results show that the resulting BiGRU–Dattention–DSSM performs better at matching questions and answers.
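The DSSM matching scheme described above can be sketched at a very high level: two towers map the question and each candidate answer into fixed-size vectors, cosine similarity scores each pair, and a softmax turns the scores into a matching distribution. The sketch below is a deliberately simplified stand-in, assuming a hashed character-trigram "tower" in place of the paper's learned BiGRU/attention layers; the function names and the 64-dimensional vector size are illustrative assumptions.

```python
import math
import zlib

def encode(text, dim=64):
    # Toy DSSM-style tower: hash character trigrams into a fixed-size
    # L2-normalized vector (a stand-in for learned representation layers).
    vec = [0.0] * dim
    for i in range(len(text) - 2):
        vec[zlib.crc32(text[i:i + 3].encode()) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def relevance(question, answers):
    # Score each candidate answer by cosine similarity to the question
    # (dot product of unit vectors), then softmax-normalize the scores.
    q = encode(question)
    sims = [sum(a * b for a, b in zip(q, encode(ans))) for ans in answers]
    exps = [math.exp(s) for s in sims]
    total = sum(exps)
    return [e / total for e in exps]
```

In the real model the `encode` step would be the enriched representation layer (BiGRU plus attention), which is precisely where long-text feature extraction is improved; the similarity-plus-softmax head stays the same.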


Author(s): Pooja Kudi, Amitkumar Manekar, Kavita Daware, Tejaswini Dhatrak
