Combining Topic Information and Structure Information in a Dynamic Language Model

Author(s): Pascal Wiggers, Leon Rothkrantz
Entropy, 2021, Vol. 23 (11), pp. 1536
Author(s): Yiping Yang, Xiaohui Cui

Text classification is a fundamental research direction that aims to assign tags to text units. Graph neural networks (GNNs) have recently exhibited excellent properties for textual information processing, and pre-trained language models have likewise achieved promising results on many tasks. However, many text processing methods either cannot model the structure of an individual text unit or ignore its semantic features. To address these problems and exploit both the structural and the semantic information of a text, we propose a Bert-Enhanced text Graph Neural Network model (BEGNN). For each text, we construct a separate text graph from the co-occurrence relationships of its words and use a GNN to extract text features; in addition, we employ BERT to extract semantic features. The former component captures structural information, while the latter focuses on semantic information. Finally, we let these two feature sets of different granularity interact and aggregate them to obtain a more effective representation. Experiments on standard datasets demonstrate the effectiveness of BEGNN.
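
The abstract describes the architecture only at a high level; a minimal sketch of that pipeline, assuming PyTorch, PyTorch Geometric, and HuggingFace Transformers, might look as follows. The sliding-window graph construction, the gated fusion used for the interaction step, and all names (BEGNNSketch, build_cooccurrence_edges) are illustrative assumptions, not the authors' published design.

```python
# Minimal sketch of the BEGNN idea: a per-text co-occurrence graph encoded
# by a GNN (structural view), BERT for the semantic view, and a simple
# interaction/aggregation step. Hyperparameters and the fusion mechanism
# are assumptions; the abstract does not specify them.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool
from transformers import BertModel, BertTokenizerFast


def build_cooccurrence_edges(token_ids, window=3):
    """Connect tokens that co-occur within a sliding window
    (window size 3 is an assumption)."""
    edges = set()
    for i in range(len(token_ids)):
        for j in range(i + 1, min(i + window, len(token_ids))):
            edges.add((i, j))
            edges.add((j, i))  # undirected graph as two directed edges
    return torch.tensor(sorted(edges), dtype=torch.long).t()  # [2, E]


class BEGNNSketch(nn.Module):
    """Single-text (batch size 1) version, kept simple for clarity."""

    def __init__(self, vocab_size, gnn_dim=128, num_classes=4):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.embed = nn.Embedding(vocab_size, gnn_dim)
        self.gcn1 = GCNConv(gnn_dim, gnn_dim)
        self.gcn2 = GCNConv(gnn_dim, gnn_dim)
        # Project the BERT view into the GNN feature space before fusion.
        self.proj_bert = nn.Linear(self.bert.config.hidden_size, gnn_dim)
        self.classifier = nn.Linear(2 * gnn_dim, num_classes)

    def forward(self, input_ids, attention_mask, edge_index):
        # Semantic view: the [CLS] representation from BERT.
        sem = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state[:, 0]
        sem = self.proj_bert(sem)                          # [1, gnn_dim]

        # Structural view: two GCN layers over the co-occurrence graph,
        # then mean pooling over all token nodes.
        x = self.embed(input_ids.squeeze(0))               # [T, gnn_dim]
        x = torch.relu(self.gcn1(x, edge_index))
        x = torch.relu(self.gcn2(x, edge_index))
        batch = torch.zeros(x.size(0), dtype=torch.long)   # one graph
        struct = global_mean_pool(x, batch)                # [1, gnn_dim]

        # "Interact and aggregate": gated fusion followed by concatenation
        # (one plausible reading of the abstract, not the exact mechanism).
        gate = torch.sigmoid(sem * struct)
        fused = torch.cat([gate * sem, (1 - gate) * struct], dim=-1)
        return self.classifier(fused)


if __name__ == "__main__":
    tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
    enc = tok("graph neural networks meet bert", return_tensors="pt")
    edges = build_cooccurrence_edges(enc["input_ids"][0].tolist())
    model = BEGNNSketch(vocab_size=tok.vocab_size)
    logits = model(enc["input_ids"], enc["attention_mask"], edges)
    print(logits.shape)  # torch.Size([1, 4])
```

Gated fusion is just one simple way to let the two views interact; in practice, batching multiple per-text graphs would use torch_geometric's Batch utilities.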


