Richer syntactic dependencies for structured language modeling

Author(s): C. Chelba, Peng Xu

1994
Author(s): R. Schwartz, L. Nguyen, F. Kubala, G. Chou, G. Zavaliagkos, ...

2019
Author(s): Chang Liu, Zhen Zhang, Pengyuan Zhang, Yonghong Yan

2021
Vol. 11 (12), pp. 5743
Author(s): Pablo Gamallo

This article describes a compositional model based on syntactic dependencies, designed to build contextualized word vectors by following linguistic principles related to selectional preferences. The proposed compositional strategy has been evaluated on a syntactically controlled, multilingual dataset and compared with Transformer BERT-like models such as Sentence BERT, the state of the art in sentence similarity. For this purpose, we created two new test datasets for Portuguese and Spanish, based on the one defined for English, containing expressions with noun-verb-noun transitive constructions. The results show that the linguistics-based compositional approach is competitive with Transformer models.
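For context, the Sentence BERT baseline mentioned above scores sentence similarity as the cosine similarity between pooled sentence embeddings. Below is a minimal sketch of such a baseline using the sentence-transformers library; the model checkpoint and the sentence pairs are illustrative assumptions, not the evaluation setup used in the article.

```python
# Minimal sketch of a Sentence BERT similarity baseline (illustrative only;
# the checkpoint and sentence pairs are assumptions, not the article's setup).
from sentence_transformers import SentenceTransformer, util

# A widely used multilingual Sentence BERT checkpoint (assumed choice).
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Example noun-verb-noun transitive constructions (invented for illustration).
pairs = [
    ("The committee approved the proposal.", "The board accepted the plan."),
    ("The dog chased the cat.", "The committee approved the proposal."),
]

for s1, s2 in pairs:
    # Encode each sentence into a fixed-size embedding.
    emb1 = model.encode(s1, convert_to_tensor=True)
    emb2 = model.encode(s2, convert_to_tensor=True)
    # Cosine similarity between the two sentence embeddings.
    score = util.cos_sim(emb1, emb2).item()
    print(f"{score:.3f}\t{s1} | {s2}")
```

A compositional, dependency-based model would instead derive the sentence (or phrase) representation from the vectors of its head and dependents; the sketch above only illustrates the Transformer baseline against which such an approach is compared.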

