Deep Contextualized Word Embeddings in Transition-Based and Graph-Based Dependency Parsing - A Tale of Two Parsers Revisited

Author(s):  
Artur Kulmizev ◽  
Miryam de Lhoneux ◽  
Johannes Gontrum ◽  
Elena Fano ◽  
Joakim Nivre

Author(s):  
Ke Yan ◽  
Jie Chen ◽  
Wenhao Zhu ◽  
Xin Jin ◽  
Guannan Hu

Author(s):  
Yijia Liu ◽  
Wanxiang Che ◽  
Yuxuan Wang ◽  
Bo Zheng ◽  
Bing Qin ◽  
...  

2017 ◽  
Author(s):  
Jenna Kanerva ◽  
Juhani Luotolahti ◽  
Filip Ginter

2016 ◽  
Vol 4 ◽  
pp. 445-461 ◽  
Author(s):  
Eliyahu Kiperwasser ◽  
Yoav Goldberg

We suggest a compositional vector representation of parse trees that relies on a recursive combination of recurrent-neural network encoders. To demonstrate its effectiveness, we use the representation as the backbone of a greedy, bottom-up dependency parser, achieving very strong accuracies for English and Chinese, without relying on external word embeddings. The parser’s implementation is available for download at the first author’s webpage.
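To make the idea concrete, the sketch below shows one possible way such a recursive composition of recurrent encoders could look in PyTorch: each node's vector is built by running an LSTM over the (already composed) vectors of its children together with the head word's embedding, bottom-up over the tree. This is a minimal illustrative sketch, not the authors' released implementation; all names and dimensions (TreeNode, RecursiveTreeEncoder, emb_dim, hid_dim, vocab_size) are assumptions made here for the example.

```python
# Illustrative sketch of a recursive tree composition with an LSTM encoder.
# Not the authors' code; class names, dimensions, and vocabulary size are
# assumptions for the example only.

import torch
import torch.nn as nn


class TreeNode:
    """A dependency-tree node: a word index plus left/right dependents."""
    def __init__(self, word_id, left=None, right=None):
        self.word_id = word_id
        self.left = left or []    # dependents to the left of the head
        self.right = right or []  # dependents to the right of the head


class RecursiveTreeEncoder(nn.Module):
    """Composes a subtree vector with a shared LSTM encoder, bottom-up."""
    def __init__(self, vocab_size=10_000, emb_dim=100, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.proj = nn.Linear(emb_dim, hid_dim)   # map word embedding into node space
        self.encoder = nn.LSTM(hid_dim, hid_dim, batch_first=True)

    def compose(self, node: TreeNode) -> torch.Tensor:
        # Recursively encode the children first (bottom-up composition).
        child_vecs = [self.compose(c) for c in node.left]
        head_vec = self.proj(self.embed(torch.tensor([node.word_id])))
        child_vecs.append(head_vec)
        child_vecs += [self.compose(c) for c in node.right]

        # Run the LSTM over the child/head sequence; the final hidden state
        # serves as the composed representation of the whole subtree.
        seq = torch.stack(child_vecs, dim=1)       # shape (1, len, hid_dim)
        _, (h_n, _) = self.encoder(seq)
        return h_n[-1]                             # shape (1, hid_dim)


if __name__ == "__main__":
    # Tiny example: head word 5 with one left dependent and one right dependent.
    tree = TreeNode(5, left=[TreeNode(2)], right=[TreeNode(7)])
    encoder = RecursiveTreeEncoder()
    vec = encoder.compose(tree)
    print(vec.shape)  # torch.Size([1, 128])
```

In a greedy bottom-up parser of the kind the abstract describes, such subtree vectors would be recomputed (or updated incrementally) as attachments are made, and used to score the next attachment decision; the scoring component is omitted here for brevity.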


Author(s):  
Qinyuan Xiang ◽  
Weijiang Li ◽  
Hui Deng ◽  
Feng Wang
