zero pronoun
Recently Published Documents


TOTAL DOCUMENTS

42
(FIVE YEARS 14)

H-INDEX

6
(FIVE YEARS 1)

2022 ◽  
Vol 224 ◽  
pp. 105050
Author(s):  
Shulin Zhang ◽  
Jixing Li ◽  
Yiming Yang ◽  
John Hale

2021 ◽  
Author(s):  
Shulin Zhang ◽  
Jixing Li ◽  
Yiming Yang ◽  
John Hale

Chinese is one of many languages that can drop subjects. We report an fMRI study of language comprehension in these "zero pronoun" cases. The fMRI data come from Chinese speakers who listened to an audiobook. We conducted both univariate GLM and multivariate pattern analysis (MVPA) on these data, time-locked to each verb with a zero-pronoun subject. We found increased left middle temporal gyrus activity for zero pronouns compared to overt subjects, suggesting additional effort in searching for an antecedent during zero pronoun resolution. MVPA further revealed that the intended referent of a zero pronoun appears to be neurally represented in the Precuneus and the Parahippocampal Gyrus shortly after its presentation. This highlights the role of memory and discourse-level processing in resolving referential expressions, including unspoken ones, in naturalistic language comprehension.
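The MVPA approach the abstract describes amounts to training a classifier on voxel activity patterns and testing whether the referent of each zero pronoun can be decoded from held-out patterns. The sketch below illustrates the general idea only, on synthetic data with a simple cross-validated nearest-centroid decoder; it is not the authors' pipeline, and all array shapes and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for fMRI data: one voxel-activity pattern per
# zero-pronoun event, labeled by the referent it resolves to.
# (Shapes and noise levels are illustrative, not from the paper.)
n_events, n_voxels, n_referents = 60, 50, 3
labels = rng.integers(0, n_referents, size=n_events)
prototypes = rng.normal(size=(n_referents, n_voxels))      # per-referent pattern
patterns = prototypes[labels] + rng.normal(size=(n_events, n_voxels))  # add noise

def nearest_centroid_cv(X, y, n_folds=5):
    """Cross-validated decoding accuracy with a nearest-centroid classifier."""
    folds = np.arange(len(y)) % n_folds
    classes = np.unique(y)
    correct = 0
    for f in range(n_folds):
        train, test = folds != f, folds == f
        # Class centroid = mean training pattern for each referent.
        centroids = np.stack([X[train][y[train] == c].mean(axis=0)
                              for c in classes])
        # Assign each held-out pattern to the closest centroid.
        d = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
        correct += int((classes[d.argmin(axis=1)] == y[test]).sum())
    return correct / len(y)

acc = nearest_centroid_cv(patterns, labels)
```

Above-chance accuracy (here, above 1/3) on held-out events is the kind of evidence that would indicate the referent's identity is recoverable from the activity pattern.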


2021 ◽  
Author(s):  
Shisong Chen ◽  
Binbin Gu ◽  
Jianfeng Qu ◽  
Zhixu Li ◽  
An Liu ◽  
...  

2021 ◽  
Author(s):  
Ryokan Ri ◽  
Toshiaki Nakazawa ◽  
Yoshimasa Tsuruoka

2021 ◽  
Author(s):  
Ryuto Konno ◽  
Shun Kiyono ◽  
Yuichiroh Matsubayashi ◽  
Hiroki Ouchi ◽  
Kentaro Inui

2020 ◽  
Vol 34 (05) ◽  
pp. 8352-8359
Author(s):  
Peiqin Lin ◽  
Meng Yang

Recent neural network methods for Chinese zero pronoun resolution have not taken bidirectional attention between zero pronouns and candidate antecedents into consideration, and have simply treated the task as classification, ignoring the relationships among the different candidates for a zero pronoun. To address these problems, we propose a Hierarchical Attention Network with Pairwise Loss (HAN-PL) for Chinese zero pronoun resolution. In HAN-PL, we design a two-layer attention model to generate more powerful representations for zero pronouns and candidate antecedents. Furthermore, we propose a novel pairwise loss that introduces a correct-antecedent similarity constraint and a pairwise-margin loss, making the learned model more discriminative. Extensive experiments on the OntoNotes 5.0 dataset show that our model achieves state-of-the-art performance on Chinese zero pronoun resolution.
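The pairwise-margin component of such a loss can be sketched independently of the full HAN-PL model: given a zero pronoun's scores over its candidate antecedents, the correct antecedent's score is pushed to exceed every competitor's by at least a margin. This is a generic hinge-style formulation under assumed inputs, not the paper's exact loss; the function name and margin value are illustrative.

```python
import numpy as np

def pairwise_margin_loss(scores, correct, margin=0.5):
    """Hinge-style pairwise loss: the correct antecedent's score should
    exceed every competing candidate's score by at least `margin`.
    Illustrative formulation, not HAN-PL's exact objective."""
    scores = np.asarray(scores, dtype=float)
    # One hinge term per (correct, competitor) pair.
    diffs = margin - (scores[correct] - np.delete(scores, correct))
    return float(np.maximum(0.0, diffs).sum())

# Correct candidate already beats every competitor by >= margin: zero loss.
loss_ok = pairwise_margin_loss([2.0, 0.1, -0.3], correct=0, margin=0.5)
# A competitor scoring within the margin of the correct one incurs a penalty.
loss_bad = pairwise_margin_loss([2.0, 1.8, -0.3], correct=0, margin=0.5)
```

Unlike a per-candidate classification loss, this objective directly compares candidates against one another, which is the relational signal the abstract argues plain classification ignores.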


Author(s):  
Qingyu Yin ◽  
Weinan Zhang ◽  
Yu Zhang ◽  
Ting Liu
