Association and not semantic relationships elicit the N400 effect: Electrophysiological evidence from an explicit language comprehension task

2007 ◽  
Vol 0 (0) ◽  
pp. 070914092401002-??? ◽  
Author(s):  
Sinéad M. Rhodes ◽  
David I. Donaldson

2018 ◽  
Vol 30 (1) ◽  
pp. 3-19 ◽  
Author(s):  
Olessia Jouravlev ◽  
Rachael Schwartz ◽  
Dima Ayyash ◽  
Zachary Mineroff ◽  
Edward Gibson ◽  
...  

When we receive information in the presence of other people, are we sensitive to what they do or do not understand? In two event-related-potential experiments, participants read implausible sentences (e.g., “The girl had a little beak”) in contexts that rendered them plausible (e.g., “The girl dressed up as a canary for Halloween”). No semantic-processing difficulty (no N400 effect) ensued when they read the sentences while alone in the room. However, when a confederate was present who did not receive the contexts so that the critical sentences were implausible for him or her, participants exhibited processing difficulty: the social-N400 effect. This effect was obtained when participants were instructed to adopt the confederate’s perspective—and critically, even without such instructions—but not when performing a demanding comprehension task. Thus, unless mental resources are limited, comprehenders engage in modeling the minds not only of those individuals with whom they directly interact but also of those individuals who are merely present during the linguistic exchange.


Author(s):  
Yuanxing Zhang ◽  
Yangbin Zhang ◽  
Kaigui Bian ◽  
Xiaoming Li

Machine reading comprehension has gained attention from both industry and academia. It is a challenging task that involves various domains such as language comprehension, knowledge inference, and summarization. Previous studies mainly focus on reading comprehension over short paragraphs, and these approaches fail to perform well on long documents. In this paper, we propose a hierarchical match attention model that instructs the machine to extract answers from a specific short span of a passage for the long-document reading comprehension (LDRC) task. The model takes advantage of a hierarchical LSTM to learn paragraph-level representations and implements a match mechanism (i.e., quantifying the relationship between two contexts) to find the most appropriate paragraph, namely the one that contains the hint of the answer. The task can then be decoupled into a reading comprehension task over a short paragraph, from which the answer can be produced. Experiments on the modified SQuAD dataset show that our proposed model outperforms existing reading comprehension models by at least 20% in exact match (EM), F1, and the proportion of identified paragraphs that are exactly the short paragraphs where the original answers are located.
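The two-stage pipeline the abstract describes — score each paragraph against the question with a match mechanism, then run ordinary short-paragraph reading comprehension on the winner — can be sketched as follows. This is an illustrative sketch, not the authors' model: the embedding inputs, the attention-based cosine match score, and the function names are all assumptions standing in for the paper's hierarchical match attention.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for attention weights.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def match_score(question, paragraph):
    """Attention-style match between two contexts (hypothetical scorer).

    Each question vector attends over the paragraph's word vectors; the
    score is the mean cosine similarity between question vectors and
    their attended paragraph summaries.
    question:  (q, d) array of question word embeddings
    paragraph: (p, d) array of paragraph word embeddings
    """
    sim = question @ paragraph.T          # (q, p) dot-product similarities
    attn = softmax(sim, axis=1)           # question-to-paragraph attention
    aligned = attn @ paragraph            # (q, d) attended paragraph summaries
    num = (question * aligned).sum(axis=1)
    den = (np.linalg.norm(question, axis=1) *
           np.linalg.norm(aligned, axis=1))
    return float((num / den).mean())

def select_paragraph(question, paragraphs):
    """Return the index of the best-matching paragraph, so answer
    extraction can run on that short paragraph alone."""
    return int(np.argmax([match_score(question, p) for p in paragraphs]))
```

Once a paragraph is selected this way, any off-the-shelf short-paragraph reader (e.g., a SQuAD-style span extractor) can produce the final answer, which is the decoupling the abstract refers to.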


2012 ◽  
Vol 2012 ◽  
pp. 1-8 ◽  
Author(s):  
Christina Tremblay ◽  
Oury Monchi ◽  
Carol Hudon ◽  
Joël Macoir ◽  
Laura Monetta

Depression in Parkinson's disease (PD) is frequently associated with executive deficits, which can influence nonliteral comprehension and lexical access. This study explores whether depressive symptoms in PD modulate verbal fluency and nonliteral language comprehension. Twelve individuals with PD without depressive symptoms, 13 individuals with PD and depressive symptoms (PDDS), and 13 healthy controls completed semantic and phonemic verbal fluency tasks and an indirect speech acts comprehension task. All groups performed similarly on the phonemic fluency task, whereas the PDDS group was impaired on the semantic task. On the indirect speech act comprehension task, no difference was observed between the groups; however, the PDDS group had difficulty answering direct speech act questions. Because some language impairments in PD become apparent only when depressive symptoms accompany the disease, it appears important to take the presence of depressive symptoms into account when evaluating language abilities in PD.


2007 ◽  
Vol 29 (1) ◽  
pp. 46-56 ◽  
Author(s):  
Conny F. Schmidt ◽  
Tino Zaehle ◽  
Martin Meyer ◽  
Eveline Geiser ◽  
Peter Boesiger ◽  
...  

2014 ◽  
Vol 22 (1) ◽  
pp. 128-134 ◽  
Author(s):  
Shirley-Ann Rueschemeyer ◽  
Tom Gardner ◽  
Cat Stoner

2010 ◽  
Vol 22 (6) ◽  
pp. 1165-1178 ◽  
Author(s):  
Ian FitzPatrick ◽  
Peter Indefrey

Electrophysiological studies consistently find N400 effects of semantic incongruity in nonnative (L2) language comprehension. These N400 effects are often delayed compared with native (L1) comprehension, suggesting that semantic integration in one's second language occurs later than in one's first language. In this study, we investigated whether such a delay could be attributed to (1) intralingual lexical competition and/or (2) interlingual lexical competition. We recorded EEG from Dutch–English bilinguals who listened to English (L2) sentences in which the sentence-final word was (a) semantically fitting, (b) semantically incongruent, or semantically incongruent but initially congruent due to sharing initial phonemes with (c) the most probable sentence completion within the L2 or (d) the L1 translation equivalent of the most probable sentence completion. We found an N400 effect in each of the semantically incongruent conditions. This N400 effect was significantly delayed for L2 words but not for L1 translation equivalents that were initially congruent with the sentence context. Taken together, these findings firstly demonstrate that semantic integration in nonnative listening can start on the basis of word-initial phonemes (i.e., before a single lexical candidate could have been selected from the input) and secondly suggest that spuriously elicited L1 lexical candidates are not available for semantic integration in L2 speech comprehension.


2003 ◽  
Vol 16 (4-5) ◽  
pp. 407-416 ◽  
Author(s):  
I.A Malogiannis ◽  
C Valaki ◽  
N Smyrnis ◽  
M Papathanasiou ◽  
I Evdokimidis ◽  
...  

2013 ◽  
Vol 41 (2) ◽  
pp. 462-471 ◽  
Author(s):  
CARMEN STANFIELD ◽  
REBECCA WILLIAMSON ◽  
ŞEYDA ÖZÇALIŞKAN

Children understand gesture+speech combinations in which a deictic gesture adds new information to the accompanying speech by age 1;6 (Morford & Goldin-Meadow, 1992; e.g., ‘push’+point at ball). This study explores how early children understand gesture+speech combinations in which an iconic gesture conveys additional information not found in the accompanying speech (e.g., ‘read’+BOOK gesture). Our analysis of two- to four-year-old children's responses in a gesture+speech comprehension task showed that children grasp the meaning of iconic co-speech gestures by age three and continue to improve their understanding with age. Overall, our study highlights the important role gesture plays in language comprehension as children learn to unpack increasingly complex communications addressed to them at early ages.

