Who was the agent? The neural correlates of reanalysis processes during sentence comprehension

2011 · Vol 32 (11) · pp. 1775-1787
Author(s): Masako Hirotani, Michiru Makuuchi, Shirley-Ann Rüschemeyer, Angela D. Friederici
Author(s): Ching-Fen Hsu

This study examined causal inference and semantic priming in people with Williams syndrome (WS). Previous studies reported that people with WS show deviant sentence comprehension despite relatively strong lexical semantics. Using neuroimaging techniques, this study investigated impairments in connecting words within the semantic network, in order to reveal the neurological deficits underlying contextual integration in people with WS. Four types of word pairs were presented: causal, categorical, associative, and functional. Behavioural results revealed that causal word pairs required heavier cognitive processing than functional word pairs. Distinct neural correlates of semantic priming confirmed atypical semantic linkage and suggested a possible cause of the impairment in contextual integration in people with WS. The finding of normal behaviour alongside atypical neural correlates in people with WS provides evidence of atypical development resulting from early gene mutations.


2007 · Vol 19 (4) · pp. 594-604
Author(s): Claudia K. Friedrich, Sonja A. Kotz

It is still a matter of debate whether the initial analysis of speech is independent of contextual influences or whether meaning can modulate word activation directly. Utilizing event-related brain potentials (ERPs), we tested the neural correlates of speech recognition by presenting sentences that ended with incomplete words, such as "To light up the dark she needed her can-". Immediately following the incomplete words, subjects saw visual words that (i) matched form and meaning, such as "candle"; (ii) matched meaning but not form, such as "lantern"; (iii) matched form but not meaning, such as "candy"; or (iv) mismatched form and meaning, such as "number". We report ERP evidence for two distinct cohorts of lexical tokens: (a) a left-lateralized effect, the P250, differentiates form-matching words (i, iii) from form-mismatching words (ii, iv); (b) a right-lateralized effect, the P220, differentiates words that match in form and/or meaning (i, ii, iii) from mismatching words (iv). Lastly, fully matching words (i) reduce the amplitude of the N400. These results accommodate both bottom-up and top-down accounts of human speech recognition. They suggest that neural representations of form and meaning are activated independently early on and are integrated at a later stage during sentence comprehension.


2019 · Vol 134 · pp. 110-121
Author(s): Anna Strotseva-Feinschmidt, Christine S. Schipke, Thomas C. Gunter, Jens Brauer, Angela D. Friederici

2018 · Vol 12
Author(s): Shannon Sheppard, Kevin Kim, Lynsey Keator, Bonnie Breining, Donna Tippett, ...

Author(s): Margreet Vogelzang, Christiane M. Thiel, Stephanie Rosemann, Jochem W. Rieger, Esther Ruigendijk

Purpose: Adults with mild-to-moderate age-related hearing loss typically have difficulty understanding speech, but their processing of syntactically complex sentences is not well understood. We test the hypothesis that the difficulties these listeners show in comprehending and processing syntactically complex sentences arise because processing degraded input interferes with the successful processing of complex sentences.
Method: We performed a neuroimaging study with a sentence comprehension task, varying sentence complexity (through subject–object order and verb–argument order) and cognitive demand (presence or absence of a secondary task) within subjects. Groups of older subjects with hearing loss (n = 20) and age-matched normal-hearing controls (n = 20) were tested.
Results: The comprehension data show effects of syntactic complexity and hearing ability, with normal-hearing controls outperforming listeners with hearing loss, seemingly more so on syntactically complex sentences. The secondary task did not influence off-line comprehension. The imaging data show effects of group, sentence complexity, and task, with listeners with hearing loss showing decreased activation in typical speech processing areas, such as the inferior frontal gyrus and superior temporal gyrus. No interactions between group, sentence complexity, and task were found in the neuroimaging data.
Conclusions: The results suggest that listeners with hearing loss process speech differently from their normal-hearing peers, possibly due to the increased demands of processing degraded auditory input. Increasing cognitive demand by means of a secondary visual shape processing task influenced neural sentence processing, but no evidence was found that it did so differently for listeners with hearing loss and normal-hearing listeners.


2016 · Vol 21 (1) · pp. 33-43
Author(s): Sofia Ribeirinho Leite, Cory David Barker, Marc G. Lucas
