Task Effects in Online Sentence Comprehension

2012 ◽  
Author(s):  
David Caplan ◽  
Will Evans
2006 ◽  
Vol 99 (1-2) ◽  
pp. 116-117 ◽  
Author(s):  
Christos Salis ◽  
Susan Edwards

2018 ◽  
Vol 40 (1) ◽  
pp. 3-27 ◽  
Author(s):  
EDITH KAAN ◽  
CORINNE FUTCH ◽  
RAQUEL FERNÁNDEZ FUERTES ◽  
SONJA MUJCINOVIC ◽  
ESTHER ÁLVAREZ DE LA FUENTE

Abstract
Previous research suggests that native speakers quickly adapt to the properties of the language in the surrounding context. For instance, as they repeatedly read a structure that is initially nonpreferred or infrequent, they show a reduction of processing difficulty. Adaptation has been accounted for in terms of error-based learning: the error resulting from the difference between the expected and actual input leads to an adjustment of the knowledge representation, which changes future expectations. The present study tested whether experiencing an error is sufficient for adaptation. We compared native English speakers and second language (L2) learners’ processing of, and adaptation to, two types of temporarily ambiguous structures that were resolved toward the nonpreferred interpretation. Whereas both native English and L2 speakers showed increased reading times at the disambiguating word versus a nonambiguous control, our data suggest that only native English speakers adapted, and only to one of the two structures. These results suggest that experiencing an error is not sufficient for adaptation, and that factors such as ease of revision and task effects may play a role as well.


Author(s):  
Margreet Vogelzang ◽  
Christiane M. Thiel ◽  
Stephanie Rosemann ◽  
Jochem W. Rieger ◽  
Esther Ruigendijk

Purpose Adults with mild-to-moderate age-related hearing loss typically exhibit issues with speech understanding, but their processing of syntactically complex sentences is not well understood. We test the hypothesis that the difficulties listeners with hearing loss have in comprehending and processing syntactically complex sentences are due to the processing of degraded input interfering with the successful processing of complex sentences. Method We performed a neuroimaging study with a sentence comprehension task, varying sentence complexity (through subject–object order and verb–argument order) and cognitive demands (presence or absence of a secondary task) within subjects. Groups of older subjects with hearing loss (n = 20) and age-matched normal-hearing controls (n = 20) were tested. Results The comprehension data show effects of syntactic complexity and hearing ability, with normal-hearing controls outperforming listeners with hearing loss, seemingly more so on syntactically complex sentences. The secondary task did not influence off-line comprehension. The imaging data show effects of group, sentence complexity, and task, with listeners with hearing loss showing decreased activation in typical speech processing areas, such as the inferior frontal gyrus and superior temporal gyrus. No interactions between group, sentence complexity, and task were found in the neuroimaging data. Conclusions The results suggest that listeners with hearing loss process speech differently from their normal-hearing peers, possibly due to the increased demands of processing degraded auditory input. Increased cognitive demands, by means of a secondary visual shape processing task, influenced neural sentence processing, but no evidence was found that they did so differently for listeners with hearing loss and normal-hearing listeners.


2019 ◽  
Vol 62 (12) ◽  
pp. 4417-4432 ◽  
Author(s):  
Carola de Beer ◽  
Jan P. de Ruiter ◽  
Martina Hielscher-Fastabend ◽  
Katharina Hogrefe

Purpose People with aphasia (PWA) use different kinds of gesture spontaneously when they communicate. Although there is evidence that the nature of the communicative task influences the linguistic performance of PWA, so far little is known about the influence of the communicative task on their production of gestures. We aimed to investigate the influence of varying communicative constraints on the production of gesture and spoken expression by PWA in comparison to persons without language impairment. Method Twenty-six PWA with varying aphasia severities and 26 control participants (CP) without language impairment participated in the study. Spoken expression and gesture production were investigated in 2 different tasks: (a) spontaneous conversation about topics of daily living and (b) a cartoon narration task, that is, retellings of short cartoon clips. The frequencies of words and gestures as well as of different gesture types produced by the participants were analyzed and tested for potential effects of group and task. Results Main results for task effects revealed that PWA and CP used more iconic gestures and pantomimes in the cartoon narration task than in spontaneous conversation. Metaphoric gestures, deictic gestures, number gestures, and emblems were more frequently used in spontaneous conversation than in cartoon narrations by both participant groups. Group effects show that, in both tasks, PWA's gesture-to-word ratios were higher than those of the CP. Furthermore, PWA produced more interactive gestures than the CP in both tasks, as well as more number gestures and pantomimes in spontaneous conversation. Conclusions The current results suggest that PWA use gestures to compensate for their verbal limitations under varying communicative constraints. The properties of the communicative task influence the use of different gesture types in people with and without aphasia. Thus, the influence of communicative constraints needs to be considered when assessing PWA's multimodal communicative abilities.


2019 ◽  
Vol 13 (3) ◽  
pp. 314-321 ◽  
Author(s):  
Hansika Kapoor ◽  
Azizuddin Khan
2011 ◽  
Author(s):  
K. Takahashi ◽  
N. Maionchi-Pino ◽  
A. Magnan ◽  
R. Kawashima

1992 ◽  
Author(s):  
Paul Whitney ◽  
Douglas A. Waring ◽  
Desiree Hewitt
