Linguistic syncopation: Meter-syntax alignment affects sentence comprehension and sensorimotor synchronization

Cognition, 2021, Vol. 217, Article 104880
Author(s): Courtney B. Hilton, Micah B. Goldwater

The hierarchical organization of speech rhythm into meter is thought to confer cognitive affordances for perception, memory, and motor coordination. Meter aligns with phrasal structure in systematic ways. In this paper, we show that this alignment affects the robustness of syntactic comprehension and discuss possible underlying mechanisms. In two experiments, we manipulated meter-syntax alignment while sentences with relative clause structures were either read as text (experiment 1, n = 40) or listened to as speech (experiment 2, n = 40). In experiment 2, we also measured the stability with which participants could tap in time with the metrical accents in the sentences they were comprehending. When syntactic cues clashed with the metrical context, participants made more comprehension errors and their sensorimotor synchronization was disrupted. We suggest that this reflects a tight coordination of top-down linguistic knowledge with the sensorimotor system to optimize comprehension.
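As an illustration of how tapping stability of this kind is often quantified, the sketch below uses circular statistics: each tap is converted to a phase relative to the local accent period, and stability is summarized as the resultant vector length (0 = unstable, 1 = perfectly stable). The specific measure, function name, and timing values are illustrative assumptions, not the analysis reported in the paper.

```python
import numpy as np

def tapping_stability(tap_times, accent_times):
    """Estimate sensorimotor synchronization stability as the resultant
    vector length of tap-to-accent relative phases (0 = unstable, 1 = stable).
    Hypothetical measure for illustration; not the paper's reported analysis.
    """
    tap_times = np.asarray(tap_times, dtype=float)
    accent_times = np.asarray(accent_times, dtype=float)
    # Inter-accent intervals, used to normalize each tap's asynchrony.
    intervals = np.diff(accent_times)
    # Match each tap to the nearest preceding accent.
    idx = np.clip(np.searchsorted(accent_times, tap_times) - 1, 0, len(intervals) - 1)
    asynchrony = tap_times - accent_times[idx]
    # Convert asynchronies to phase angles relative to the local accent period.
    phases = 2 * np.pi * asynchrony / intervals[idx]
    # Resultant vector length (circular concentration) summarizes stability.
    return float(np.abs(np.mean(np.exp(1j * phases))))

# Example: taps closely tracking accents every 0.5 s yield stability near 1.
accents = np.arange(0, 5, 0.5)
taps = accents + np.random.normal(0, 0.02, size=accents.size)
print(tapping_stability(taps, accents))
```

A measure like this would be expected to drop when synchronization is disrupted, for example in the misaligned meter-syntax conditions described above.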


Author(s): Margreet Vogelzang, Christiane M. Thiel, Stephanie Rosemann, Jochem W. Rieger, Esther Ruigendijk

Purpose: Adults with mild-to-moderate age-related hearing loss typically exhibit issues with speech understanding, but their processing of syntactically complex sentences is not well understood. We test the hypothesis that the difficulties listeners with hearing loss have in comprehending and processing syntactically complex sentences arise because the processing of degraded input interferes with the successful processing of complex sentences.
Method: We performed a neuroimaging study with a sentence comprehension task, varying sentence complexity (through subject–object order and verb–argument order) and cognitive demands (presence or absence of a secondary task) within subjects. Groups of older subjects with hearing loss (n = 20) and age-matched normal-hearing controls (n = 20) were tested.
Results: The comprehension data show effects of syntactic complexity and hearing ability, with normal-hearing controls outperforming listeners with hearing loss, seemingly more so on syntactically complex sentences. The secondary task did not influence off-line comprehension. The imaging data show effects of group, sentence complexity, and task, with listeners with hearing loss showing decreased activation in typical speech processing areas, such as the inferior frontal gyrus and superior temporal gyrus. No interactions between group, sentence complexity, and task were found in the neuroimaging data.
Conclusions: The results suggest that listeners with hearing loss process speech differently from their normal-hearing peers, possibly due to the increased demands of processing degraded auditory input. Increased cognitive demands by means of a secondary visual shape processing task influence neural sentence processing, but no evidence was found that they do so differently for listeners with hearing loss and normal-hearing listeners.


2011
Author(s): K. Takahashi, N. Maionchi-Pino, A. Magnan, R. Kawashima
