Neuroplasticity of language in left-hemisphere stroke: evidence linking subsecond electrophysiology and structural connections

2016
Author(s): Vitória Piai, Lars Meyer, Nina F. Dronkers, Robert T. Knight

Abstract: Our understanding of neuroplasticity following stroke is predominantly based on neuroimaging measures that cannot address the subsecond neurodynamics of impaired language processing. We combined behavioral and electrophysiological measures and structural-connectivity estimates to characterize neuroplasticity underlying successful compensation of language abilities after left-hemispheric stroke. We recorded the electroencephalogram from patients with stroke lesions to the left temporal lobe and matched controls during context-driven word retrieval. Participants heard lead-in sentences that either constrained the final word (“He locked the door with the”) or not (“She walked in here with the”). The last word was shown as a picture to be named. We conducted individual-participant analyses and focused on oscillatory power as a subsecond indicator of a brain region's functional neurophysiological computations. All participants named pictures faster following constrained than unconstrained sentences, except for two patients, who had extensive damage to the left temporal lobe. Left-lateralized alpha-beta oscillatory power decreased in controls pre-picture presentation for constrained relative to unconstrained contexts. In patients, the alpha-beta power decreases were observed with the same time course as in controls but were lateralized to the intact right hemisphere. The right lateralization depended on the probability of white-matter connections between the bilateral temporal lobes. The two patients who performed poorly behaviorally showed no alpha-beta power decreases. Our findings suggest that incorporating direct measures of neural activity into investigations of neuroplasticity can provide important neural markers to help predict language recovery, assess the progress of neurorehabilitation, and delineate targets for therapeutic neuromodulation.

Neurocase, 2010, Vol 16 (4), pp. 317-320
Author(s): Nobusada Shinoura, Toshiyuki Onodera, Kotoyo Kurokawa, Masanobu Tsukada, Ryozi Yamada, ...

2011, Vol 91 (4), pp. 1357-1392
Author(s): Angela D. Friederici

Language processing is a trait of the human species. Knowledge about its neurobiological basis has increased considerably over the past decades. Different brain regions in the left and right hemisphere have been identified that support particular language functions. Networks involving the temporal cortex and the inferior frontal cortex, with a clear left lateralization, were shown to support syntactic processes, whereas less lateralized temporo-frontal networks subserve semantic processes. These networks have been substantiated by both functional and structural connectivity data. Electrophysiological measures indicate that, within these networks, syntactic processes of local structure building precede the assignment of grammatical and semantic relations in a sentence. Suprasegmental prosodic information, overtly available in the acoustic language input, is processed predominantly in a temporo-frontal network in the right hemisphere associated with a clear electrophysiological marker. Studies with patients suffering from lesions of the corpus callosum reveal that the posterior portion of this structure plays a crucial role in the interaction of syntactic and prosodic information during language processing.


2021
Author(s): Greta Tuckute, Alexander Paunov, Hope Kean, Hannah Small, Zachary Mineroff, ...

High-level language processing is supported by a left-lateralized fronto-temporal brain network. How this network emerges ontogenetically remains debated. Given that frontal cortex in general exhibits protracted development, frontal language areas presumably emerge later and/or mature more slowly than temporal language areas. But are temporal areas necessary for the development of the language areas in the frontal lobe, or do frontal language areas instead emerge independently? We shed light on this question through a case study of an individual (EG) born without a left temporal lobe. We use fMRI methods that have previously been extensively validated for their ability to elicit robust language responses at the individual-subject level. As expected in cases of early left hemisphere (LH) damage, we find that EG has a fully functional language network in her right hemisphere (RH) and performs within the normal range on standardized language assessments. However, her RH frontal language areas have no corresponding LH homotopic areas: no reliable response to language is detected on the lateral surface of EG's left frontal lobe. In contrast, another network implicated in high-level cognition, the domain-general multiple demand (MD) network, is robustly present in both right and left frontal lobes, suggesting that EG's left frontal cortex is capable of supporting non-linguistic cognitive functions. The existence of temporal language areas therefore appears to be a prerequisite for the emergence of the language areas in the frontal lobe.


2020, Vol 32 (6), pp. 1092-1103
Author(s): Dan Kennedy-Higgins, Joseph T. Devlin, Helen E. Nuttall, Patti Adank

Successful perception of speech in everyday listening conditions requires effective listening strategies to overcome common acoustic distortions, such as background noise. Convergent evidence from neuroimaging and clinical studies identify activation within the temporal lobes as key to successful speech perception. However, current neurobiological models disagree on whether the left temporal lobe is sufficient for successful speech perception or whether bilateral processing is required. We addressed this issue using TMS to selectively disrupt processing in either the left or right superior temporal gyrus (STG) of healthy participants to test whether the left temporal lobe is sufficient or whether both left and right STG are essential. Participants repeated keywords from sentences presented in background noise in a speech reception threshold task while receiving online repetitive TMS separately to the left STG, right STG, or vertex or while receiving no TMS. Results show an equal drop in performance following application of TMS to either left or right STG during the task. A separate group of participants performed a visual discrimination threshold task to control for the confounding side effects of TMS. Results show no effect of TMS on the control task, supporting the notion that the results of Experiment 1 can be attributed to modulation of cortical functioning in STG rather than to side effects associated with online TMS. These results indicate that successful speech perception in everyday listening conditions requires both left and right STG and thus have ramifications for our understanding of the neural organization of spoken language processing.


2007, Vol 19 (7), pp. 1193-1205
Author(s): Elisabet Service, Päivi Helenius, Sini Maury, Riitta Salmelin

Electrophysiological methods have been used to study the temporal sequence of syntactic and semantic processing during sentence comprehension. Two responses associated with syntactic violations are the left anterior negativity (LAN) and the P600. A response to semantic violation is the N400. Although the sources of the N400 response have been identified in the left (and right) temporal lobe, the neural signatures of the LAN and P600 have not been revealed. The present study used magnetoencephalography to localize sources of syntactic and semantic activation in Finnish sentence reading. Participants were presented with sentences that ended in normally inflected nouns, nouns in an unacceptable case, verbs instead of nouns, or nouns that were correctly inflected but made no sense in the context. Around 400 msec, semantically anomalous last words evoked strong activation in the left superior temporal lobe, with significant activation also for word class errors (N400). Weaker activation was seen for the semantic errors in the right hemisphere. Later, 600-800 msec after word onset, the strongest activation was seen for word class and morphosyntactic errors (P600). Activation was significantly weaker for semantically anomalous and correct words. The P600 syntactic activation was localized to bilateral sources in the temporal lobe, posterior to the N400 sources. The results suggest that the same general region of the superior temporal cortex gives rise to both LAN and N400, with bilateral reactivity to semantic manipulation and a left-hemisphere effect to syntactic manipulation. The bilateral P600 response was sensitive to syntactic but not semantic factors.


Author(s): Angela D. Friederici, Noam Chomsky

An adequate description of the neural basis of language processing must consider the entire network, both its structural white-matter connections and the functional connectivities between the different brain regions, because information has to be sent between language-related regions distributed across the temporal and frontal cortex. This chapter discusses the white matter fiber bundles that connect the language-relevant regions. The chapter is broken into three sections. In the first, we look at the white matter fiber tracts connecting the language-relevant regions in the frontal and temporal cortices; in the second, the ventral and dorsal pathways in the right hemisphere that connect temporal and frontal regions; and finally in the third, the two syntax-relevant and (at least) one semantics-relevant neuroanatomically defined networks on which sentence processing is based. From this discussion, it becomes clear that online language processing requires information transfer via the long-range white matter fiber pathways that connect the language-relevant brain regions within each hemisphere and between hemispheres.


Author(s): Norman D. Cook

Speech production in most people is strongly lateralized to the left hemisphere (LH), but language understanding is generally a bilateral activity. At every level of linguistic processing that has been investigated experimentally, the right hemisphere (RH) has been found to make characteristic contributions, from the processing of the affective aspects of intonation, through the appreciation of word connotations, the decoding of the meaning of metaphors and figures of speech, to the understanding of the overall coherency of verbal humour, paragraphs and short stories. If both hemispheres are indeed engaged in linguistic decoding and both processes are required to achieve a normal level of understanding, a central question concerns how the separate language functions on the left and right are integrated. This chapter reviews relevant studies on the hemispheric contributions to language processing and the role of interhemispheric communications in cognition.


Neurology, 1998, Vol 51 (2), pp. 458-464
Author(s): D. Boatman, J. Hart, R. P. Lesser, N. Honeycutt, N. B. Anderson, ...

Objective: To investigate the right hemispheric speech perception capabilities of an adult right-handed patient with seizures.
Methods: Consecutive, unilateral, intracarotid sodium amobarbital injections and left hemispheric electrical interference mapping were used to determine lateralization and localization of speech perception, measured as syllable discrimination.
Results: Syllable discrimination remained intact after left and right intracarotid sodium amobarbital injections. Language was otherwise strongly lateralized to the left hemisphere. Despite evidence of bilateral speech perception capabilities, electrical interference testing in the left posterior temporal lobe impaired syllable discrimination.
Conclusions: The results suggest a functionally symmetric, parallel system in the adult brain with preferential use of left hemispheric pathways for speech perception.


1984, Vol 2 (2), pp. 196-221
Author(s): Robert J. Zatorre

A critical review is provided of the literature on musical performance following intracarotid sodium Amytal injection and on studies of musical perception in groups of unilaterally brain-damaged persons. The sodium Amytal data suggest that both hemispheres are active in singing familiar songs, since injection into either hemisphere produces disruption of singing. Studies with brain-damaged populations generally find deficits after right-sided damage in tasks demanding processing of patterns of pitches (e.g., unfamiliar melodic sequences) as well as with differences in timbre. Damage to the right temporal lobe causes the most consistent deficits in these tasks. Damage to the left side does not impair performance on such tasks, but does cause problems when familiar tunes are involved, especially if naming or identification is required, regardless of the presence or absence of aphasia. Damage to the right hemisphere also affects performance in such cases, but not usually to the extent that left-hemisphere lesions do.

