The Role of the Primary Sensory Cortices in Early Language Processing

2017, Vol 29 (10), pp. 1755-1765
Author(s): Andrew C. Papanicolaou, Marina Kilintari, Roozbeh Rezaie, Shalini Narayana, Abbas Babajani-Feremi

The results of this magnetoencephalography study challenge two long-standing assumptions regarding the brain mechanisms of language processing: First, that linguistic processing proper follows sensory feature processing effected by bilateral activation of the primary sensory cortices that lasts about 100 msec from stimulus onset. Second, that subsequent linguistic processing is effected by left hemisphere networks outside the primary sensory areas, including Broca's and Wernicke's association cortices. Here we present evidence that linguistic analysis begins almost synchronously with sensory, prelinguistic verbal input analysis and that the primary cortices are also engaged in these linguistic analyses and become, consequently, part of the left hemisphere language network during language tasks. These findings call for extensive revision of our conception of linguistic processing in the brain.

2020
Author(s): Micha Heilbron, Kristijan Armeni, Jan-Mathijs Schoffelen, Peter Hagoort, Floris P. de Lange

Abstract: Understanding spoken language requires transforming ambiguous stimulus streams into a hierarchy of increasingly abstract representations, ranging from speech sounds to meaning. It has been suggested that the brain uses predictive computations to guide the interpretation of incoming information. However, the exact role of prediction in language understanding remains unclear, with widespread disagreement about both the ubiquity of prediction, and the level of representation at which predictions unfold. Here, we address both issues by analysing brain recordings of participants listening to audiobooks, and using a state-of-the-art deep neural network (GPT-2) to quantify predictions in a fine-grained, contextual fashion. First, we establish clear evidence for predictive processing, confirming that brain responses to words are modulated by probabilistic predictions. Next, we factorised the model-based predictions into distinct linguistic dimensions, revealing dissociable neural signatures of syntactic, phonemic and semantic predictions. Finally, we show that high-level (word) predictions inform low-level (phoneme) predictions, supporting theories of hierarchical predictive processing. Together, these results underscore the ubiquity of prediction in language processing, and demonstrate that linguistic prediction is not implemented by a single system but occurs throughout the language network, forming a hierarchy of linguistic predictions across all levels of analysis.
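The core quantity in this kind of analysis is the contextual probability that GPT-2 assigns to each upcoming word, usually expressed as surprisal. Below is a minimal, hedged sketch of how per-word surprisal can be computed with the publicly released GPT-2 model via the Hugging Face transformers library; the example sentence and variable names are illustrative and not taken from the study, and the authors' actual pipeline may differ.

```python
# Sketch: per-word surprisal from GPT-2 (illustrative; not the authors' pipeline).
# Requires: pip install torch transformers
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "the narrator opened the old wooden door"   # illustrative stimulus
ids = tokenizer(text, return_tensors="pt")["input_ids"][0]

with torch.no_grad():
    logits = model(ids.unsqueeze(0)).logits[0]      # (n_tokens, vocab_size)

log_probs = torch.log_softmax(logits, dim=-1)
# Surprisal of token i given all preceding tokens: -log p(token_i | context)
for i in range(1, len(ids)):
    surprisal = -log_probs[i - 1, ids[i]].item()
    print(f"{tokenizer.decode(ids[i].item())!r:>12}  surprisal = {surprisal:.2f} nats")
```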


2001, Vol 44 (4), pp. 814-825
Author(s): Christine Weber-Fox

The role of neurolinguistic factors in stuttering was investigated by determining whether individuals who stutter display atypical neural functions for language processing, even with no speech production demands. Event-related brain potentials (ERPs) were obtained while 9 individuals who stutter (IWS) and 9 normally fluent speakers (NS) read sentences silently. The ERPs were elicited by: (a) closed-class words that provide structural or grammatical information, (b) open-class words that convey referential meaning, and (c) semantic anomalies (violations in semantic expectation). In standardized tests, adult IWS displayed similar grammatical and lexical abilities in both comprehension and production tasks compared to their matched, normally fluent peers. Yet the ERPs elicited in IWS for linguistic processing tasks revealed differences in functional brain organization. The ERPs elicited in IWS were characterized by reduced negative amplitudes for closed-class words (N280), open-class words (N350), and semantic anomalies (N400) in a temporal window of approximately 200–450 ms after word onsets. The overall pattern of results indicates that alterations in processing for IWS are related to neural functions that are common to word classes and perhaps involve shared, underlying processes for lexical access.
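As a rough illustration of the window-based amplitude comparison described above, the sketch below averages each participant's ERP within a 200–450 msec window and compares the two groups with an independent-samples t-test. The sampling rate, array shapes, and simulated values are assumptions for illustration only, not the study's data.

```python
# Sketch: group comparison of mean ERP amplitude in a 200-450 ms window
# (simulated data; illustrative only).
import numpy as np
from scipy import stats

fs = 250                                           # assumed sampling rate (Hz)
times = np.arange(-0.1, 0.8, 1 / fs)               # epoch: -100 to 800 ms
win = (times >= 0.20) & (times <= 0.45)            # analysis window

rng = np.random.default_rng(0)
erp_iws = rng.normal(-1.0, 1.0, (9, times.size))   # individuals who stutter
erp_ns = rng.normal(-2.0, 1.0, (9, times.size))    # normally fluent speakers

mean_iws = erp_iws[:, win].mean(axis=1)            # one value per participant
mean_ns = erp_ns[:, win].mean(axis=1)

t, p = stats.ttest_ind(mean_iws, mean_ns)
print(f"IWS vs NS, 200-450 ms mean amplitude: t = {t:.2f}, p = {p:.3f}")
```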


Author(s): Norman D. Cook

Speech production in most people is strongly lateralized to the left hemisphere (LH), but language understanding is generally a bilateral activity. At every level of linguistic processing that has been investigated experimentally, the right hemisphere (RH) has been found to make characteristic contributions, from the processing of the affective aspects of intonation, through the appreciation of word connotations, the decoding of the meaning of metaphors and figures of speech, to the understanding of the overall coherency of verbal humour, paragraphs and short stories. If both hemispheres are indeed engaged in linguistic decoding and both processes are required to achieve a normal level of understanding, a central question concerns how the separate language functions on the left and right are integrated. This chapter reviews relevant studies on the hemispheric contributions to language processing and the role of interhemispheric communications in cognition.


1998, Vol 172 (2), pp. 142-146
Author(s): Matthias Weisbrod, Sabine Maier, Sabine Harig, Ulrike Himmelsbach, Manfred Spitzer

Background: In schizophrenia, disturbances in the development of physiological hemisphere asymmetry are assumed to play a pathogenetic role. The most striking difference between hemispheres is in language processing. The left hemisphere is superior in the use of syntactic or semantic information, whereas the right hemisphere uses contextual information more effectively. Method: Using psycholinguistic experimental techniques, semantic associations were examined in 38 control subjects, 24 non-thought-disordered and 16 thought-disordered people with schizophrenia, for both hemispheres separately. Results: Direct semantic priming did not differ between the hemispheres in any of the groups. Only thought-disordered people showed significant indirect semantic priming in the left hemisphere. Conclusions: The results support: (a) a prominent role of the right hemisphere for remote associations; (b) enhanced spreading of semantic associations in thought-disordered subjects; and (c) disorganisation of the functional asymmetry of semantic processing in thought-disordered subjects.
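A hedged sketch of how direct and indirect priming effects of this kind can be summarized from reaction-time data, separately by group and by the hemisphere addressed, follows; the column names and the tiny example table are invented for illustration and do not reproduce the study's data.

```python
# Sketch: direct and indirect semantic priming effects per group and hemisphere.
# Illustrative data only; priming = unrelated RT minus primed RT (in ms).
import pandas as pd

df = pd.DataFrame({
    "group":      ["control", "control", "TD schizophrenia", "TD schizophrenia"],
    "hemisphere": ["left", "right", "left", "right"],
    "rt_related":   [520.0, 525.0, 530.0, 540.0],
    "rt_indirect":  [545.0, 538.0, 532.0, 548.0],
    "rt_unrelated": [550.0, 552.0, 560.0, 555.0],
})

df["direct_priming"] = df["rt_unrelated"] - df["rt_related"]
df["indirect_priming"] = df["rt_unrelated"] - df["rt_indirect"]

print(df.groupby(["group", "hemisphere"])[["direct_priming", "indirect_priming"]].mean())
```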


1987, Vol 30 (2), pp. 261-267
Author(s): Michael P. Rastatter, Cindy Lawson-Brill

The purpose of the current study was to investigate the effects of advanced aging on hemispheric organization for linguistic processing. Specifically, it was an attempt to identify whether the neurological substrate responsible for right-hemispheric language analysis diminishes in function. Reaction-time measures contrasting the hand used to respond with the hemisphere stimulated were obtained for a geriatric sample, yielding an index of right- versus left-hemisphere auditory-verbal processing ability. Twenty-four right-handed geriatric subjects responded to monaurally presented verbal stimuli with their right and left hands at separate times. Reaction times were significantly faster when subjects heard the words in their right ears, regardless of the hand used to respond. These findings were consistent with a strict model of neurolinguistic organization in which the left hemisphere was solely responsible for language processing in the present group of elderly subjects. Compared with data previously gathered from young subjects, the current findings were interpreted to suggest that right-hemispheric language processing ability is inhibited in the more advanced stages of life.
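The design described above is a 2 × 2 repeated-measures comparison (ear of presentation × response hand) on reaction times. A minimal sketch of that analysis on simulated data follows; the numbers simply build in a right-ear advantage and are not the study's measurements.

```python
# Sketch: 2 x 2 repeated-measures ANOVA on reaction times
# (ear of presentation x response hand); simulated data only.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = []
for subj in range(24):                                # 24 right-handed participants
    for ear in ("left", "right"):
        for hand in ("left", "right"):
            base = 600 if ear == "left" else 560      # faster RTs for right-ear input
            rows.append({"subject": subj, "ear": ear, "hand": hand,
                         "rt": base + rng.normal(0, 25)})

df = pd.DataFrame(rows)
res = AnovaRM(df, depvar="rt", subject="subject", within=["ear", "hand"]).fit()
print(res.anova_table)
```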


2015, Vol 27 (8), pp. 1542-1551
Author(s): Kristof Strijkers, Daisy Bertrand, Jonathan Grainger

We investigated how linguistic intention affects the time course of visual word recognition by comparing the brain's electrophysiological response to a word's lexical frequency, a well-established psycholinguistic marker of lexical access, when participants actively retrieve the meaning of the written input (semantic categorization) versus a situation where no language processing is necessary (ink color categorization). In the semantic task, the ERPs elicited by high-frequency words started to diverge from those elicited by low-frequency words as early as 120 msec after stimulus onset. On the other hand, when categorizing the colored font of the very same words in the color task, word frequency did not modulate ERPs until some 100 msec later (220 msec poststimulus onset) and did so for a shorter period and with a smaller scalp distribution. The results demonstrate that, although written words indeed elicit automatic recognition processes in the brain, the speed and quality of lexical processing critically depends on the top–down intention to engage in a linguistic task.
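One simple way to estimate when two ERP waveforms begin to diverge, in the spirit of the onset latency reported above, is a running paired t-test with a consecutive-samples criterion; published studies typically use more rigorous cluster-based statistics. The sketch below uses simulated data with a divergence built in at roughly 120 msec.

```python
# Sketch: earliest reliable divergence between high- and low-frequency word ERPs.
# Simulated data; a running paired t-test stands in for cluster-based statistics.
import numpy as np
from scipy import stats

fs = 500
times = np.arange(-0.1, 0.6, 1 / fs)                 # seconds relative to word onset
rng = np.random.default_rng(2)

n_subj = 20
erp_high = rng.normal(0, 1, (n_subj, times.size))
erp_low = rng.normal(0, 1, (n_subj, times.size))
erp_low[:, times > 0.12] += 0.8                      # built-in effect from ~120 ms

t, p = stats.ttest_rel(erp_high, erp_low, axis=0)    # paired t-test at every sample
sig = p < 0.05

run = 10                                             # require 10 consecutive samples (20 ms)
onsets = [i for i in range(times.size - run) if sig[i:i + run].all()]
if onsets:
    print(f"divergence onset ~ {times[onsets[0]] * 1000:.0f} ms post-stimulus")
```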


2013, Vol 11 (2), pp. 31-35
Author(s): Oleg Aleksandrovich Yarosh

Compound AGB-31, a monocarbamate derivative, is shown to possess high antiepileptic activity. The mechanisms of its antiepileptic action are associated with a significant increase in glutamic acid decarboxylase activity in the left hemisphere of the brain, a trend toward decreased glutamate content in the left hemisphere, and a tendency toward increased GABA in both hemispheres. AGB-31 significantly (more than 3-fold) increases nitric oxide synthase activity in the left hemisphere and tends to reduce the NO content in both hemispheres. AGB-31 also significantly (by 63.4%) reduces glutathione peroxidase activity in the right hemisphere without changing it in the left, with a tendency toward increased glutathione reductase activity in both hemispheres.


2014, Vol 26 (12), pp. 2840-2862
Author(s): Aneta Kielar, Jed A. Meltzer, Sylvain Moreno, Claude Alain, Ellen Bialystok

EEG studies employing time–frequency analysis have revealed changes in theta and alpha power in a variety of language and memory tasks. Semantic and syntactic violations embedded in sentences evoke well-known ERPs, but little is known about the oscillatory responses to these violations. We investigated oscillatory responses to both kinds of violations, while monolingual and bilingual participants performed an acceptability judgment task. Both violations elicited power decreases (event-related desynchronization, ERD) in the 8–30 Hz frequency range, but with different scalp topographies. In addition, semantic anomalies elicited power increases (event-related synchronization, ERS) in the 1–5 Hz frequency band. The 1–5 Hz ERS was strongly phase-locked to stimulus onset and highly correlated with time domain averages, whereas the 8–30 Hz ERD response varied independently of these. In addition, the results showed that language expertise modulated 8–30 Hz ERD for syntactic violations as a function of the executive demands of the task. When the executive function demands were increased using a grammaticality judgment task, bilinguals but not monolinguals demonstrated reduced 8–30 Hz ERD for syntactic violations. These findings suggest a putative role of the 8–30 Hz ERD response as a marker of linguistic processing that likely represents a separate neural process from those underlying ERPs.
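Event-related desynchronization of the kind reported above is commonly quantified as the percent change in band-limited power relative to a pre-stimulus baseline. The sketch below computes an 8–30 Hz ERD time course from simulated single-channel epochs using a band-pass filter and Hilbert envelope; this is one standard way to compute ERD/ERS, not necessarily the authors' exact pipeline.

```python
# Sketch: 8-30 Hz event-related desynchronization (ERD) as percent power change
# from a pre-stimulus baseline; simulated single-channel epochs.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500
times = np.arange(-0.5, 1.5, 1 / fs)
rng = np.random.default_rng(3)
epochs = rng.normal(0, 1, (60, times.size))          # (n_trials, n_samples)

b, a = butter(4, [8, 30], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, epochs, axis=1)
power = np.abs(hilbert(filtered, axis=1)) ** 2       # instantaneous band power per trial

mean_power = power.mean(axis=0)                      # average across trials
baseline = mean_power[(times >= -0.4) & (times <= -0.1)].mean()
erd = 100 * (mean_power - baseline) / baseline       # negative values indicate ERD

print(f"strongest desynchronization: {erd.min():.1f} % relative to baseline")
```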


2021, Vol 75 (1), pp. 66-71
Author(s): Zh. Ibrayeva

The use of two or more languages is common in most countries of the world. Until recently, however, bilingualism was regarded as a factor that complicates speech processing, cognition, and the brain. In the past 25 years there has been a surge in research on bilingualism, including the learning, mastery, and processing of languages, their cognitive and neural foundations, and the lifelong implications of bilingualism for cognition and the brain. Contrary to the belief that bilingualism complicates the language system, new research demonstrates that all known and used languages become part of the same language system. The interactions that occur when two languages are used have consequences for the mind and the brain, and indeed for language processing itself, but these consequences are not simply additive. Thus, bilingualism helps to uncover the fundamental architecture and mechanisms of language processing that are organized differently in monolingual speakers.


2021, Vol 15
Author(s): Maria V. Ivanova, Allison Zhong, And Turken, Juliana V. Baldo, Nina F. Dronkers

Current evidence strongly suggests that the arcuate fasciculus (AF) is critical for language, from spontaneous speech and word retrieval to repetition and comprehension abilities. However, to further pinpoint its unique and differential role in language, its anatomy needs to be explored in greater detail and its contribution to language processing beyond that of known cortical language areas must be established. We address this in a comprehensive evaluation of the specific functional role of the AF in a well-characterized cohort of individuals with chronic aphasia (n = 33) following left hemisphere stroke. To evaluate macro- and microstructural integrity of the AF, tractography based on the constrained spherical deconvolution model was performed. The AF in each hemisphere was then manually reconstructed using a modified 3-segment model (Catani et al., 2005) and a modified 2-segment model (Glasser and Rilling, 2008). The normalized volume and a measure of microstructural integrity of the long and the posterior segments of the AF were significantly correlated with language indices while controlling for gender and lesion volume. Specific contributions of AF segments to language while accounting for the role of specific cortical language areas (inferior frontal, inferior parietal, and posterior temporal) were tested using multiple regression analyses. Involvement of the following left hemisphere tract segments in language processing beyond the contribution of cortical areas was demonstrated: the long segment of the AF contributed to naming abilities, the anterior segment to fluency and naming, and the posterior segment to comprehension. The results highlight the important contributions of the AF fiber pathways to language impairments beyond that of known cortical language areas. At the same time, no clear role of the right hemisphere AF tracts in language processing could be ascertained. In sum, our findings lend support to the broader role of the left AF in language processing, with particular emphasis on comprehension and naming, and point to the posterior segment of this tract as being most crucial for supporting residual language abilities.
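The "beyond cortical areas" logic described above amounts to a hierarchical regression: a language score is first modeled from cortical damage (and nuisance covariates such as lesion volume), and the question is whether adding a tract measure significantly improves the fit. A hedged sketch on simulated data follows; the variable names and numbers are illustrative, not the study's.

```python
# Sketch: does posterior AF integrity explain language scores beyond cortical damage?
# Hierarchical OLS model comparison on simulated data; illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 33                                               # cohort size reported above
df = pd.DataFrame({
    "cortical_lesion": rng.uniform(0, 1, n),         # e.g. proportion of posterior temporal damage
    "lesion_volume": rng.uniform(5, 150, n),         # cc
    "af_posterior_fa": rng.uniform(0.2, 0.6, n),     # posterior AF microstructural integrity
})
df["comprehension"] = (70 - 25 * df["cortical_lesion"]
                       + 40 * df["af_posterior_fa"] + rng.normal(0, 5, n))

base = smf.ols("comprehension ~ cortical_lesion + lesion_volume", data=df).fit()
full = smf.ols("comprehension ~ cortical_lesion + lesion_volume + af_posterior_fa",
               data=df).fit()

f_stat, p_val, _ = full.compare_f_test(base)
print(f"added contribution of posterior AF: F = {f_stat:.2f}, p = {p_val:.3f}")
```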

