Dynamics of functional networks for syllable and word-level processing

2019 ◽  
Author(s):  
J.M. Rimmele ◽  
Y. Sun ◽  
G. Michalareas ◽  
O. Ghitza ◽  
D. Poeppel

Abstract: Speech comprehension requires the ability to temporally segment the acoustic input for higher-level linguistic analysis. Oscillation-based approaches suggest that low-frequency auditory cortex oscillations track syllable-sized acoustic information and therefore emphasize the relevance of syllabic-level processing for speech segmentation. Most linguistic approaches, however, focus on mapping from acoustic-phonemic representations to the lexical level. How syllabic processing interacts with higher levels of speech processing, beyond segmentation, including the anatomical and neurophysiological characteristics of the networks involved, is debated. Here we investigate the effects of lexical processing and the interactions with (acoustic) syllable processing by examining MEG data recorded in two experiments using a frequency-tagging paradigm. Participants listened to disyllabic words presented at a rate of 4 syllables/sec. Two conjectures were evaluated: (i) lexical processing of words activates a network that interacts with syllable processing; and (ii) syllable transitions contribute to word-level processing. We show that lexical content activated a left-lateralized frontal and superior and middle temporal network and increased the interaction between left middle temporal areas and auditory cortex (phase-phase coupling). Mere syllable-transition information, in contrast, activated a bilateral superior temporal, middle temporal, and inferior frontal network and increased the interaction between those areas. Word and syllable processing interacted in superior and middle temporal areas (cross-frequency coupling), whereas syllable tracking (cerebro-acoustic coherence) decreased when word-level information was present.
The data provide a new perspective on speech comprehension by demonstrating a contribution of an acoustic-syllabic route to lexical processing.

Significance statement: The comprehension of speech requires integrating information at multiple time scales, including phonemic, syllabic, and word scales. Typically, we think of decoding speech in the service of recognizing words as a process that maps from phonemic units to words. Recent neurophysiological evidence, however, has highlighted the relevance of syllable-sized chunks for segmenting speech. Is there more to recognizing spoken language? We provide neural evidence for brain network dynamics that support an interaction of lexical with syllable-level processing. We identify cortical networks that differ depending on whether lexical-semantic information versus low-level syllable-transition information is processed. Word- and syllable-level processing interact within MTG and STG. The data enrich our understanding of comprehension by implicating a mapping from syllabic to lexical representations.
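The phase-phase coupling reported above is conventionally quantified with a phase-locking metric between two band-limited signals. A minimal NumPy sketch of such a metric; the signals, phase lag, and noise level here are invented for illustration and are not taken from the study:

```python
import numpy as np

def phase_locking_value(phase_a, phase_b):
    """Phase-phase coupling index: length of the mean unit vector of the
    phase difference. 1 = perfectly constant phase lag, ~0 = no relation."""
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 1e-3)            # 10 s sampled at 1 kHz
stim_phase = 2 * np.pi * 4.0 * t          # 4 syllables/s stimulus rhythm

# Hypothetical "auditory cortex" phase that tracks the stimulus with a
# fixed lag plus jitter, versus a completely unrelated phase series.
tracking_phase = stim_phase + 0.5 + rng.normal(0.0, 0.3, t.size)
unrelated_phase = rng.uniform(0.0, 2 * np.pi, t.size)

plv_tracking = phase_locking_value(stim_phase, tracking_phase)
plv_unrelated = phase_locking_value(stim_phase, unrelated_phase)
```

With consistent tracking the index stays near 1 despite the jitter, while unrelated phases average out toward zero, which is what makes the metric usable as a coupling measure between regions.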

2020 ◽  
Author(s):  
Soheila Samiee ◽  
Dominique Vuvan ◽  
Esther Florin ◽  
Philippe Albouy ◽  
Isabelle Peretz ◽  
...  

Abstract: The detection of pitch changes is crucial to sound localization, music appreciation, and speech comprehension, yet the oscillatory dynamics of the brain networks involved remain unclear. We used time-resolved cortical imaging in a pitch change detection task. Tone sequences were presented to both typical listeners and participants affected by congenital amusia, as a model of altered pitch change perception. Our data show that tone sequences entrained slow (2-4 Hz) oscillations in the auditory cortex and inferior frontal gyrus, at the pace of tone presentations. Inter-regional signaling at this slow pace was directed from the auditory cortex towards the inferior frontal gyrus and motor cortex. Bursts of faster (15-35 Hz) oscillations were also generated in these regions, with directed influence from the motor cortex. These faster components occurred precisely at the expected latencies of each tone in a sequence, yielding a form of local phase-amplitude coupling with slower concurrent activity. The intensity of this coupling peaked dynamically at the moments of anticipated pitch changes. We clarify the mechanistic relevance of these observations for behavior since, by task design, typical listeners outperformed amusic participants. Compared to typical listeners, inter-regional slow signaling toward motor and inferior frontal cortices was depressed in amusia. The auditory cortex of amusic participants also over-expressed tonic, fast-slow phase-amplitude coupling, pointing to a possible misalignment between stimulus encoding and internal predictive signaling. Our study provides novel insight into the functional architecture of polyrhythmic brain activity in auditory perception and emphasizes active, network-level processes involving the motor system in sensory integration.
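The fast-slow phase-amplitude coupling described above is often estimated with a mean-vector-length measure (in the spirit of Canolty-style analyses). A minimal NumPy sketch; the 3 Hz phase, the 25 Hz-band envelope, and the modulation depth are invented for illustration:

```python
import numpy as np

def pac_strength(slow_phase, fast_amplitude):
    """Mean-vector-length phase-amplitude coupling: amplitude-weighted
    mean unit vector of the slow phase. ~0 when the fast envelope is
    unrelated to the slow phase."""
    return np.abs(np.mean(fast_amplitude * np.exp(1j * slow_phase)))

t = np.arange(0.0, 20.0, 1e-3)
slow_phase = 2 * np.pi * 3.0 * t               # 3 Hz phase (2-4 Hz band)

# Coupled case: the envelope of a hypothetical 25 Hz (15-35 Hz band)
# burst peaks at a fixed slow phase; control case: flat envelope.
coupled_envelope = 1.0 + 0.8 * np.cos(slow_phase)
flat_envelope = np.ones_like(t)

mvl_coupled = pac_strength(slow_phase, coupled_envelope)
mvl_flat = pac_strength(slow_phase, flat_envelope)
```

The amplitude-weighting is the key design choice: only envelope fluctuations that are systematically aligned to a preferred slow phase leave a net vector.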


2018 ◽  
Author(s):  
Mohsen Alavash ◽  
Sarah Tune ◽  
Jonas Obleser

Abstract: Speech comprehension in noisy, multi-talker situations poses a challenge. Human listeners differ substantially in the degree to which they adapt behaviorally and can listen successfully under such circumstances. How cortical networks embody this adaptation, particularly at the individual level, is currently unknown. We here explain this adaptation by the reconfiguration of brain networks during a challenging listening task (i.e., a novel linguistic variant of the Posner paradigm with concurrent speech) in an age-varying sample of N = 49 healthy adults undergoing resting-state and task fMRI. We provide evidence for the hypothesis that more successful listeners exhibit stronger task-specific reconfiguration, hence better adaptation, of brain networks. From rest to task, brain networks become reconfigured towards more localized cortical processing characterized by higher topological segregation. This reconfiguration is dominated by the functional division of an auditory and a cingulo-opercular module, and the emergence of a conjoined auditory and ventral attention module along bilateral middle and posterior temporal cortices. Supporting our hypothesis, the degree to which modularity of this fronto-temporal auditory-control network is increased relative to resting state predicts individuals' listening success in states of divided and selective attention. Our findings elucidate how fine-tuned cortical communication dynamics shape selection and comprehension of speech. Our results highlight modularity of the auditory-control network as a key organizational principle in the cortical implementation of auditory spatial attention in challenging listening situations.

Significance Statement: How do brain networks shape our listening behavior? We here develop and test the hypothesis that, during challenging listening situations, intrinsic brain networks are reconfigured to adapt to the listening demands, and thus to enable successful listening.
We find that, relative to a task-free resting state, networks of the listening brain show higher segregation of temporal auditory, ventral attention, and frontal control regions known to be involved in speech processing, sound localization, and effortful listening. Importantly, the relative change in modularity of this auditory-control network predicts individuals’ listening success. Our findings shed light on how cortical communication dynamics tune selection and comprehension of speech in challenging listening situations, and suggest modularity as the network principle of auditory spatial attention.
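The segregation measure at the core of this abstract is graph modularity. A minimal NumPy sketch of Newman's modularity Q, contrasting a toy "rest" network against a toy "task" network with weakened between-module links; the two-module partition and the edge weights are invented for illustration, not taken from the study:

```python
import numpy as np

def modularity(A, labels):
    """Newman modularity of an undirected weighted graph:
    Q = (1/2m) * sum_ij (A_ij - k_i*k_j/(2m)) * [c_i == c_j]."""
    k = A.sum(axis=1)
    two_m = k.sum()
    same_module = labels[:, None] == labels[None, :]
    return ((A - np.outer(k, k) / two_m) * same_module).sum() / two_m

# Six toy regions split into two putative modules (e.g. an "auditory"
# and a "cingulo-opercular" set -- labels are illustrative only).
labels = np.array([0, 0, 0, 1, 1, 1])
rest = np.ones((6, 6)) - np.eye(6)          # rest: uniform all-to-all coupling
task = rest.copy()
task[:3, 3:] = task[3:, :3] = 0.2           # task: weaker between-module links

q_rest = modularity(rest, labels)
q_task = modularity(task, labels)
```

Weakening only the between-module connections raises Q, which is the "rest-to-task increase in modularity" pattern the abstract links to listening success.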


2006 ◽  
Vol 96 (1) ◽  
pp. 252-258 ◽  
Author(s):  
Rajiv Narayan ◽  
Gilberto Graña ◽  
Kamal Sen

Understanding how single cortical neurons discriminate between sensory stimuli is fundamental to providing a link between cortical neural responses and perception. The discrimination of sensory stimuli by cortical neurons has been intensively investigated in the visual and somatosensory systems. However, relatively little is known about discrimination of sounds by auditory cortical neurons. Auditory cortex plays a particularly important role in the discrimination of complex sounds, e.g., vocal communication sounds. The rich dynamic structure of such complex sounds on multiple time scales motivates two questions regarding cortical discrimination. How does discrimination depend on the temporal resolution of the cortical response? How does discrimination accuracy evolve over time? Here we investigate these questions in field L, the analogue of primary auditory cortex in zebra finches, analyzing temporal resolution and temporal integration in the discrimination of conspecific songs (songs of the bird's own species) for both anesthetized and awake subjects. We demonstrate the existence of distinct time scales for temporal resolution and temporal integration and explain how they arise from cortical neural responses to complex dynamic sounds.
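The dependence of discrimination on temporal resolution can be made concrete with a toy template-matching readout on binned spike trains. A minimal NumPy sketch; the spike times, jitter, and bin widths are invented for illustration and do not reproduce the field L analysis:

```python
import numpy as np

def binned(spike_times, bin_width, duration):
    """Spike train -> spike-count vector; bin_width sets the temporal
    resolution at which the response is read out."""
    edges = np.arange(0.0, duration + bin_width / 2, bin_width)
    return np.histogram(spike_times, bins=edges)[0].astype(float)

def nearest_template(trial, templates, bin_width, duration):
    """Classify a single-trial response as the nearest template response
    (Euclidean distance on binned spike counts)."""
    v = binned(trial, bin_width, duration)
    dists = [np.linalg.norm(v - binned(tpl, bin_width, duration))
             for tpl in templates]
    return int(np.argmin(dists))

# Two hypothetical song responses with equal spike counts but shifted timing.
resp_song_a = np.array([0.10, 0.30, 0.50])
resp_song_b = np.array([0.20, 0.40, 0.60])
trial = resp_song_a + 0.005                # slightly jittered trial for song A

fine_decision = nearest_template(trial, [resp_song_a, resp_song_b], 0.05, 1.0)
rate_code_identical = np.array_equal(binned(resp_song_a, 1.0, 1.0),
                                     binned(resp_song_b, 1.0, 1.0))
```

At 50 ms resolution the jittered trial is still assigned to song A, whereas a single 1 s bin (a pure rate code) cannot separate the two songs at all, illustrating why the choice of temporal resolution matters for cortical discrimination.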


2018 ◽  
Author(s):  
Yan Liang ◽  
Daniele J. Cherniak ◽  
Chenguang Sun

2021 ◽  
Vol 383 (1) ◽  
pp. 143-148
Author(s):  
Shadi Jafari ◽  
Mattias Alenius

Abstract: Olfactory perception is highly individualized, in humans as well as in Drosophila. The process that individualizes olfaction is adaptation, which shapes perception and olfactory-guided behaviors across multiple time scales and through multiple mechanisms. Olfactory adaptation occurs both in the central nervous system and in the periphery. Central adaptation occurs at the level of the circuits that process olfactory inputs from the periphery, where it can integrate inputs from other senses, metabolic states, and stress. We will here focus on the periphery and on how the fast, slow, and persistent (lifelong) adaptation mechanisms in the olfactory sensory neurons individualize the Drosophila olfactory system.


2019 ◽  
Vol 11 (4) ◽  
pp. 1163 ◽  
Author(s):  
Melissa Bedinger ◽  
Lindsay Beevers ◽  
Lila Collet ◽  
Annie Visser

Climate change is a product of the Anthropocene, and the human–nature system in which we live. Effective climate change adaptation requires that we acknowledge this complexity. Theoretical literature on sustainability transitions has highlighted this and called for deeper acknowledgment of systems complexity in our research practices. Are we heeding these calls for ‘systems’ research? We used hydrohazards (floods and droughts) as an example research area to explore this question. We first distilled existing challenges for complex human–nature systems into six central concepts: Uncertainty, multiple spatial scales, multiple time scales, multimethod approaches, human–nature dimensions, and interactions. We then performed a systematic assessment of 737 articles to examine patterns in what methods are used and how these cover the complexity concepts. In general, results showed that many papers do not reference any of the complexity concepts, and no existing approach addresses all six. We used the detailed results to guide advancement from theoretical calls for action to specific next steps. Future research priorities include the development of methods for consideration of multiple hazards; for the study of interactions, particularly in linking the short- to medium-term time scales; to reduce the data intensity of methods; and to better integrate bottom–up and top–down approaches in a way that connects local context with higher-level decision-making. Overall, this paper serves to build a shared conceptualisation of human–nature system complexity, map current practice, and navigate a complexity-smart trajectory for future research.


2021 ◽  
Vol 40 (9) ◽  
pp. 2139-2154
Author(s):  
Caroline E. Weibull ◽  
Paul C. Lambert ◽  
Sandra Eloranta ◽  
Therese M. L. Andersson ◽  
Paul W. Dickman ◽  
...  

Nanomaterials ◽  
2021 ◽  
Vol 11 (6) ◽  
pp. 1392
Author(s):  
David Gallina ◽  
G. M. Pastor

Structural disorder has been shown to be responsible for profound changes of the interaction-energy landscapes and collective dynamics of two-dimensional (2D) magnetic nanostructures. Weakly-disordered 2D ensembles have a few particularly stable magnetic configurations with large basins of attraction, from which the higher-energy metastable configurations are separated by only small downward barriers. In contrast, strongly-disordered ensembles have rough energy landscapes with a large number of low-energy local minima separated by relatively large energy barriers. Consequently, the former show good structure-seeker behavior with unhindered relaxation dynamics funnelled towards the global minimum, whereas the latter show a time evolution involving multiple time scales and trapping, which is reminiscent of glasses. Although these general trends have been clearly established, a detailed assessment of the extent of these effects in specific nanostructure realizations remains elusive. The present study quantifies the disorder-induced changes in the interaction-energy landscape of two-dimensional dipole-coupled magnetic nanoparticles as a function of the magnetic configuration of the ensembles. Representative examples of weakly-disordered square-lattice arrangements, showing good structure-seeker behavior, and of strongly-disordered arrangements, showing spin-glass-like behavior, are considered. The topology of the kinetic networks of metastable magnetic configurations is analyzed. The consequences of disorder on the morphology of the interaction-energy landscapes are revealed by contrasting the corresponding disconnectivity graphs.
The correlations between the characteristics of the energy landscapes and the Markovian dynamics of the various magnetic nanostructures are quantified by calculating the field-free relaxation time evolution after either magnetic saturation or thermal quenching and by comparing them with the corresponding averages over a large number of structural arrangements. Common trends and system-specific features are identified and discussed.
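The field-free Markovian relaxation on a kinetic network of metastable configurations can be sketched with a toy master equation. A minimal NumPy example; the three-state network and its transition rates are invented for illustration and are not fitted to any real nanostructure:

```python
import numpy as np

# Toy kinetic network of 3 metastable magnetic configurations. Each
# column of the rate matrix R sums to zero, as required by probability
# conservation in the master equation dp/dt = R p.
R = np.array([[-0.10,  0.02,  0.01],
              [ 0.08, -0.05,  0.01],
              [ 0.02,  0.03, -0.02]])

def relax(p0, R, dt=0.01, steps=50_000):
    """Explicit-Euler integration of the master equation dp/dt = R p."""
    p = p0.astype(float).copy()
    for _ in range(steps):
        p = p + dt * (R @ p)
    return p

p0 = np.array([1.0, 0.0, 0.0])   # e.g. start from a saturated configuration
p_inf = relax(p0, R)             # long-time occupation probabilities
```

Because the column sums of R vanish, each Euler step conserves total probability exactly, and the occupation vector relaxes toward the stationary distribution (the null vector of R), mirroring the relaxation after saturation or quenching described above.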

