Multisensory Integration as a Window into Orderly and Disrupted Cognition and Communication

2020 ◽  
Vol 71 (1) ◽  
pp. 193-219 ◽  
Author(s):  
Mark T. Wallace ◽  
Tiffany G. Woynaroski ◽  
Ryan A. Stevenson

During our everyday lives, we are confronted with a vast amount of information from several sensory modalities. This multisensory information needs to be appropriately integrated for us to effectively engage with and learn from our world. Research carried out over the last half century has provided new insights into the way such multisensory processing improves human performance and perception; the neurophysiological foundations of multisensory function; the time course for its development; how multisensory abilities differ in clinical populations; and, most recently, the links between multisensory processing and cognitive abilities. This review summarizes the extant literature on multisensory function in typical and atypical circumstances, discusses the implications of the work carried out to date for theory and research, and points toward next steps for advancing the field.

2015 ◽  
Vol 28 (1-2) ◽  
pp. 71-99 ◽  
Author(s):  
Monica Gori

During the first years of life, sensory modalities communicate with each other. This process is fundamental for the development of unisensory and multisensory skills. The absence of one sensory input impacts the development of the other modalities. Since 2008 we have studied these aspects and developed our cross-sensory calibration theory. This theory emerged from the observation that children start to integrate multisensory information (such as vision and touch) only after 8–10 years of age. Before this age, the more accurate sense teaches (calibrates) the others; when a calibrating modality is missing, the other modalities are impaired. Children with visual disabilities have problems understanding the haptic or auditory perception of space, and children with motor disabilities have problems understanding the visual dimension of objects. This review presents our recent studies on multisensory integration and cross-sensory calibration in children and adults with and without sensory and motor disabilities. The goal of this review is to show the importance of interaction between sensory systems during the early period of life for correct perceptual development to occur.
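The mature multisensory integration that emerges after 8–10 years is commonly modeled as reliability-weighted (maximum-likelihood) cue combination, in which the fused estimate is less variable than either sense alone. A minimal sketch, with assumed noise levels for a visual-haptic size judgment:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Assumed unisensory noise (standard deviations) for a size judgment;
# vision is taken to be more reliable than haptics here.
sigma_v, sigma_h = 1.0, 2.0
true_size = 10.0

vision = rng.normal(true_size, sigma_v, n)
haptic = rng.normal(true_size, sigma_h, n)

# Reliability (inverse-variance) weight on vision for the MLE combination.
w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_h**2)
fused = w_v * vision + (1 - w_v) * haptic

# Theory predicts fused sigma = sqrt(sv^2 * sh^2 / (sv^2 + sh^2)) ~ 0.894,
# lower than either unisensory estimate alone.
print(round(fused.std(), 2), round(vision.std(), 2), round(haptic.std(), 2))
```

The reduced variability of the fused estimate is the hallmark of statistically optimal integration; before integration emerges, a child's bimodal precision fails to show this benefit, consistent with calibration rather than fusion.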


2021 ◽  
Vol 4 (3) ◽  
pp. 53
Author(s):  
Yi Peng Toh ◽  
Emilie Dion ◽  
Antónia Monteiro

Butterflies possess impressive cognitive abilities, and investigations into the neural mechanisms underlying these abilities are increasingly being conducted. Exploring butterfly neurobiology may require the isolation of larval, pupal, and/or adult brains for further molecular and histological experiments. This procedure has been described in detail for the fruit fly, but a detailed description of butterfly brain dissection is still lacking. Here, we provide a detailed written and video protocol for the removal of Bicyclus anynana adult, pupal, and larval brains. This species is gradually becoming a popular model because it uses a large set of sensory modalities, displays plastic and hormonally controlled courtship behaviour, and learns visual mate preferences and olfactory preferences that can be passed on to its offspring. The extracted brains can be used for downstream analyses, such as immunostaining and DNA or RNA extraction, and the procedure can easily be adapted to other lepidopteran species and life stages.


2002 ◽  
Vol 88 (1) ◽  
pp. 540-543 ◽  
Author(s):  
John J. Foxe ◽  
Glenn R. Wylie ◽  
Antigona Martinez ◽  
Charles E. Schroeder ◽  
Daniel C. Javitt ◽  
...  

Using high-field (3 Tesla) functional magnetic resonance imaging (fMRI), we demonstrate that auditory and somatosensory inputs converge in a subregion of human auditory cortex along the superior temporal gyrus. Further, simultaneous stimulation in both sensory modalities resulted in activity exceeding that predicted by summing the responses to the unisensory inputs, thereby showing multisensory integration in this convergence region. Recently, intracranial recordings in macaque monkeys have shown similar auditory-somatosensory convergence in a subregion of auditory cortex directly caudomedial to primary auditory cortex (area CM). The multisensory region identified in the present investigation may be the human homologue of CM. Our finding of auditory-somatosensory convergence in early auditory cortices contributes to mounting evidence for multisensory integration early in the cortical processing hierarchy, in brain regions that were previously assumed to be unisensory.
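The criterion used here can be stated compactly: a region shows superadditive multisensory integration when its bimodal response exceeds the sum of the two unisensory responses. A minimal sketch with hypothetical response values (illustrative numbers, not figures from the study):

```python
# Hypothetical effect sizes (e.g., % BOLD signal change) for the candidate
# STG region; the values below are illustrative assumptions.
resp_aud, resp_som, resp_bi = 0.30, 0.25, 0.70

def superadditive(bimodal, uni_a, uni_b):
    """Additive-model criterion: does the bimodal response exceed the
    sum of the two unisensory responses?"""
    return bimodal > uni_a + uni_b

interaction = resp_bi - (resp_aud + resp_som)  # positive -> superadditive
print(superadditive(resp_bi, resp_aud, resp_som), round(interaction, 2))
```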


2020 ◽  
Vol 30 (8) ◽  
pp. 4410-4423
Author(s):  
You Li ◽  
Carol Seger ◽  
Qi Chen ◽  
Lei Mo

Humans are able to categorize things they encounter in the world (e.g., a cat) by integrating multisensory information from the auditory and visual modalities with ease and speed. However, how the brain learns multisensory categories remains elusive. The present study used functional magnetic resonance imaging to investigate, for the first time, the neural mechanisms underpinning multisensory information-integration (II) category learning. A sensory-modality-general network, including the left insula, right inferior frontal gyrus (IFG), supplementary motor area, left precentral gyrus, bilateral parietal cortex, and right caudate and globus pallidus, was recruited for II categorization, regardless of whether the information came from a single modality or from multiple modalities. Putamen activity was higher in correct categorization than incorrect categorization. Critically, the left IFG and left body and tail of the caudate were activated in multisensory II categorization but not in unisensory II categorization, which suggests this network plays a specific role in integrating multisensory information during category learning. The present results extend our understanding of the role of the left IFG in multisensory processing from the linguistic domain to a broader role in audiovisual learning.
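In information-integration (II) tasks, the category boundary is defined over a combination of dimensions, so no rule on a single dimension can reach ceiling. A toy sketch with hypothetical audiovisual stimulus dimensions illustrates why integration is required:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical stimuli with one auditory and one visual dimension (0-1).
aud = rng.uniform(0, 1, n)
vis = rng.uniform(0, 1, n)

# II rule: category depends on a weighted combination of BOTH dimensions.
labels = (0.5 * aud + 0.5 * vis) > 0.5

# Best single-dimension (auditory-only) rule vs. the integration rule.
uni_acc = np.mean((aud > 0.5) == labels)
ii_acc = np.mean(((0.5 * aud + 0.5 * vis) > 0.5) == labels)

print(ii_acc, round(uni_acc, 2))  # integration is perfect; one dimension is not
```

With uniformly distributed stimuli, the best unidimensional rule caps at roughly 75% correct, which is why II learning is taken to require genuine integration across modalities.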


1994 ◽  
Vol 71 (1) ◽  
pp. 429-432 ◽  
Author(s):  
M. T. Wallace ◽  
B. E. Stein

1. The synthesis of information from different sensory modalities in the superior colliculus is an important precursor of attentive and orientation behavior. 2. This integration of multisensory information is critically dependent on inputs from a small area of association cortex, the anterior ectosylvian sulcus. Removal of these corticotectal influences can have a remarkably specific effect: it can eliminate multisensory integration in superior colliculus neurons while leaving their responses to unimodal cues intact. 3. Apparently, some of the associative functions of cortex are accomplished via its target neurons in the midbrain.


2012 ◽  
Vol 25 (0) ◽  
pp. 198
Author(s):  
Manuel R. Mercier ◽  
John J. Foxe ◽  
Ian C. Fiebelkorn ◽  
John S. Butler ◽  
Theodore H. Schwartz ◽  
...  

Investigations have traditionally focused on activity in the sensory cortices as a function of their respective sensory inputs. However, converging evidence from multisensory research has shown that neural activity in a given sensory region can be modulated by stimulation of other, so-called ancillary, sensory systems. Both electrophysiology and functional imaging support the occurrence of multisensory processing in human sensory cortex, based on the latency of multisensory effects and their precise anatomical localization. Still, due to inherent methodological limitations, direct evidence of the precise mechanisms by which multisensory integration occurs within human sensory cortices is lacking. Using intracranial recordings in epileptic patients undergoing presurgical evaluation, we investigated the neurophysiological basis of multisensory integration in visual cortex. Subdural electrical brain activity was recorded while patients performed a simple detection task on randomly ordered auditory-alone (A), visual-alone (V), and audiovisual (AV) stimuli. We then performed time-frequency analysis: first, we investigated each condition separately to evaluate responses relative to baseline; then, we indexed multisensory integration using both the maximum-criterion model (AV vs. V) and the additive model (AV vs. A + V). Our results show that auditory input significantly modulates neuronal activity in visual cortex by resetting the phase of ongoing oscillatory activity. This in turn leads to multisensory integration when auditory and visual stimuli are presented simultaneously.
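Phase resetting of the kind reported here is typically indexed by inter-trial phase coherence (ITC): if an input resets the phase of ongoing oscillations, phases align across trials after stimulus onset. A minimal sketch on simulated phase distributions (illustrative, not the study's analysis pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 200

# Ongoing oscillation before the auditory event: random phase on each trial.
pre_phases = rng.uniform(0, 2 * np.pi, n_trials)

# After a phase-resetting input: phases cluster around a common value
# (the 0.2 rad spread is an illustrative assumption).
post_phases = rng.normal(0.0, 0.2, n_trials)

def itc(phases):
    """Inter-trial coherence: magnitude of the mean unit phase vector.
    0 = uniformly random phases across trials, 1 = perfectly aligned."""
    return np.abs(np.mean(np.exp(1j * phases)))

print(round(itc(pre_phases), 2), round(itc(post_phases), 2))
```

High post-stimulus ITC without a change in oscillatory power is the usual signature distinguishing phase reset from an additive evoked response.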


2020 ◽  
Vol 375 (1802) ◽  
pp. 20190467 ◽  
Author(s):  
Sara E. Miller ◽  
Michael J. Sheehan ◽  
H. Kern Reeve

Social interactions are mediated by recognition systems, meaning that the cognitive abilities or phenotypic diversity that facilitate recognition may be common targets of social selection. Recognition occurs when a receiver compares the phenotypes produced by a sender with a template. Coevolution between sender and receiver traits has been empirically reported in multiple species and sensory modalities, though the dynamics and relative exaggeration of traits from senders versus receivers have received little attention. Here, we present a coevolutionary dynamic model that examines the conditions under which senders and receivers should invest effort in facilitating individual recognition. The model predicts coevolution of sender and receiver traits, with the equilibrium investment dependent on the relative costs of signal production versus cognition. In order for recognition to evolve, initial sender and receiver trait values must be above a threshold, suggesting that recognition requires some degree of pre-existing diversity and cognitive abilities. The analysis of selection gradients demonstrates that the strength of selection on sender signals and receiver cognition is strongest when the trait values are furthest from the optima. The model provides new insights into the expected strength and dynamics of selection during the origin and elaboration of individual recognition, an important feature of social cognition in many taxa. This article is part of the theme issue ‘Signal detection theory in recognition systems: from evolving models to experimental tests’.
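The qualitative behavior described (cost-dependent equilibria, plus a threshold in initial trait values below which recognition fails to evolve) can be reproduced with a toy gradient-dynamics sketch; the payoff functions below are illustrative assumptions, not the authors' model:

```python
# Toy coevolutionary gradient dynamics for sender signal effort x and
# receiver cognitive effort y. Assumed payoffs: a shared, saturating
# recognition benefit b*(x*y)^2 / (1 + (x*y)^2) minus quadratic costs.
b, c_s, c_r = 1.0, 0.5, 0.5

def step(x, y, dt=0.01):
    u = x * y
    dB = 2 * b * u / (1 + u**2) ** 2  # marginal benefit d(benefit)/d(x*y)
    dx = dB * y - c_s * x             # sender climbs its selection gradient
    dy = dB * x - c_r * y             # receiver climbs its selection gradient
    return x + dt * dx, y + dt * dy

def run(x, y, steps=20000):
    for _ in range(steps):
        x, y = step(x, y)
    return x, y

low = run(0.3, 0.3)   # initial traits below threshold
high = run(0.8, 0.8)  # initial traits above threshold
print([round(v, 2) for v in low], [round(v, 2) for v in high])
```

From initial efforts below the threshold both traits decay toward zero, while above it they converge to a positive equilibrium set by the benefit-to-cost ratios, mirroring the model's prediction that recognition requires some pre-existing diversity and cognitive ability.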


2018 ◽  
Author(s):  
Gareth Harris ◽  
Taihong Wu ◽  
Gaia Linfield ◽  
Myung-Kyu Choi ◽  
He Liu ◽  
...  

Abstract

In the natural environment, animals often encounter multiple sensory cues that are simultaneously present. The nervous system integrates the relevant sensory information to generate behavioral responses that have adaptive value. However, the signal transduction pathways and the molecules that regulate integrated behavioral responses to multiple sensory cues are not well defined. Here, we characterize a collective modulatory basis for a behavioral decision in C. elegans when the animal is presented with an attractive food source together with a repulsive odorant. We show that distributed neuronal components in the worm nervous system and several neuromodulators orchestrate the decision-making process, suggesting that various states and contexts may modulate the multisensory integration. Among these modulators, we identify a new function of a conserved TGF-β pathway that regulates the integrated decision by inhibiting the signaling from a set of central neurons. Interestingly, we find that a common set of modulators, including the TGF-β pathway, regulates the integrated response to the pairing of different foods and repellents. Together, our results provide insights into the modulatory signals regulating multisensory integration and reveal a potential mechanistic basis for the complex pathology underlying defects in multisensory processing shared by common neurological diseases.

Author Summary

The present study characterizes the modulation of a behavioral decision in C. elegans when the worm is presented with a food lawn that is paired with a repulsive smell. We show that multiple sensory neurons and interneurons play roles in making the decision. We also identify several modulatory molecules that are essential for the integrated decision when the animal faces a choice between cues of opposing valence. We further show that many of these factors, which often represent different states and contexts, are common to behavioral decisions that integrate sensory information from different types of foods and repellents. Overall, our results reveal a collective molecular and cellular basis for the integration of simultaneously present attractive and repulsive cues to fine-tune decision-making.


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Xiaoxuan Jia ◽  
Ha Hong ◽  
Jim DiCarlo

Temporal continuity of object identity is a feature of natural visual input and is potentially exploited -- in an unsupervised manner -- by the ventral visual stream to build the neural representation in inferior temporal (IT) cortex. Here we investigated whether plasticity of individual IT neurons underlies the changes in human core object recognition behavior induced by unsupervised visual experience. We built a single-neuron plasticity model, combined with a previously established IT population-to-recognition-behavior linking model, to predict human learning effects. We found that our model, once constrained by neurophysiological data, largely predicted the mean direction, magnitude, and time course of human performance changes. We also found a previously unreported dependency of the observed human performance change on the initial task difficulty. This result adds support to the hypothesis that tolerant core object recognition in human and non-human primates is instructed -- at least in part -- by naturally occurring unsupervised temporal contiguity experience.
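Unsupervised temporal-contiguity learning of this kind is often modeled with a trace rule in the spirit of Földiák (1991): responses are nudged toward a decaying trace of recent activity, so objects that appear close together in time come to evoke similar responses. A toy sketch of that idea (not the paper's fitted single-neuron model):

```python
# Toy trace rule: the response to the currently viewed object is pulled
# toward a decaying trace of recent responses, so temporally paired
# objects converge. All parameter values are illustrative assumptions.
resp = {"A": 1.0, "B": 0.0}  # initial selectivity: the neuron prefers A
trace, decay, lr = 0.0, 0.8, 0.05

# Unsupervised exposure: objects A and B repeatedly appear in close
# temporal succession (the "swap"-style pairing).
for _ in range(200):
    for obj in ("A", "B"):
        trace = decay * trace + (1 - decay) * resp[obj]
        resp[obj] += lr * (trace - resp[obj])

# Selectivity for A over B has collapsed after paired exposure.
print(round(resp["A"], 2), round(resp["B"], 2))
```

The collapse of selectivity for temporally paired objects is the single-neuron signature that, passed through a population-to-behavior linking model, would predict the learned changes in recognition performance.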


2019 ◽  
Author(s):  
Michael J. Crosse ◽  
John J. Foxe ◽  
Sophie Molholm

Children with autism spectrum disorder (ASD) are often impaired in their ability to cope with and process multisensory information, which may contribute to some of the social and communicative deficits prevalent in this population. Amelioration of such deficits in adolescence has been observed for ecologically relevant stimuli such as speech. However, it is not yet known whether this recovery generalizes to the processing of nonsocial stimuli, such as the more basic beeps and flashes typically used in cognitive neuroscience research. We hypothesize that the engagement of different neural processes and a lack of environmental exposure to such artificial stimuli lead to protracted developmental trajectories in both neurotypical (NT) individuals and individuals with ASD, thus delaying the age at which we observe this “catch-up”. Here, we test this hypothesis using a bisensory detection task, measuring human response times to randomly presented auditory, visual, and audiovisual stimuli. By measuring the behavioral gain afforded by an audiovisual signal, we show that the multisensory deficit previously reported in children with ASD recovers in adulthood, by the mid-twenties. In addition, we examine the effects of switching between sensory modalities and show that teenagers with ASD incur less of a behavioral cost than their NT peers. Computational modelling reveals that multisensory information interacts according to different rules in children and adults, and that sensory evidence is weighted differently as well. In ASD, the weighting of sensory information and the allocation of attention during multisensory processing differ from those of NT individuals. Based on our findings, we propose a theoretical framework of multisensory development in NT and ASD individuals.
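The behavioral gain afforded by an audiovisual signal is commonly quantified as the speed-up of redundant-target responses relative to the faster unisensory condition, and true integration (rather than a mere race between modalities) is diagnosed by violations of Miller's race-model inequality. A sketch on simulated response times with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10000

# Hypothetical response-time samples in seconds; means and spreads are
# illustrative assumptions, not data from the study.
rt_a = rng.normal(0.42, 0.06, n)   # auditory alone
rt_v = rng.normal(0.45, 0.06, n)   # visual alone
rt_av = rng.normal(0.35, 0.05, n)  # audiovisual

# Multisensory gain: AV speed-up relative to the faster unisensory mean.
gain = min(rt_a.mean(), rt_v.mean()) - rt_av.mean()

# Miller's race-model inequality: P(RT_AV < t) <= P(RT_A < t) + P(RT_V < t).
# A positive maximum difference (AV CDF above the bound) violates the
# race model and implies integration of the two signals.
t_grid = np.linspace(0.2, 0.6, 81)

def cdf(samples, t):
    return np.mean(samples[:, None] < t[None, :], axis=0)

violation = float(np.max(cdf(rt_av, t_grid) - (cdf(rt_a, t_grid) + cdf(rt_v, t_grid))))
print(round(gain, 3), violation > 0)
```

Comparing the size of the race-model violation across age groups is one standard way such studies track when the multisensory benefit matures or recovers.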

