Attention controls multisensory perception via 2 distinct mechanisms at different levels of the cortical hierarchy

PLoS Biology ◽  
2021 ◽  
Vol 19 (11) ◽  
pp. e3001465
Author(s):  
Ambra Ferrari ◽  
Uta Noppeney

To form a percept of the multisensory world, the brain needs to integrate signals from common sources weighted by their reliabilities and segregate those from independent sources. Previously, we have shown that anterior parietal cortices combine sensory signals into representations that take into account the signals’ causal structure (i.e., common versus independent sources) and their sensory reliabilities as predicted by Bayesian causal inference. The current study asks to what extent and how attentional mechanisms can actively control how sensory signals are combined for perceptual inference. In a pre- and postcueing paradigm, we presented observers with audiovisual signals at variable spatial disparities. Observers were precued to attend to auditory or visual modalities prior to stimulus presentation and postcued to report their perceived auditory or visual location. Combining psychophysics, functional magnetic resonance imaging (fMRI), and Bayesian modelling, we demonstrate that the brain moulds multisensory inference via 2 distinct mechanisms. Prestimulus attention to vision enhances the reliability and influence of visual inputs on spatial representations in visual and posterior parietal cortices. Poststimulus report determines how parietal cortices flexibly combine sensory estimates into spatial representations consistent with Bayesian causal inference. Our results show that distinct neural mechanisms control how signals are combined for perceptual inference at different levels of the cortical hierarchy.
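At the core of this model is reliability-weighted integration: each signal's influence on the combined estimate scales with its inverse variance, which is how prestimulus attention to vision can boost the visual weight. A minimal Python sketch of the forced-fusion step; the locations and noise levels are illustrative assumptions, not the study's values:

```python
def fuse(x_aud, x_vis, sigma_aud, sigma_vis):
    """Combine auditory and visual location estimates weighted by
    their reliabilities (inverse variances)."""
    r_aud, r_vis = 1 / sigma_aud**2, 1 / sigma_vis**2
    w_vis = r_vis / (r_aud + r_vis)  # attention to vision would raise r_vis
    return w_vis * x_vis + (1 - w_vis) * x_aud

# A reliable visual signal dominates the fused location estimate:
print(fuse(x_aud=10.0, x_vis=0.0, sigma_aud=8.0, sigma_vis=2.0))  # ~0.59 deg
```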

Neuroforum ◽  
2018 ◽  
Vol 24 (4) ◽  
pp. A169-A181
Author(s):  
Uta Noppeney ◽  
Samuel A. Jones ◽  
Tim Rohe ◽  
Ambra Ferrari

Abstract Our senses are constantly bombarded with a myriad of signals. To make sense of this cacophony, the brain needs to integrate signals emanating from a common source, but segregate signals originating from different sources. Thus, multisensory perception relies critically on inferring the world’s causal structure (i.e., one common vs. multiple independent sources). Behavioural research has shown that the brain arbitrates between sensory integration and segregation consistent with the principles of Bayesian Causal Inference. At the neural level, recent functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) studies have shown that the brain accomplishes Bayesian Causal Inference by dynamically encoding multiple perceptual estimates across the sensory processing hierarchies. Only at the top of the hierarchy, in anterior parietal cortices, did the brain form perceptual estimates that take into account the observer’s uncertainty about the world’s causal structure, consistent with Bayesian Causal Inference.
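The causal inference step can be made concrete numerically: compare the likelihood of the two measurements under one common source versus two independent sources, then weight by a prior probability of a common cause. A sketch under illustrative parameter values, not fitted values from the studies reviewed:

```python
import numpy as np
from scipy.stats import norm

def posterior_common(x_a, x_v, sig_a=8.0, sig_v=2.0, sig_p=20.0, p_common=0.5):
    """p(C=1 | x_a, x_v): posterior probability that auditory and visual
    measurements arose from one common source (numeric marginalisation)."""
    s = np.linspace(-80, 80, 3201)             # candidate source locations (deg)
    ds = s[1] - s[0]
    prior = norm.pdf(s, 0, sig_p)
    lik_a, lik_v = norm.pdf(x_a, s, sig_a), norm.pdf(x_v, s, sig_v)
    l_c1 = np.sum(lik_a * lik_v * prior) * ds  # one source generates both signals
    l_c2 = np.sum(lik_a * prior) * ds * np.sum(lik_v * prior) * ds  # two sources
    return l_c1 * p_common / (l_c1 * p_common + l_c2 * (1 - p_common))

print(posterior_common(x_a=2.0, x_v=0.0))   # small disparity: integration favoured
print(posterior_common(x_a=30.0, x_v=0.0))  # large disparity: segregation favoured
```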


2018 ◽  
Author(s):  
Máté Aller ◽  
Uta Noppeney

Abstract To form a percept of the environment, the brain needs to solve the binding problem – inferring whether signals come from a common cause and should be integrated, or come from independent causes and should be segregated. Behaviourally, humans solve this problem near-optimally, as predicted by Bayesian Causal Inference, but the neural mechanisms remain unclear. Combining Bayesian modelling, electroencephalography (EEG), and multivariate decoding in an audiovisual spatial localization task, we show that the brain accomplishes Bayesian Causal Inference by dynamically encoding multiple spatial estimates. Initially, auditory and visual signal locations are estimated independently; next, an estimate is formed that combines information from vision and audition. Yet it is only from 200 ms onwards that the brain integrates audiovisual signals, weighted by their bottom-up sensory reliabilities and top-down task relevance, into spatial priority maps that guide behavioural responses. Critically, as predicted by Bayesian Causal Inference, these spatial priority maps take into account the brain’s uncertainty about the world’s causal structure and flexibly arbitrate between sensory integration and segregation. The dynamic evolution of perceptual estimates thus reflects the hierarchical nature of Bayesian Causal Inference, a statistical computation crucial for effective interactions with the environment.
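Claims like "from 200 ms onwards" typically rest on time-resolved multivariate decoding: train a decoder on the channel pattern at each time point and track when stimulus information emerges. A self-contained sketch on synthetic data; the dimensions, signal strength, and onset sample are assumptions for illustration:

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 50
locations = rng.uniform(-10, 10, n_trials)         # true stimulus azimuths (deg)
eeg = rng.standard_normal((n_trials, n_channels, n_times))
eeg[:, :8, 25:] += locations[:, None, None] * 0.1  # location signal appears late
                                                   # (toy analogue of ~200 ms onset)
# Cross-validated decoding accuracy (R^2) at each time point:
scores = [cross_val_score(RidgeCV(), eeg[:, :, t], locations, cv=5).mean()
          for t in range(n_times)]
print(f"peak decoding R^2 = {max(scores):.2f} at sample {int(np.argmax(scores))}")
```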


2020 ◽  
Vol 33 (4-5) ◽  
pp. 383-416 ◽  
Author(s):  
Arianna Zuanazzi ◽  
Uta Noppeney

Abstract Attention (i.e., task relevance) and expectation (i.e., signal probability) are two critical top-down mechanisms guiding perceptual inference. Attention prioritizes processing of information that is relevant for observers’ current goals. Prior expectations encode the statistical structure of the environment. Research to date has mostly conflated spatial attention and expectation. Most notably, the Posner cueing paradigm manipulates spatial attention using probabilistic cues that indicate where the subsequent stimulus is likely to be presented. Only recently have studies attempted to dissociate the mechanisms of attention and expectation and characterized their interactive (i.e., synergistic) or additive influences on perception. In this review, we will first discuss methodological challenges that are involved in dissociating the mechanisms of attention and expectation. Second, we will review research that was designed to dissociate attention and expectation in the unisensory domain. Third, we will review the broad field of crossmodal endogenous and exogenous spatial attention that investigates the impact of attention across the senses. This raises the critical question of whether attention relies on amodal or modality-specific mechanisms. Fourth, we will discuss recent studies investigating the role of both spatial attention and expectation in multisensory perception, where the brain constructs a representation of the environment based on multiple sensory inputs. We conclude that spatial attention and expectation are closely intertwined in almost all circumstances of everyday life. Yet, despite their intimate relationship, attention and expectation rely on partly distinct neural mechanisms: while attentional resources are mainly shared across the senses, expectations can be formed in a modality-specific fashion.
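One way to dissociate the two factors, in the spirit of the designs this review discusses, is to manipulate them orthogonally: which hemifield is response-relevant (attention) versus which hemifield carries most stimuli (expectation). A toy trial-generation sketch; block sizes and probabilities are illustrative assumptions:

```python
import random
random.seed(1)

def make_block(attended="left", expected="left", n=100, p_expected=0.75):
    """One block: the 'attended' side is response-relevant (attention),
    while the 'expected' side carries 75% of the stimuli (expectation)."""
    other = {"left": "right", "right": "left"}
    trials = []
    for _ in range(n):
        side = expected if random.random() < p_expected else other[expected]
        trials.append({"side": side, "respond": side == attended})
    return trials

# Crossing the factors yields attended-but-unexpected and
# expected-but-unattended locations, which a classic Posner cue conflates:
for att in ("left", "right"):
    for exp in ("left", "right"):
        n_resp = sum(t["respond"] for t in make_block(att, exp))
        print(f"attend {att}, expect {exp}: {n_resp} response-relevant trials")
```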


2021 ◽  
Author(s):  
Jean-Paul Noel ◽  
Sabyasachi Shivkumar ◽  
Kalpana Dokka ◽  
Ralf Haefner ◽  
Dora Angelaki

Autism Spectrum Disorder (ASD) is characterized by a panoply of social, communicative, and sensory anomalies. As such, a central goal of computational psychiatry is to ascribe the heterogeneous phenotypes observed in ASD to a limited set of canonical computations that may have gone awry in the disorder. Here, we posit causal inference – the process of inferring a causal structure linking sensory signals to hidden world causes – as one such computation. We show that (i) audio-visual integration is intact in ASD and in line with optimal models of cue combination, yet (ii) multisensory behavior is anomalous in ASD because these individuals operate under an internal model favoring integration (vs. segregation). Paradoxically, during explicit reports of common cause across spatial or temporal disparities, individuals with ASD were less, and not more, likely to report common cause. Formal model fitting highlighted alterations in both the prior probability for common cause (p-common) and choice biases, which are dissociable in implicit, but not explicit, causal inference tasks. Together, this pattern of results suggests (i) distinct internal models in attributing world causes to sensory signals in ASD relative to neurotypical individuals given identical sensory cues, and (ii) the presence of an explicit compensatory mechanism in ASD, with these individuals having learned to compensate for their bias to integrate in their explicit reports.
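The p-common fit described here can be sketched as maximum-likelihood estimation over explicit same-source reports across disparities. The report counts below are fabricated for illustration, and the observer is assumed to report "common cause" in proportion to the model posterior:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def posterior_common(disp, pc, sa=8.0, sv=2.0, sp=20.0):
    """p(C=1 | measurements) for signals placed at +/- disp/2 (numeric marginal)."""
    s = np.linspace(-80, 80, 3201); ds = s[1] - s[0]
    la, lv = norm.pdf(disp / 2, s, sa), norm.pdf(-disp / 2, s, sv)
    pr = norm.pdf(s, 0, sp)
    l1 = np.sum(la * lv * pr) * ds
    l2 = np.sum(la * pr) * ds * np.sum(lv * pr) * ds
    return l1 * pc / (l1 * pc + l2 * (1 - pc))

disparities = np.array([0.0, 5.0, 10.0, 20.0, 30.0])   # audiovisual offsets (deg)
n_trials = 40
n_common = np.array([38, 33, 24, 10, 4])               # fabricated "same source" counts

def neg_log_lik(pc):
    p = np.clip([posterior_common(d, pc) for d in disparities], 1e-9, 1 - 1e-9)
    return -np.sum(n_common * np.log(p) + (n_trials - n_common) * np.log(1 - p))

fit = minimize_scalar(neg_log_lik, bounds=(0.01, 0.99), method="bounded")
print(f"fitted p-common = {fit.x:.2f}")  # lower values favour segregation
```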


2018 ◽  
Author(s):  
Yinan Cao ◽  
Christopher Summerfield ◽  
Hame Park ◽  
Bruno L. Giordano ◽  
Christoph Kayser

When combining information across different senses, humans need to flexibly select cues of a common origin whilst avoiding distraction from irrelevant inputs. The brain could solve this challenge using a hierarchical principle: rapidly deriving a fused sensory estimate for computational expediency and, later and if required, filtering out irrelevant signals based on the inferred sensory cause(s). Analysing time- and source-resolved human magnetoencephalographic data, we unveil a systematic spatio-temporal cascade of the relevant computations, starting with early segregated unisensory representations, continuing with sensory fusion in parietal-temporal regions, and culminating in causal inference in the frontal lobe. Our results reconcile previous computational accounts of multisensory perception by showing that prefrontal cortex guides flexible integrative behaviour based on candidate representations established in sensory and association cortices, thereby framing multisensory integration in the generalised context of adaptive behaviour.
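The cascade described here maps onto two read-out rules from the causal-inference literature: early fusion resembles the forced-fusion estimate, while the later filtering stage resembles weighting by, or committing to, the inferred causal structure. A minimal sketch contrasting the two rules; the numbers are arbitrary:

```python
def model_averaging(s_fused, s_segregated, p_c1):
    """Blend both estimates, weighted by the posterior of a common cause."""
    return p_c1 * s_fused + (1 - p_c1) * s_segregated

def model_selection(s_fused, s_segregated, p_c1):
    """Commit to the estimate of the more probable causal structure."""
    return s_fused if p_c1 > 0.5 else s_segregated

for p_c1 in (0.9, 0.4):
    print(p_c1, model_averaging(2.0, 8.0, p_c1), model_selection(2.0, 8.0, p_c1))
```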


2018 ◽  
Author(s):  
Tim Rohe ◽  
Ann-Christine Ehlis ◽  
Uta Noppeney

Abstract Transforming the barrage of sensory signals into a coherent multisensory percept relies on solving the binding problem – deciding whether signals come from a common cause and should be integrated, or instead be segregated. Human observers typically arbitrate between integration and segregation consistent with Bayesian Causal Inference, but the neural mechanisms remain poorly understood. We presented observers with audiovisual sequences that varied in the number of flashes and beeps. Combining Bayesian modelling and EEG representational similarity analyses, we show that the brain initially represents the number of flashes and beeps, and their numeric disparity, mainly independently. Later, it computes the final numeric estimates by averaging the forced-fusion and segregation estimates, weighted by the probabilities of the common- and independent-cause models (i.e., model averaging). Crucially, prestimulus oscillatory alpha power and phase correlate with observers’ prior beliefs about the world’s causal structure that guide their arbitration between sensory integration and segregation.
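Representational similarity analysis, the method named here, compares pattern dissimilarities across conditions between the neural data and each model's predicted estimates. A compact sketch on synthetic data; the condition count, noise model, and variable names are assumptions:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_conditions, n_channels = 16, 64
model_estimates = rng.uniform(0, 4, n_conditions)[:, None]  # e.g. predicted numerosity
eeg_patterns = model_estimates * 0.5 + rng.standard_normal((n_conditions, n_channels))

model_rdm = pdist(model_estimates, metric="euclidean")    # condition-pair dissimilarity
neural_rdm = pdist(eeg_patterns, metric="correlation")
rho, p = spearmanr(model_rdm, neural_rdm)                 # model-neural RDM agreement
print(f"model-neural RDM correlation: rho = {rho:.2f}")
```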


Author(s):  
Bruno and

Synaesthesia is a curious anomaly of multisensory perception. When presented with stimulation in one sensory channel, in addition to the percept usually associated with that channel (the inducer), a true synaesthete experiences a second percept in another perceptual modality (the concurrent). Although synaesthesia is not pathological, true synaesthetes are relatively rare and their synaesthetic associations tend to be quite idiosyncratic. For this reason, studying synaesthesia is difficult, but exciting new experimental results are beginning to clarify what makes the brain of synaesthetes special and the mechanisms that may produce the condition. Even more importantly, the related phenomenon of ‘natural’ crossmodal associations is experienced by everyone, providing another useful domain for studying multisensory interactions, with important implications for understanding our preferences for products in terms of spontaneously evoked associations, as well as for choosing appropriate names, labels, and packaging in marketing applications.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Zakaria Djebbara ◽  
Lars Brorson Fich ◽  
Klaus Gramann

Abstract Action is a medium for collecting sensory information about the environment, which in turn is shaped by architectural affordances. Affordances characterize the fit between the physical structure of the body and capacities for movement and interaction with the environment, thus relying on sensorimotor processes associated with exploring the surroundings. Central to sensorimotor brain dynamics, the attentional mechanisms directing the gating function for sensory signals share neuronal resources with motor-related processes necessary for inferring the external causes of sensory signals. Such a predictive coding approach suggests that sensorimotor dynamics are sensitive to architectural affordances that support or suppress specific kinds of actions for an individual. However, how architectural affordances relate to the attentional mechanisms underlying the gating function for sensory signals remains unknown. Here we demonstrate that event-related desynchronization of alpha-band oscillations in parieto-occipital and medio-temporal regions covaries with architectural affordances. Source-level time–frequency analysis of data recorded in a motor-priming Mobile Brain/Body Imaging experiment revealed strong event-related desynchronization of the alpha band originating from the posterior cingulate complex, the parahippocampal region, and the occipital cortex. First, our results contribute to the understanding of how the brain resolves architectural affordances relevant to behaviour. Second, they indicate that alpha-band activity originating from the occipital cortex and parahippocampal region covaries with architectural affordances before participants interact with the environment, whereas during the interaction, the posterior cingulate cortex and motor areas dynamically reflect the affordable behaviour. We conclude that sensorimotor dynamics reflect behaviour-relevant features of the designed environment.
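Event-related desynchronisation (ERD) of alpha is conventionally quantified as the percentage drop of band-limited power relative to a pre-event baseline. A self-contained sketch on a simulated signal; the sampling rate, epoch timing, and suppression factor are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                   # sampling rate (Hz)
t = np.arange(-1.0, 2.0, 1 / fs)           # one epoch, event at t = 0
alpha = np.sin(2 * np.pi * 10 * t)         # 10 Hz alpha rhythm
alpha[t > 0] *= 0.4                        # simulated post-event alpha suppression
signal = alpha + 0.5 * np.random.default_rng(0).standard_normal(t.size)

b, a = butter(4, [8, 13], btype="bandpass", fs=fs)
power = np.abs(hilbert(filtfilt(b, a, signal))) ** 2   # instantaneous alpha power
baseline = power[(t >= -0.8) & (t <= -0.2)].mean()
erd = 100 * (power - baseline) / baseline  # ERD%: negative = desynchronisation
print(f"mean post-event ERD: {erd[t > 0.2].mean():.0f}%")
```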


2020 ◽  
Vol 319 (3) ◽  
pp. R366-R375
Author(s):  
Hugo F. Posada-Quintero ◽  
Youngsun Kong ◽  
Kimberly Nguyen ◽  
Cara Tran ◽  
Luke Beardslee ◽  
...  

We have tested the feasibility of thermal grills, a harmless method of inducing pain. The thermal grills consist of interlaced tubes set at cool or warm temperatures; when the cool and warm stimuli are presented together, they create a painful “illusion” in the brain without causing any tissue injury. Progress in objective pain assessment research is limited because the gold standard, the self-reported pain scale, is highly subjective and only works for alert and cooperative patients; a further difficulty for pain studies is the potential harm caused to participants. We recruited 23 subjects and applied electrical pulses and thermal grill (TG) stimulation. The TG effectively induced three different levels of pain, as evidenced by the visual analog scale (VAS) ratings provided by the subjects after each stimulus. Furthermore, objective physiological measurements based on electrodermal activity increased significantly as the stimulation level increased. We found that VAS ratings were highly correlated with the TG stimulation level. The TG stimulation safely elicited pain levels of up to 9 out of 10, allowing studies of pain to be extended to ranges in which other stimuli would be harmful.
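The reported VAS-to-stimulation-level relationship is the kind of monotone association a rank correlation captures. A toy sketch with fabricated ratings, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

levels = np.repeat([1, 2, 3], 8)                 # three TG stimulation levels
vas = np.array([2, 3, 2, 3, 2, 4, 3, 2,          # fabricated VAS ratings (0-10)
                5, 4, 6, 5, 5, 6, 4, 5,
                8, 7, 9, 8, 7, 8, 9, 8])
rho, p = spearmanr(levels, vas)                  # rank correlation across trials
print(f"Spearman rho = {rho:.2f}, p = {p:.1e}")
```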


2019 ◽  
Author(s):  
Ulrik Beierholm ◽  
Tim Rohe ◽  
Ambra Ferrari ◽  
Oliver Stegle ◽  
Uta Noppeney

Abstract To form the most reliable percept of the environment, the brain needs to represent sensory uncertainty. Current theories of perceptual inference assume that the brain computes sensory uncertainty instantaneously and independently for each stimulus. In a series of psychophysics experiments, human observers localized auditory signals that were presented in synchrony with spatially disparate visual signals. Critically, the visual noise changed dynamically over time, with or without intermittent jumps. Our results show that observers integrate audiovisual inputs weighted by sensory reliability estimates that combine information from past and current signals, as predicted by an optimal Bayesian learner or approximate strategies of exponential discounting. Our results challenge classical models of perceptual inference in which sensory uncertainty estimates depend only on the current stimulus. They demonstrate that the brain capitalizes on the temporal dynamics of the external world and estimates sensory uncertainty by combining past experiences with new incoming sensory signals.
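The exponential-discounting strategy mentioned here has a very compact form: track sensory variance as a leaky running average of squared noise samples, so recent evidence counts more than older evidence. A sketch with an abrupt reliability jump; the discounting factor and noise trace are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
true_sigma = np.concatenate([np.full(100, 2.0), np.full(100, 8.0)])  # noise jump
errors = rng.standard_normal(200) * true_sigma   # per-trial visual noise samples

lam = 0.9                                        # discounting factor (memory ~10 trials)
var_hat = np.empty(200)
v = errors[0] ** 2
for i, e in enumerate(errors):
    v = lam * v + (1 - lam) * e**2               # exponentially discounted variance
    var_hat[i] = v

print(f"estimated sigma before jump: {np.sqrt(var_hat[90:100].mean()):.1f}")
print(f"estimated sigma after jump:  {np.sqrt(var_hat[190:200].mean()):.1f}")
```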

