Acoustic Noise Improves Visual Perception and Modulates Occipital Oscillatory States

2014 ◽  
Vol 26 (4) ◽  
pp. 699-711 ◽  
Author(s):  
Stephanie Gleiss ◽  
Christoph Kayser

Perception is a multisensory process, and previous work has shown that multisensory interactions occur not only for object-related stimuli but also for simplistic and apparently unrelated inputs to the different senses. We here compare the facilitation of visual perception induced by transient (target-synchronized) sounds to the facilitation provided by continuous background noise-like sounds. Specifically, we show that continuous acoustic noise improves visual contrast detection by systematically shifting psychometric curves in an amplitude-dependent manner. This multisensory benefit was found to be both qualitatively and quantitatively similar to that induced by a transient, target-synchronized sound in the same paradigm. Studying the underlying neural mechanisms using electrical neuroimaging (EEG), we found that acoustic noise alters occipital alpha (8–12 Hz) power and decreases beta-band (14–20 Hz) coupling of occipital and temporal sites. Task-irrelevant and continuous sounds thereby have an amplitude-dependent effect on cortical mechanisms implicated in shaping visual cortical excitability. The same oscillatory mechanisms also mediate visual facilitation by transient sounds, and our results suggest that task-related sounds and task-irrelevant background noise can induce perceptually and mechanistically similar enhancements of visual perception. Given the omnipresence of sounds and noises in our environment, such multisensory interactions may affect perception in many everyday scenarios.
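
The reported facilitation amounts to a lateral shift of the contrast-detection psychometric curve. As a purely illustrative sketch (hypothetical data and a cumulative-Gaussian model, not the authors' analysis code), the following snippet shows how such a shift can be quantified by fitting psychometric functions separately for a silence and a noise condition and comparing the fitted thresholds.

```python
# Minimal sketch (not the authors' code): fitting cumulative-Gaussian
# psychometric functions to hypothetical contrast-detection data and
# comparing thresholds with vs. without background acoustic noise.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(contrast, threshold, slope, lapse=0.02):
    """Proportion of 'seen' responses as a function of stimulus contrast."""
    return lapse / 2 + (1 - lapse) * norm.cdf(contrast, loc=threshold, scale=slope)

contrasts = np.array([0.01, 0.02, 0.04, 0.08, 0.16, 0.32])
# Hypothetical hit rates, for illustration only.
p_silence = np.array([0.05, 0.10, 0.30, 0.65, 0.90, 0.98])
p_noise   = np.array([0.08, 0.20, 0.50, 0.80, 0.95, 0.99])

(th_sil, sl_sil), _ = curve_fit(psychometric, contrasts, p_silence, p0=[0.08, 0.05])
(th_noi, sl_noi), _ = curve_fit(psychometric, contrasts, p_noise,   p0=[0.08, 0.05])

# A leftward shift (lower threshold) with noise would index the multisensory benefit.
print(f"threshold in silence: {th_sil:.3f}, with noise: {th_noi:.3f}")
```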

Cephalalgia ◽  
2000 ◽  
Vol 20 (2) ◽  
pp. 74-84 ◽  
Author(s):  
SL McColl ◽  
F Wilkinson

The present study examined the extent to which migraineurs demonstrate interictal visual cortical hyperexcitability as a result of poor inhibitory control in the visual system. We employed a well-established psychophysical measure of inhibition, visual contrast gain control. The task involved detecting a briefly presented target superimposed on a high-contrast masking pattern. The strength of inhibition was assessed by comparing target detection thresholds with and without the operation of gain controls. Migraineurs with and without aura (n = 25 and n = 22, respectively) were compared with participants with no history of migraine (n = 25). Our results do not indicate a loss of inhibition in migraine; the strength of inhibitory feedback contrast gain controls was similar between migraineurs and controls. We did, however, find a statistically greater masking effect in migraineurs compared with controls in the zero-delay condition, suggesting cortical hyperexcitability in migraine. Possible mechanisms of cortical hyperexcitability are discussed in light of the results.
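
The strength of masking in such paradigms is commonly summarized as a threshold elevation, i.e. the detection threshold on a high-contrast mask relative to the unmasked threshold. The sketch below uses hypothetical numbers, not the study's data, purely to illustrate that comparison; the function name and values are assumptions.

```python
# Illustrative sketch (hypothetical numbers, not the study's data): masking
# strength expressed as threshold elevation, i.e. the ratio of the detection
# threshold on a high-contrast mask to the unmasked threshold.
import numpy as np

def threshold_elevation(masked_threshold, unmasked_threshold):
    """Log10 threshold elevation; 0 means the mask had no effect."""
    return np.log10(masked_threshold / unmasked_threshold)

# Hypothetical group-mean contrast thresholds (Michelson contrast).
controls    = threshold_elevation(masked_threshold=0.12, unmasked_threshold=0.03)
migraineurs = threshold_elevation(masked_threshold=0.16, unmasked_threshold=0.03)

# A larger elevation in migraineurs at zero target-mask delay would mirror
# the stronger masking effect reported in the abstract.
print(f"controls: {controls:.2f} log units, migraineurs: {migraineurs:.2f} log units")
```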


2013 ◽  
Vol 25 (5) ◽  
pp. 685-696 ◽  
Author(s):  
Silvia Convento ◽  
Giuseppe Vallar ◽  
Chiara Galantini ◽  
Nadia Bolognini

Merging information derived from different sensory channels allows the brain to amplify minimal signals to reduce their ambiguity, thereby improving the ability of orienting to, detecting, and identifying environmental events. Although multisensory interactions have been mostly ascribed to the activity of higher-order heteromodal areas, multisensory convergence may arise even in primary sensory-specific areas located very early along the cortical processing stream. In three experiments, we investigated early multisensory interactions in lower-level visual areas, by using a novel approach, based on the coupling of behavioral stimulation with two noninvasive brain stimulation techniques, namely, TMS and transcranial direct current stimulation (tDCS). First, we showed that redundant multisensory stimuli can increase visual cortical excitability, as measured by means of phosphene induction by occipital TMS; such physiological enhancement is followed by a behavioral facilitation through the amplification of signal intensity in sensory-specific visual areas. The more sensory inputs are combined (i.e., trimodal vs. bimodal stimuli), the greater are the benefits on phosphene perception. Second, neuroelectrical activity changes induced by tDCS in the temporal and in the parietal cortices, but not in the occipital cortex, can further boost the multisensory enhancement of visual cortical excitability, by increasing the auditory and tactile inputs from temporal and parietal regions, respectively, to lower-level visual areas.
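
One simple way to picture the behavioral side of this design is to compare how often a fixed-intensity occipital TMS pulse evokes a phosphene across stimulation conditions. The sketch below is a hypothetical illustration (made-up trial counts and a plain binomial comparison, not the authors' analysis).

```python
# Minimal sketch under assumptions not stated in the abstract: comparing the
# proportion of TMS pulses that evoke a phosphene across unisensory, bimodal,
# and trimodal stimulation conditions (hypothetical trial counts).
from scipy.stats import binomtest

trials_per_condition = 60
phosphene_reports = {"visual": 21, "visual+auditory": 30, "visual+auditory+tactile": 38}

baseline_rate = phosphene_reports["visual"] / trials_per_condition
for condition, hits in phosphene_reports.items():
    # Test each condition's report rate against the unisensory baseline rate.
    test = binomtest(hits, trials_per_condition, p=baseline_rate, alternative="greater")
    print(f"{condition}: rate={hits / trials_per_condition:.2f}, p={test.pvalue:.3f}")
```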


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Hisato Nakazono ◽  
Katsuya Ogata ◽  
Akinori Takeda ◽  
Emi Yamada ◽  
Shinichiro Oka ◽  
...  

Transcranial alternating current stimulation (tACS) at 20 Hz (β) has been shown to modulate motor evoked potentials (MEPs) when paired with transcranial magnetic stimulation (TMS) in a phase-dependent manner. Repetitive paired-pulse TMS (rPPS) with I-wave periodicity (1.5 ms) induces short-lived facilitation of MEPs. We hypothesized that tACS would modulate the facilitatory effects of rPPS in a frequency- and phase-dependent manner. To test this hypothesis, we investigated the effects of combined tACS and rPPS. We applied rPPS in combination with peak- or trough-phase tACS at 10 Hz (α) or β, or sham tACS (rPPS alone). The facilitatory effects of rPPS in the sham condition were temporary and variable among participants. In the β tACS peak condition, significant increases in single-pulse MEPs persisted for over 30 min after stimulation, and this effect was stable across participants. In contrast, β tACS in the trough condition did not modulate MEPs. Further, neither α tACS condition affected single-pulse MEPs after the intervention. These results suggest that an rPPS-induced increase in trans-synaptic efficacy can be strengthened depending on the β tACS phase, and that this technique could produce long-lasting plasticity of cortical excitability.
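
The phase-locking logic can be illustrated with a small timing calculation: for a 20 Hz tACS sine starting at phase zero, peaks recur every 50 ms at 90° of the cycle and troughs at 270°. The following sketch (assumed parameters, not the authors' stimulation code) computes pulse times locked to either phase.

```python
# Illustrative sketch (assumed parameters, not the authors' code): timing TMS
# pulses to the peak or trough of a 20 Hz tACS waveform. For a sine starting
# at phase 0, peaks fall at 90 deg and troughs at 270 deg of each cycle.
import numpy as np

def pulse_times(tacs_freq_hz, phase_deg, n_pulses, start_cycle=1):
    """Times (s) of TMS pulses locked to a given tACS phase on successive cycles."""
    period = 1.0 / tacs_freq_hz
    cycles = np.arange(start_cycle, start_cycle + n_pulses)
    return cycles * period + (phase_deg / 360.0) * period

peak_times   = pulse_times(tacs_freq_hz=20.0, phase_deg=90.0,  n_pulses=5)
trough_times = pulse_times(tacs_freq_hz=20.0, phase_deg=270.0, n_pulses=5)
print("peak-locked pulses (s):  ", np.round(peak_times, 4))
print("trough-locked pulses (s):", np.round(trough_times, 4))
```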


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Paul VanGilder ◽  
Ying Shi ◽  
Gregory Apker ◽  
Christopher A. Buneo

Although multisensory integration is crucial for sensorimotor function, it is unclear how visual and proprioceptive sensory cues are combined in the brain during motor behaviors. Here we characterized the effects of multisensory interactions on local field potential (LFP) activity obtained from the superior parietal lobule (SPL) as non-human primates performed a reaching task with either unimodal (proprioceptive) or bimodal (visual-proprioceptive) sensory feedback. Based on previous analyses of spiking activity, we hypothesized that evoked LFP responses would be tuned to arm location but would be suppressed on bimodal trials, relative to unimodal trials. We also expected to see a substantial number of recording sites with enhanced beta band spectral power for only one set of feedback conditions (e.g. unimodal or bimodal), as was previously observed for spiking activity. We found that evoked activity and beta band power were tuned to arm location at many individual sites, though this tuning often differed between unimodal and bimodal trials. Across the population, both evoked and beta activity were consistent with feedback-dependent tuning to arm location, while beta band activity also showed evidence of response suppression on bimodal trials. The results suggest that multisensory interactions can alter the tuning and gain of arm position-related LFP activity in the SPL.
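
As a rough illustration of the spectral measure involved, the sketch below estimates beta-band power from a simulated LFP trace with Welch's method; the 13–30 Hz band, sampling rate, and data are assumptions rather than the study's parameters.

```python
# Minimal sketch (simulated LFP, assumed beta band of 13-30 Hz): estimating
# beta-band spectral power per trial with Welch's method, as a basis for
# comparing unimodal and bimodal feedback conditions.
import numpy as np
from scipy.signal import welch

fs = 1000.0                        # sampling rate (Hz), assumed
rng = np.random.default_rng(0)
trial = rng.standard_normal(2000)  # one simulated 2 s LFP trace

def beta_power(lfp, fs, band=(13.0, 30.0)):
    """Mean power spectral density within the beta band."""
    freqs, psd = welch(lfp, fs=fs, nperseg=512)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

print(f"beta-band power (a.u.): {beta_power(trial, fs):.4f}")
```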


1973 ◽  
Vol 5 (1) ◽  
pp. 1-13 ◽  
Author(s):  
Leo M. Chalupa ◽  
William S. Battersby ◽  
Thomas E. Frumkes

2018 ◽  
Vol 119 (2) ◽  
pp. 380-388 ◽  
Author(s):  
Alice Tomassini ◽  
Alessandro D’Ausilio

Movement planning and execution rely on the anticipation and online control of the incoming sensory input. Evidence suggests that sensorimotor processes may synchronize visual rhythmic activity in preparation of action performance. Indeed, we recently reported periodic fluctuations of visual contrast sensitivity that are time-locked to the onset of an intended movement of the arm. However, the origin of the observed visual modulations has so far remained unclear because of the endogenous (and thus temporally undetermined) activation of the sensorimotor system that is associated with voluntary movement initiation. In this study, we activated the sensorimotor circuitry involved in hand control in an exogenous and controlled way by means of peripheral stimulation of the median nerve and characterized the spectrotemporal dynamics of the ensuing visual perception. Stimulation of the median nerve triggers robust and long-lasting (∼1 s) alpha-band oscillations in visual perception, whose strength is temporally modulated in a way that is consistent with the changes in alpha power described at the neurophysiological level after sensorimotor stimulation. These findings provide evidence in support of a causal role of the sensorimotor system in modulating oscillatory activity in visual areas with consequences for visual perception. NEW & NOTEWORTHY This study shows that the peripheral activation of the somatomotor hand system triggers long-lasting alpha periodicity in visual perception. This demonstrates that not only the endogenous sensorimotor processes involved in movement preparation but also the passive stimulation of the sensorimotor system can synchronize visual activity. The present work suggests that oscillation-based mechanisms may subserve core (task independent) sensorimotor integration functions.
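
Behavioral periodicity of this kind is typically assessed by spectral analysis of a time-resolved detection time course. The following sketch uses simulated data (assumed probe spacing and an embedded 10 Hz modulation; not the study's pipeline) to show the basic idea.

```python
# Illustrative sketch (simulated data, not the study's analysis pipeline):
# testing for alpha-band (8-12 Hz) periodicity in a time-resolved visual
# detection time course obtained after median-nerve stimulation.
import numpy as np

dt = 0.01                            # probe spacing in seconds (assumed)
t = np.arange(0.0, 1.0, dt)          # 1 s of post-stimulation probes
rng = np.random.default_rng(1)
# Hypothetical detection-rate time course with an embedded 10 Hz modulation.
detection_rate = (0.5 + 0.05 * np.sin(2 * np.pi * 10 * t)
                  + 0.02 * rng.standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(detection_rate - detection_rate.mean()))
freqs = np.fft.rfftfreq(t.size, d=dt)
alpha = (freqs >= 8) & (freqs <= 12)
peak_freq = freqs[alpha][np.argmax(spectrum[alpha])]
print(f"strongest alpha component at {peak_freq:.1f} Hz")
```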


2021 ◽  
Vol 12 ◽  
Author(s):  
Andrea Ghiani ◽  
Marcello Maniglia ◽  
Luca Battaglini ◽  
David Melcher ◽  
Luca Ronconi

Neurophysiological studies in humans employing magneto- (MEG) and electro- (EEG) encephalography increasingly suggest that oscillatory rhythmic activity of the brain may be a core mechanism for binding sensory information across space, time, and object features to generate a unified perceptual representation. To distinguish whether oscillatory activity is causally related to binding processes or whether, on the contrary, it is a mere epiphenomenon, one possibility is to employ neuromodulatory techniques such as transcranial alternating current stimulation (tACS). tACS has seen rising interest due to its ability to modulate brain oscillations in a frequency-dependent manner. In the present review, we critically summarize current tACS evidence for a causal role of oscillatory activity in spatial, temporal, and feature binding in the context of visual perception. For temporal binding, the emerging picture supports a causal link with the power and the frequency of occipital alpha rhythms (8–12 Hz); however, there is no consistent evidence on the causal role of the phase of occipital tACS. For feature binding, the only study available showed a modulation by occipital alpha tACS. The majority of studies that successfully modulated oscillatory activity and behavioral performance in spatial binding targeted parietal areas, with the main rhythms causally linked being the theta (~7 Hz) and beta (~18 Hz) frequency bands. On the other hand, spatio-temporal binding has been directly modulated by parieto-occipital gamma (~40–60 Hz) and alpha (10 Hz) tACS, suggesting a potential role of cross-frequency coupling when binding across space and time. Nonetheless, negative or partial results have also been observed, suggesting methodological limitations that should be addressed in future research. Overall, the emerging picture seems to support a causal role of brain oscillations in binding processes and, consequently, a certain degree of plasticity for shaping binding mechanisms in visual perception, which, if proven to have long-lasting effects, could find applications in different clinical populations.


2019 ◽  
Vol 14 (7) ◽  
pp. 727-735 ◽  
Author(s):  
Annett Schirmer ◽  
Maria Wijaya ◽  
Esther Wu ◽  
Trevor B Penney

This pre-registered event-related potential study explored how vocal emotions shape visual perception as a function of attention and listener sex. Visual task displays occurred in silence or with a neutral or an angry voice. Voices were task-irrelevant in a single-task block, but had to be categorized by speaker sex in a dual-task block. In the single task, angry voices increased the occipital N2 component relative to neutral voices in women, but not men. In the dual task, angry voices relative to neutral voices increased occipital N1 and N2 components, as well as accuracy, in women and marginally decreased accuracy in men. Thus, in women, vocal anger produced a strong, multifaceted visual enhancement comprising attention-dependent and attention-independent processes, whereas in men, it produced a small, behavior-focused visual processing impairment that was strictly attention-dependent. In sum, these data indicate that attention and listener sex critically modulate whether and how vocal emotions shape visual perception.
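
Component amplitudes in such designs are commonly quantified as the mean voltage in a latency window over occipital electrodes. The sketch below uses simulated epochs and assumed N1/N2 windows (the abstract does not specify them) purely to illustrate the measurement.

```python
# Minimal sketch (simulated epochs, assumed component windows): quantifying
# occipital N1 (~150-200 ms) and N2 (~250-350 ms) amplitudes as the mean
# voltage within a latency window, averaged over trials, per voice condition.
import numpy as np

fs = 500.0                              # sampling rate (Hz), assumed
times = np.arange(-0.1, 0.6, 1.0 / fs)  # epoch from -100 to 600 ms
rng = np.random.default_rng(2)
epochs = {cond: rng.standard_normal((40, times.size))  # 40 simulated trials each
          for cond in ("angry", "neutral")}

def mean_amplitude(data, times, window):
    """Mean ERP amplitude (trials x samples) within a latency window (s)."""
    mask = (times >= window[0]) & (times <= window[1])
    return data[:, mask].mean()

for cond, data in epochs.items():
    n1 = mean_amplitude(data, times, (0.150, 0.200))
    n2 = mean_amplitude(data, times, (0.250, 0.350))
    print(f"{cond}: N1={n1:.3f} (a.u.), N2={n2:.3f} (a.u.)")
```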


1995 ◽  
Vol 73 (4) ◽  
pp. 1341-1354 ◽  
Author(s):  
G. Sary ◽  
R. Vogels ◽  
G. Kovacs ◽  
G. A. Orban

1. We recorded from neurons responsive to gratings in the inferior temporal (IT) cortices of macaque monkeys. One of the monkeys performed an orientation discrimination task; the other maintained fixation during stimulus presentation. Stimuli consisted of gratings based on discontinuities in luminance, relative motion, and texture. 2. IT cells responded well to gratings defined solely by relative motion, implying either direct or indirect motion input into IT, an area that is part of the ventral visual cortical pathway. 3. Response strength in general did not depend on the cue used to define the gratings. Latency values observed for the two static grating types (luminance- and texture-defined gratings) were similar, but significantly shorter than those measured for the kinetic gratings. 4. Stimulus orientation had a significant effect in 27%, 27%, and 9% of the cells tested with luminance-, kinetic-, and texture-defined gratings, respectively. 5. Only a small proportion of cells were orientation sensitive for more than one defining cue. The average preferred orientation for luminance and kinetic gratings matched; the tuning width was similar for the two cues. 6. Our results indicate that IT cells may contribute to cue-invariant coding of boundaries and edges. We discuss the relevance of these results to visual perception.
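
Orientation preference and the presence of an orientation effect can be estimated from trial-wise responses; the sketch below (hypothetical firing rates, not the recorded data) illustrates one common approach using circular vector averaging and a one-way ANOVA.

```python
# Illustrative sketch (hypothetical firing rates): estimating a cell's preferred
# orientation by circular vector averaging (orientation is 180-deg periodic) and
# testing for an orientation effect with a one-way ANOVA across orientations.
import numpy as np
from scipy.stats import f_oneway

orientations = np.array([0, 45, 90, 135])          # degrees
rng = np.random.default_rng(3)
# Hypothetical spike counts: trials x orientations.
rates = rng.poisson(lam=[5, 12, 20, 11], size=(10, 4))

mean_rates = rates.mean(axis=0)
# Double the angles so that 0 and 180 deg map onto the same orientation.
vector = np.sum(mean_rates * np.exp(2j * np.deg2rad(orientations)))
preferred = (np.rad2deg(np.angle(vector)) / 2) % 180

f_stat, p_value = f_oneway(*[rates[:, i] for i in range(len(orientations))])
print(f"preferred orientation: {preferred:.1f} deg, orientation effect p={p_value:.4f}")
```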


2013 ◽  
Vol 26 (5) ◽  
pp. 483-502 ◽  
Author(s):  
Antonia Thelen ◽  
Micah M. Murray

This review article summarizes evidence that multisensory experiences at one point in time have long-lasting effects on subsequent unisensory visual and auditory object recognition. The efficacy of single-trial exposure to task-irrelevant multisensory events lies in its ability to modulate memory performance and brain activity to unisensory components of these events presented later in time. Object recognition (either visual or auditory) is enhanced if the initial multisensory experience had been semantically congruent and can be impaired if this multisensory pairing was either semantically incongruent or entailed meaningless information in the task-irrelevant modality, when compared to objects encountered exclusively in a unisensory context. Processes active during encoding cannot straightforwardly explain these effects; performance on all initial presentations was indistinguishable despite leading to opposing effects with stimulus repetitions. Brain responses to unisensory stimulus repetitions differ during early processing stages (∼100 ms post-stimulus onset) according to whether or not they had been initially paired in a multisensory context. Moreover, the network exhibiting differential responses varies according to whether memory performance is enhanced or impaired. The collective findings we review indicate that multisensory associations formed via single-trial learning exert influences on later unisensory processing to promote distinct object representations that manifest as differentiable brain networks whose activity is correlated with memory performance. These influences occur incidentally, despite many intervening stimuli, and are distinguishable from the encoding/learning processes during the formation of the multisensory associations. The consequences of multisensory interactions thus persist over time to impact memory retrieval and object discrimination.

