Spatial attention enhances cortical tracking of quasi-rhythmic visual stimuli

2019
Author(s):
D. Tabarelli
C. Keitel
J. Gross
D. Baldauf

Abstract: Successfully interpreting and navigating our natural visual environment requires us to track its dynamics constantly. Additionally, we focus our attention on behaviorally relevant stimuli to enhance their neural processing. Little is known, however, about how sustained attention affects the ongoing tracking of stimuli with rich natural temporal dynamics. Here, we used MRI-informed source reconstructions of magnetoencephalography (MEG) data to map the extent to which various cortical areas track concurrent, continuous, quasi-rhythmic visual stimulation. Further, we tested how top-down visuo-spatial attention influences this tracking process. Our bilaterally presented quasi-rhythmic stimuli covered a dynamic range of 4–20 Hz, subdivided into three distinct bands. As an experimental control, we also included strictly rhythmic stimulation (10 vs. 12 Hz). Using a spectral measure of brain-stimulus coupling, we were able to track the neural processing of left and right stimuli independently, even when both fluctuated within the same frequency range. The fidelity of neural tracking depended on the stimulation frequencies, decreasing for higher frequency bands. Both attended and non-attended stimuli were tracked beyond early visual cortices, in ventral and dorsal streams depending on stimulus frequency. In general, tracking improved with the deployment of visuo-spatial attention to the stimulus location. Our results provide new insights into how human visual cortices process concurrent dynamic stimuli and suggest a potential mechanism, namely increased temporal precision of tracking, for boosting the neural representation of attended input.
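The abstract does not name the spectral coupling measure, but magnitude-squared coherence between the stimulus time course and the neural signal is one common choice. A minimal sketch with simulated data, using `scipy.signal.coherence`; the sampling rate, quasi-rhythmic waveform, and coupling strength are all invented for illustration:

```python
import numpy as np
from scipy.signal import coherence

fs = 250                       # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)   # 60 s of simulated recording
rng = np.random.default_rng(0)

# Quasi-rhythmic stimulus: instantaneous frequency wanders within 4-7 Hz
inst_freq = 5.5 + 1.5 * np.sin(2 * np.pi * 0.1 * t)
phase = 2 * np.pi * np.cumsum(inst_freq) / fs
stimulus = np.sin(phase)

# Simulated "neural" signal: attenuated copy of the stimulus buried in noise
brain = 0.3 * stimulus + rng.standard_normal(t.size)

f, coh = coherence(stimulus, brain, fs=fs, nperseg=2 * fs)

band = (f >= 4) & (f <= 7)       # stimulation band
outside = (f >= 15) & (f <= 20)  # control band
print(coh[band].mean() > coh[outside].mean())  # True: coupling concentrates in the driven band
```

Because coherence is computed per frequency bin, the left and right stimuli can be tracked independently as long as their instantaneous trajectories differ, which matches the independent-tracking claim above.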

2011
Vol 105 (2)
pp. 674-686
Author(s):
Tetsuo Kida
Koji Inui
Emi Tanaka
Ryusuke Kakigi

Numerous studies have demonstrated effects of spatial attention within single sensory modalities (within-modal spatial attention) and of directing attention to one sense rather than the others (intermodal attention) on cortical neuronal activity. Furthermore, recent studies have revealed that the effects of spatial attention directed to a certain location in one sense spread to the other senses at the same location in space (cross-modal spatial attention). The present study used magnetoencephalography to examine the temporal dynamics of the effects of within-modal spatial, cross-modal spatial, and intermodal attention on cortical processes responsive to visual stimuli. Visual or tactile stimuli were randomly presented on the left or right side at a random interstimulus interval, and subjects directed attention to the left or right when vision or touch was the task-relevant modality. Sensor-space analysis showed that a response around the occipitotemporal region at around 150 ms after visual stimulation was significantly enhanced by within-modal, cross-modal spatial, and intermodal attention. A later response over the right frontal region at around 200 ms was enhanced by within-modal spatial and intermodal attention, but not by cross-modal spatial attention. These effects were estimated to originate from the occipitotemporal and lateral frontal areas, respectively. Thus, the results suggest different spatiotemporal dynamics for the neural representations of cross-modal attention and of intermodal or within-modal attention.


2017
Author(s):
Amra Covic
Christian Keitel
Emanuele Porcu
Erich Schröger
Matthias M Müller

Abstract: The neural processing of a visual stimulus can be facilitated by attending to its position or by a co-occurring auditory tone. Using frequency-tagging, we investigated whether facilitation by spatial attention and by audio-visual synchrony rely on similar neural processes. Participants attended to one of two flickering Gabor patches (14.17 and 17 Hz) located in opposite lower visual fields. The Gabor patches further "pulsed" (i.e., showed smooth spatial frequency variations) at distinct rates (3.14 and 3.63 Hz). Frequency-modulating an auditory stimulus at the pulse rate of one of the visual stimuli established audio-visual synchrony. Flicker and pulsed stimulation elicited stimulus-locked rhythmic electrophysiological brain responses that allowed us to track the neural processing of simultaneously presented stimuli. These steady-state responses (SSRs) were quantified in the spectral domain to examine visual stimulus processing under conditions of synchronous vs. asynchronous tone presentation and when the respective stimulus positions were attended vs. unattended. Strikingly, distinct patterns of effects on pulse- and flicker-driven SSRs indicated that spatial attention and audio-visual synchrony facilitated early visual processing in parallel and via different cortical processes. Attention effects resembled the classical top-down gain effect, facilitating both flicker- and pulse-driven SSRs. Audio-visual synchrony, in turn, only amplified the synchrony-producing stimulus aspects (i.e., pulse-driven SSRs), possibly highlighting the role of temporally co-occurring sights and sounds in bottom-up multisensory integration.
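To illustrate how tagged SSRs are "quantified in the spectral domain", the sketch below simulates a signal containing the two flicker tags from the study and reads out the amplitude at each tagged FFT bin relative to neighbouring bins (a common SNR-style quantification). Everything except the tag frequencies (14.17 and 17 Hz) is an invented simulation parameter:

```python
import numpy as np

fs, dur = 500.0, 100.0          # assumed sampling rate and epoch length (s)
t = np.arange(0, dur, 1 / fs)   # dur chosen so the tags fall on exact FFT bins
rng = np.random.default_rng(1)

# Simulated recording: two tagged SSRs plus broadband noise
eeg = (0.8 * np.sin(2 * np.pi * 14.17 * t)
       + 0.5 * np.sin(2 * np.pi * 17.0 * t)
       + rng.standard_normal(t.size))

spec = np.abs(np.fft.rfft(eeg)) / t.size * 2   # amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)

results = []
for f_tag in (14.17, 17.0):
    i = int(np.argmin(np.abs(freqs - f_tag)))
    # SNR: tagged bin vs. surrounding noise bins (adjacent bins excluded)
    neighbours = np.r_[spec[i - 11:i - 1], spec[i + 2:i + 12]]
    results.append(spec[i] / neighbours.mean())

print([r > 5 for r in results])  # both tags stand well clear of the noise floor
```

Because each stimulus is tagged at a unique frequency, the responses to simultaneously presented stimuli separate cleanly in the spectrum, which is what makes the attended-vs-unattended comparison above possible.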


2004
Vol 92 (5)
pp. 3030-3042
Author(s):
Jay Hegdé
David C. Van Essen

The firing rate of visual cortical neurons typically changes substantially during a sustained visual stimulus. To assess whether, and to what extent, the information about shape conveyed by neurons in visual area V2 changes over the course of the response, we recorded the responses of V2 neurons in awake, fixating monkeys while presenting a diverse set of static shape stimuli within the classical receptive field. We analyzed the time course of various measures of responsiveness and stimulus-related response modulation at the level of individual cells and of the population. For a majority of V2 cells, the response modulation was maximal during the initial transient response (40–80 ms after stimulus onset). During the same period, the population response was relatively correlated, in that V2 cells tended to respond similarly to specific subsets of stimuli. Over the ensuing 80–100 ms, the signal-to-noise ratio of individual cells generally declined, but to a lesser degree than the evoked-response rate during the corresponding time bins, and the response profiles became decorrelated for many individual cells. Concomitantly, the population response became substantially decorrelated. Our results indicate that the information about stimulus shape evolves dynamically and relatively rapidly in V2 during static visual stimulation in ways that may contribute to form discrimination.


2018
Author(s):
Christian Keitel
Anne Keitel
Christopher SY Benwell
Christoph Daube
Gregor Thut
...

Two largely independent research lines use rhythmic sensory stimulation to study visual processing. Despite the use of strikingly similar experimental paradigms, they differ crucially in their notion of the stimulus-driven periodic brain responses: One regards them mostly as synchronised (entrained) intrinsic brain rhythms; the other assumes they are predominantly evoked responses (classically termed steady-state responses, or SSRs) that add to the ongoing brain activity. This conceptual difference can produce contradictory predictions about, and interpretations of, experimental outcomes. The effect of spatial attention on brain rhythms in the alpha-band (8-13 Hz) is one such instance: alpha-range SSRs have typically been found to increase in power when participants focus their spatial attention on laterally presented stimuli, in line with a gain control of the visual evoked response. In nearly identical experiments, retinotopic decreases in entrained alpha-band power have been reported, in line with the inhibitory function of intrinsic alpha. Here we reconcile these contradictory findings by showing that they result from a small but far-reaching difference between two common approaches to EEG spectral decomposition. In a new analysis of previously published EEG data, recorded during bilateral rhythmic visual stimulation, we find the typical SSR gain effect when emphasising stimulus-locked neural activity and the typical retinotopic alpha suppression when focusing on ongoing rhythms. These opposite but parallel effects suggest that spatial attention may bias the neural processing of dynamic visual stimulation via two complementary neural mechanisms.
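The "small but far-reaching difference" between the two decomposition approaches can be made concrete: averaging trials in the time domain before the spectral transform keeps only stimulus-locked (evoked) activity, whereas transforming each trial first and then averaging power also retains ongoing, non-phase-locked rhythms. A toy simulation (trial counts, amplitudes, and the 10 Hz stimulation frequency are all assumed for illustration):

```python
import numpy as np

fs, n_trials, dur = 250, 100, 2.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(2)

f_stim = 10.0  # assumed flicker frequency within the alpha band
trials = []
for _ in range(n_trials):
    locked = 0.2 * np.sin(2 * np.pi * f_stim * t)  # phase-locked SSR
    # intrinsic alpha at the same frequency, but with random phase per trial
    ongoing = np.sin(2 * np.pi * f_stim * t + rng.uniform(0, 2 * np.pi))
    trials.append(locked + ongoing + 0.5 * rng.standard_normal(t.size))
trials = np.array(trials)

k = int(f_stim * dur)  # FFT bin at the stimulation frequency

# Approach 1: average first, then transform -> isolates stimulus-locked power
evoked_power = np.abs(np.fft.rfft(trials.mean(axis=0))[k]) ** 2

# Approach 2: transform each trial, then average -> retains ongoing rhythms
total_power = np.mean(np.abs(np.fft.rfft(trials, axis=1)[:, k]) ** 2)

print(total_power / evoked_power > 5)  # True: ongoing alpha dominates total power
```

In this toy case the two measures differ by an order of magnitude at the very same frequency bin, which is why emphasising stimulus-locked activity and emphasising ongoing rhythms can yield opposite attention effects, as described above.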


2021
Vol 15
Author(s):
Julian L. Amengual
Suliann Ben Hamed

Persistent activity has been observed in the prefrontal cortex (PFC), in particular during the delay periods of visual attention tasks. Classical approaches based on activity averaged over multiple trials have revealed that this activity encodes information about the attentional instruction provided in such tasks. However, single-trial approaches have shown that activity in this area is sparse rather than persistent, and highly heterogeneous both within and between trials. This observation raises the question of how persistent the supposedly persistent attention-related prefrontal activity actually is, and how it contributes to spatial attention. In this paper, we review recent evidence that precisely deconstructs the persistence of neural activity in the PFC in the context of attention orienting. Machine-learning methods for decoding this information reveal that attention orienting is a highly dynamic process with intrinsic oscillatory dynamics operating at multiple timescales, from milliseconds to minutes. Dimensionality reduction methods further show that this persistent activity dynamically incorporates multiple sources of information. This framework reveals a high degree of complexity in the neural representation of attention-related information in the PFC, and shows how its computational organization predicts behavior.


2020
Author(s):
Philipp Bartel
Filip K Janiak
Daniel Osorio
Tom Baden

The encoding of light increments and decrements by separate On- and Off-systems is a fundamental ingredient of vision, which supports the detection of edges in space and time and makes efficient use of the limited dynamic range of visual neurons [1]. Theory predicts that the neural representation of On- and Off-signals should be approximately balanced, including across an animal's full visible spectrum. Here we find that larval zebrafish violate this textbook expectation: in the fish brain, UV stimulation almost exclusively elicits On-responses, blue/green stimulation mostly Off-responses, and red light alone elicits approximately balanced On- and Off-responses (see also [2–4]). We link these findings to zebrafish visual ecology and suggest that the observed spectral tuning boosts the encoding of object "colourfulness", which correlates with object proximity in their underwater world [5].


2017
Author(s):
Nicolas Burra
Dirk Kerzel
David Munoz
Didier Grandjean
Leonardo Ceravolo

Salient vocalizations, especially aggressive voices, are believed to attract attention via an automatic threat-detection system. However, studies assessing the temporal dynamics of auditory spatial attention to aggressive voices have been lacking. Using event-related potential markers of auditory spatial attention (N2ac and LPCpc), we show that attentional processing of threatening vocal signals is enhanced at two different stages of auditory processing. As early as 200 ms post-stimulus onset, attentional orienting/engagement is enhanced for threatening as compared to happy vocal signals. Subsequently, from around 400 ms post-stimulus onset, the reorienting of auditory attention to the center of the screen (or disengagement from the target) is enhanced. This latter effect is consistent with the need to optimize perception by balancing the intake of stimulation from the left and right auditory space. Our results extend the scope of theories from the visual to the auditory modality by showing that threatening stimuli also bias early spatial attention in audition. Although not the focus of the present work, we observed that the attentional enhancement was more pronounced in female than in male participants.


2019
Author(s):
Chou P Hung
Chloe Callahan-Flintoft
Paul D Fedele
Kim F Fluitt
Onyekachi Odoemene
...

Abstract: Luminance can vary widely when scanning across a scene, by up to 10^9 to 1, requiring multiple normalizing mechanisms, spanning from retina to cortex, to support visual acuity and recognition. Vision models based on standard-dynamic-range luminance contrast ratios below 100:1 have limited ability to generalize to real-world scenes with contrast ratios over 10,000:1 (high dynamic range, HDR). Understanding and modeling brain mechanisms of HDR luminance normalization is thus important for military applications, including automatic target recognition, display tone mapping, and camouflage. Yet computer display of HDR stimuli was until recently unavailable or impractical for research. Here we describe procedures for the setup, calibration, and precision check of an HDR display system with an over 100,000:1 luminance dynamic range (650–0.0065 cd/m^2), pseudo 11-bit grayscale precision, and 3 ms temporal precision in the MATLAB/Psychtoolbox software environment. The setup is synchronized with electroencephalography and infrared eye-tracking measurements. We report measures of HDR visual acuity and the discovery of a novel phenomenon: abrupt darkening (from 400 to 4 cd/m^2) engages contextual facilitation, distorting the perceived orientation of a high-contrast central target. Surprisingly, the facilitation effect depended on luminance similarity, contradicting both classic divisive and subtractive models of contextual normalization.
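The reported calibration figures can be checked directly; a small arithmetic sketch expressing the display's dynamic range and the abrupt-darkening step in log10 units (the conventional unit for luminance range):

```python
import math

# Display range reported above, in cd/m^2
l_max, l_min = 650.0, 0.0065
ratio = l_max / l_min
print(round(ratio))                  # 100000, i.e. the stated 100,000:1
print(round(math.log10(ratio), 2))   # 5.0 log units of dynamic range

# The abrupt-darkening manipulation (400 -> 4 cd/m^2) spans 2 log units
print(round(math.log10(400 / 4), 1))
```

For comparison, the 10^9:1 range of natural scenes mentioned above corresponds to 9 log units, so even this HDR display covers only a little over half of it on a log scale.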

