Slow Fluctuations in Attentional Control of Sensory Cortex

2011, Vol 23 (2), pp. 460-470
Author(s): Julia W. Y. Kam, Elizabeth Dao, James Farley, Kevin Fitzpatrick, Jonathan Smallwood, ...

Top–down control of visual sensory cortex has long been tied to the orienting of visual spatial attention on a rapid, moment-to-moment basis. Here, we examined whether sensory responses in visual cortex are also modulated by natural and comparatively slower fluctuations in whether or not one is paying attention to the task at hand. Participants performed a simple visual discrimination task at fixation as the ERPs to task-irrelevant probes in the upper visual periphery were recorded. At random intervals, participants were stopped and asked to report on their attentional state at the time of stoppage—either “on-task” or “off-task.” ERPs to the probes immediately preceding these subjective reports were then examined as a function of whether attention was in an on-task versus off-task state. We found that sensory-evoked responses to the probes were significantly attenuated during off-task relative to on-task states, as measured by the visual P1 ERP component. In two additional experiments, we replicated this effect while (1) finding that off-task sensory attenuation extends to the auditory domain, as measured by the auditory N1 ERP component, and (2) eliminating state-dependent shifts in general arousal as a possible explanation for the effects. Collectively, our findings suggest that sensory gain control in cortex is yoked to the natural ebb and flow in how much attention we pay to the current task over time.
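
As an illustration of the state-sorted probe analysis described above, the sketch below epochs a single posterior EEG channel around probe onsets, averages separately by the self-reported attentional state, and measures mean amplitude in a nominal P1 window. The array names, sampling rate, and latency window are assumptions for the example, not the authors' pipeline.

```python
import numpy as np

FS = 500                 # sampling rate in Hz (assumed)
PRE, POST = 0.1, 0.4     # epoch window around probe onset, in seconds
P1_WIN = (0.08, 0.13)    # nominal P1 latency window after probe onset (assumed)

def probe_erps(eeg, probe_samples, state_labels):
    """Average probe-locked epochs separately for on-task and off-task reports.

    eeg           : 1-D array, one posterior channel in microvolts
    probe_samples : sample indices of the probes immediately preceding each thought probe
    state_labels  : 'on' or 'off' for each probe, taken from the subjective report
    """
    n_pre, n_post = int(PRE * FS), int(POST * FS)
    epochs = {'on': [], 'off': []}
    for samp, label in zip(probe_samples, state_labels):
        if samp - n_pre < 0 or samp + n_post > len(eeg):
            continue                                    # skip probes too close to the recording edges
        seg = eeg[samp - n_pre: samp + n_post]
        epochs[label].append(seg - seg[:n_pre].mean())  # baseline-correct to the pre-probe interval
    return {state: np.mean(segs, axis=0) for state, segs in epochs.items() if segs}

def p1_amplitude(erp):
    """Mean amplitude within the P1 window, relative to probe onset."""
    i0 = int((PRE + P1_WIN[0]) * FS)
    i1 = int((PRE + P1_WIN[1]) * FS)
    return erp[i0:i1].mean()
```

Across participants, off-task attenuation of the visual P1 would then appear as `p1_amplitude(erps['off'])` falling below `p1_amplitude(erps['on'])`.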

2021
Author(s): Deepa L Ramamurthy, Andrew Chen, Patrick C Huang, Priyanka Bharghavan, Gayathri Krishna, ...

Vasoactive intestinal peptide-expressing (VIP) interneurons, which constitute 10-15% of the cortical inhibitory neuron population, have emerged as an important cell type for regulating excitatory cell activity based on behavioral state. VIP cells in sensory cortex are potently engaged by neuromodulatory and motor inputs during active exploratory behaviors like locomotion and whisking, which in turn promote pyramidal cell firing via disinhibition. Such state-dependent modulation of activity by VIP cells in sensory cortex has been studied widely in recent years. However, the function of VIP cells during goal-directed behavior is less well understood. It is not clear how task-related events like sensory stimuli, motor actions, or reward activate VIP cells in sensory cortex since there is often temporal overlap in the occurrence of these events. We developed a Go/NoGo whisker touch detection task which incorporates a post-stimulus delay period to separate sensory-driven activity from action- or reward-related activity during behavior. We used 2-photon calcium imaging to measure task-related signals of L2/3 VIP neurons in S1 of behaving mice. We report for the first time that VIP cells in mouse whisker S1 are activated by both whisker stimuli and goal-directed licking. Whisker- and lick-related signals were spatially organized in relation to anatomical columns in S1. Sensory responses of VIP cells were tuned to specific whiskers, whether or not they also displayed lick-related activity.
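
A minimal sketch of how a post-stimulus delay period lets whisker-evoked and lick-related calcium signals be separated in time is given below; it assumes hypothetical per-cell dF/F traces and event frame indices, and is not the authors' analysis code.

```python
import numpy as np

def event_triggered_response(dff, event_frames, pre=15, post=30):
    """Baseline-subtracted, event-aligned mean dF/F trace for one cell.

    dff          : 1-D array, one cell's dF/F trace (imaging frames)
    event_frames : frame indices of whisker-stimulus onsets or first licks;
                   the task's delay period keeps these windows non-overlapping
    """
    segs = [dff[f - pre: f + post] - dff[f - pre: f].mean()
            for f in event_frames if f - pre >= 0 and f + post <= len(dff)]
    return np.mean(segs, axis=0)

def classify_cell(dff, stim_frames, lick_frames, thresh=0.05):
    """Label a cell as whisker-responsive, lick-responsive, or both, using the
    mean response in an early post-event window (threshold is arbitrary here)."""
    stim_resp = event_triggered_response(dff, stim_frames)[15:25].mean()
    lick_resp = event_triggered_response(dff, lick_frames)[15:25].mean()
    return {'whisker': stim_resp > thresh, 'lick': lick_resp > thresh}
```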


2010, Vol 24 (3), pp. 198-209
Author(s): Yan Wang, Jianhui Wu, Shimin Fu, Yuejia Luo

In the present study, we used event-related potentials (ERPs) and behavioral measurements in a peripherally cued line-orientation discrimination task to investigate the underlying mechanisms of orienting and focusing in voluntary and involuntary attention conditions. Informative peripheral cue (75% valid) with long stimulus onset asynchrony (SOA) was used in the voluntary attention condition; uninformative peripheral cue (50% valid) with short SOA was used in the involuntary attention condition. Both orienting and focusing were affected by attention type. Results for attention orienting in the voluntary attention condition confirmed the “sensory gain control theory,” as attention enhanced the amplitude of the early ERP components, P1 and N1, without latency changes. In the involuntary attention condition, compared with invalid trials, targets in the valid trials elicited larger and later contralateral P1 components, and smaller and later contralateral N1 components. Furthermore, but only in the voluntary attention condition, targets in the valid trials elicited larger N2 and P3 components than in the invalid trials. Attention focusing in the involuntary attention condition resulted in larger P1 components elicited by targets in small-cue trials compared to large-cue trials, whereas in the voluntary attention condition, larger P1 components were elicited by targets in large-cue trials than in small-cue trials. There was no interaction between orienting and focusing. These results suggest that orienting and focusing of visual-spatial attention are deployed independently regardless of attention type. In addition, the present results provide evidence of dissociation between voluntary and involuntary attention during the same task.
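
For concreteness, the two cueing conditions can be parameterized as below: a generator for cue-target trials with a given cue validity and SOA (0.75 with a long SOA for the voluntary condition, 0.50 with a short SOA for the involuntary condition). The function and values are illustrative, not taken from the study's code.

```python
import numpy as np

def make_trials(n_trials, p_valid, soa_ms, seed=None):
    """Cue-target trials for one attention condition.

    p_valid : cue validity (e.g., 0.75 informative, 0.50 uninformative)
    soa_ms  : cue-target stimulus onset asynchrony in milliseconds
    """
    rng = np.random.default_rng(seed)
    cue_side = rng.choice(['left', 'right'], size=n_trials)
    valid = rng.random(n_trials) < p_valid
    other_side = np.where(cue_side == 'left', 'right', 'left')
    target_side = np.where(valid, cue_side, other_side)
    return [{'cue': c, 'target': t, 'valid': bool(v), 'soa_ms': soa_ms}
            for c, t, v in zip(cue_side, target_side, valid)]
```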


2013, Vol 110 (3), pp. 621-639
Author(s): Bryan M. Krause, Matthew I. Banks

The neural mechanisms of sensory responses recorded from the scalp or cortical surface remain controversial. Evoked vs. induced response components (i.e., changes in mean vs. variance) are associated with bottom-up vs. top-down processing, but trial-by-trial response variability can confound this interpretation. Phase reset of ongoing oscillations has also been postulated to contribute to sensory responses. In this article, we present evidence that responses under passive listening conditions are dominated by variable evoked response components. We measured the mean, variance, and phase of complex time-frequency coefficients of epidurally recorded responses to acoustic stimuli in rats. During the stimulus, changes in mean, variance, and phase tended to co-occur. After the stimulus, there was a small, low-frequency offset response in the mean and modest, prolonged desynchronization in the alpha band. Simulations showed that trial-by-trial variability in the mean can account for most of the variance and phase changes observed during the stimulus. This variability was state dependent, with smallest variability during periods of greatest arousal. Our data suggest that cortical responses to auditory stimuli reflect variable inputs to the cortical network. These analyses suggest that caution should be exercised when interpreting variance and phase changes in terms of top-down cortical processing.
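
The three quantities contrasted in this abstract can be written down directly from the complex time-frequency coefficients; the sketch below assumes a hypothetical array of shape (trials, frequencies, times) and is intended only to make the mean/variance/phase distinction explicit.

```python
import numpy as np

def decompose_tf(coeffs):
    """Split complex time-frequency coefficients (trials x freqs x times) into
    the measures discussed above.

    evoked  : power of the across-trial mean (changes in the mean)
    induced : across-trial variance of the coefficients
    itpc    : inter-trial phase coherence, from 0 (random phase) to 1 (perfect phase reset)
    """
    mean_coeff = coeffs.mean(axis=0)
    evoked = np.abs(mean_coeff) ** 2
    induced = np.var(coeffs, axis=0)
    itpc = np.abs(np.exp(1j * np.angle(coeffs)).mean(axis=0))
    return evoked, induced, itpc
```

The confound the authors point to is visible here: if the single-trial evoked waveform varies in amplitude or latency from trial to trial, that variability inflates `induced` and `itpc` even without any separate induced or phase-reset process, which is why simulations were needed to disentangle them.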


2021
Author(s): Anton Filipchuk, Alain Destexhe, Brice Bathellier

Neural activity in sensory cortex combines stimulus responses and ongoing activity, but it remains unclear whether they reflect the same underlying dynamics or separate processes. Here we show that during wakefulness, the neuronal assemblies evoked by sounds in the auditory cortex and thalamus are specific to the stimulus and distinct from the assemblies observed in ongoing activity. In contrast, during anesthesia, evoked assemblies are indistinguishable from ongoing assemblies in cortex, while they remain distinct in the thalamus. A strong remapping of sensory responses accompanies this dynamical state change produced by anesthesia. Together, these results show that the awake cortex engages dedicated neuronal assemblies in response to sensory inputs, which we suggest is a network correlate of sensory perception.

One-Sentence Summary: Sensory responses in the awake cortex engage specific neuronal assemblies that disappear under anesthesia.
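
One generic way to ask whether evoked and ongoing activity engage the same assemblies is to pool population activity patterns from both epochs, cluster them, and check whether the two epoch types share clusters. The sketch below (hypothetical binary pattern matrices, arbitrary cluster count) illustrates the logic only; it is not the authors' method.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def assembly_overlap(evoked_patterns, ongoing_patterns, n_clusters=10):
    """Cluster pooled population patterns (events x neurons) and report the
    fraction of clusters containing both evoked and ongoing events:
    near 1.0 -> shared assemblies, near 0 -> distinct assemblies."""
    X = np.vstack([evoked_patterns, ongoing_patterns]).astype(float)
    labels = fcluster(linkage(X, method='ward'), n_clusters, criterion='maxclust')
    ev_labels = set(labels[:len(evoked_patterns)])
    on_labels = set(labels[len(evoked_patterns):])
    return len(ev_labels & on_labels) / n_clusters
```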


1993, Vol 5 (2), pp. 188-195
Author(s): Steven J. Luck, Silu Fan, Steven A. Hillyard

When subjects are explicitly cued to focus attention on a particular location in visual space, targets presented at that location have been shown to elicit enhanced sensory-evoked activity in recordings of event-related brain potentials (ERPs). The present study sought to determine if this type of sensory facilitation also occurs during visual search tasks in which a feature conjunction target must be identified, presumably by means of focal attention, within an array of distractor items. In this experiment, subjects were required to discriminate the shape of a distinctively colored target item within an array containing 15 distractor items, and ERPs were elicited by task-irrelevant probe stimuli that were presented at the location of the target item or at the location of a distractor item on the opposite side of the array. When the delay between search-array onset and probe onset was 250 msec, the sensory-evoked responses in the latency range 75-200 msec were larger for probes presented at the location of the target than for probes presented at the location of the irrelevant distractor. These results indicate that sensory processing is modulated in a spatially restricted manner during visual search, and that focusing attention on a feature conjunction target engages neural systems that are shared with other forms of visual-spatial attention.


2019, Vol 9 (1)
Author(s): M. Berk Mirza, Rick A. Adams, Karl Friston, Thomas Parr

Information gathering comprises actions whose (sensory) consequences resolve uncertainty (i.e., are salient). In other words, actions that solicit salient information cause the greatest shift in beliefs (i.e., information gain) about the causes of our sensations. However, not all information is relevant to the task at hand: this is especially the case in complex, naturalistic scenes. This paper introduces a formal model of selective attention based on active inference and contextual epistemic foraging. We consider a visual search task with a special emphasis on goal-directed and task-relevant exploration. In this scheme, attention modulates the expected fidelity (precision) of the mapping between observations and hidden states in a state-dependent or context-sensitive manner. This ensures task-irrelevant observations have little expected information gain, and so the agent – driven to reduce expected surprise (i.e., uncertainty) – does not actively seek them out. Instead, it selectively samples task-relevant observations, which inform (task-relevant) hidden states. We further show, through simulations, that the atypical exploratory behaviours in conditions such as autism and anxiety may be due to a failure to appropriately modulate sensory precision in a context-specific way.
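
A minimal discrete-state rendering of the key quantities, assuming a likelihood matrix `A` (P(o|s)) and posterior beliefs `qs`, is sketched below. Salience is written as expected information gain (predictive entropy minus expected ambiguity), and attention is modeled as a precision parameter that sharpens or flattens the likelihood mapping; the function names and the specific precision operator are illustrative, not the paper's implementation.

```python
import numpy as np

def expected_info_gain(A, qs, eps=1e-16):
    """Expected information gain (salience) of sampling this observation modality.

    A  : likelihood matrix P(o|s), shape (n_obs, n_states)
    qs : current posterior over hidden states, shape (n_states,)
    """
    qo = A @ qs                                             # predicted observation distribution
    predictive_entropy = -np.sum(qo * np.log(qo + eps))     # H[Q(o)]
    expected_ambiguity = -np.sum(qs * np.sum(A * np.log(A + eps), axis=0))  # E_qs[H[P(o|s)]]
    return predictive_entropy - expected_ambiguity

def apply_precision(A, gamma):
    """Scale the fidelity (precision) of the likelihood mapping. Low gamma for a
    task-irrelevant modality flattens its columns, so sampling that modality
    yields little expected information gain."""
    A_gamma = A ** gamma
    return A_gamma / A_gamma.sum(axis=0, keepdims=True)
```

With a flattened mapping (`gamma` near zero), `expected_info_gain` approaches zero for that modality, reproducing the qualitative behaviour described above: task-irrelevant observations stop attracting exploratory sampling.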


2012, Vol 25 (0), pp. 99
Author(s): Samuel Couth, Ellen Poliakoff, Emma Gowen

Reaching and grasping requires integration of visual, proprioceptive and somatosensory inputs. Previous research has shown that manipulating the ‘graspability’ of a visual stimulus influences reaction times to that stimulus (e.g., Tucker and Ellis, 1998). Here we explored whether this same effect extends to the planning and online control of arm movements. Participants made a mimed reaching movement with their left or right hand depending on the colour of affordance stimuli (door handles) or control stimuli (a row of dots of similar size and orientation to the door handle). Stimulus onset was manipulated by changing when the grey stimulus changed colour. Stimuli either pointed towards (compatible) or away from (incompatible) the responding hand. Spatially compatible affordance stimuli facilitated reach onset compared to other stimulus and compatibility combinations, replicating previous reaction time studies. This can be attributed to a priming of the motor system by spatially compatible affording items. Results also indicated a larger outwards deviation of reach trajectory for spatially incompatible control stimuli compared to spatially compatible control stimuli, which waned with stimulus onset delay. This reveals an immediate inhibitory effect on reach trajectory, such that the outward movement over-compensates to counteract the incompatible orientation. Overall, we observed that the effect of visual spatial compatibility on reach kinematics differs with the action relevance of the stimulus. We are currently exploring how this multisensory visuomotor effect changes with age.
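
The two kinematic measures described above can be sketched as follows; the sampling rate, speed threshold, and sign convention for "outward" are assumptions for the example rather than the study's definitions.

```python
import numpy as np

def reach_onset(speed, fs=200.0, thresh=0.05):
    """Reach onset: time (s, after the colour change) when hand speed first
    exceeds a threshold in m/s. Assumes the threshold is actually crossed."""
    return np.argmax(speed > thresh) / fs

def lateral_deviation(xy):
    """Maximum signed perpendicular deviation of a 2-D reach path
    (n_samples x 2) from the straight line joining its start and end points.
    Which sign counts as 'outward' depends on the coordinate frame."""
    start, end = xy[0], xy[-1]
    direction = (end - start) / np.linalg.norm(end - start)
    rel = xy - start
    perp = rel[:, 0] * direction[1] - rel[:, 1] * direction[0]  # 2-D cross product
    return perp[np.argmax(np.abs(perp))]
```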

