Compensating for a shifting world: evolving reference frames of visual and auditory signals across three multimodal brain areas

Author(s):  
Valeria C. Caruso ◽  
Daniel S. Pages ◽  
Marc A. Sommer ◽  
Jennifer M. Groh

Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected brain areas involved in generating saccades, namely the frontal eye fields (FEF), lateral and medial parietal cortex (M/LIP), and superior colliculus (SC). We recorded from single neurons in head-restrained monkeys performing auditory- and visually-guided saccades from variable initial fixation locations, and evaluated whether their receptive fields were better described as eye-centered, head-centered, or hybrid (i.e. not anchored uniquely to head- or eye-orientation). We found a progression of reference frames across areas and across time, with considerable hybrid-ness and persistent differences between modalities during most epochs/brain regions. For both modalities, the SC was more eye-centered than the FEF, which in turn was more eye-centered than the predominantly hybrid M/LIP. In all three areas and temporal epochs from stimulus onset to movement, visual signals were more eye-centered than auditory signals. In the SC and FEF, auditory signals became more eye-centered at the time of the saccade than they were initially after stimulus onset, but only in the SC at the time of the saccade did the auditory signals become predominantly eye-centered. The results indicate that visual and auditory signals both undergo transformations, ultimately reaching the same final reference frame but via different dynamics across brain regions and time.
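The reference-frame comparison described above can be illustrated with a toy analysis: express each tuning curve in eye-centered and head-centered coordinates and ask in which frame the curves align best across fixations. This is only a sketch under simplified assumptions (Gaussian receptive fields, noiseless responses, hypothetical fixation positions and target locations), not the authors' actual statistical method:

```python
import numpy as np

# Hypothetical setup: responses vs. target location (head-centered degrees),
# measured while the animal fixates at three eye-in-head positions.
fixations = [-12.0, 0.0, 12.0]                 # eye positions (deg)
targets = np.arange(-24, 25, 6, dtype=float)   # speaker/LED locations (deg)

def gaussian_rf(center, locs, width=10.0):
    """Bell-shaped receptive field centered at `center` (toy model)."""
    return np.exp(-0.5 * ((locs - center) / width) ** 2)

# Eye-centered neuron: its RF peak (in head coordinates) moves with the eyes.
eye_centered = [gaussian_rf(5.0 + fx, targets) for fx in fixations]
# Head-centered neuron: RF peak ignores eye position.
head_centered = [gaussian_rf(5.0, targets) for fx in fixations]

def alignment(curves, locs, fixes, frame):
    """Mean pairwise correlation of tuning curves after expressing target
    location in the chosen frame ('eye' subtracts fixation; 'head' does not).
    Higher values mean the receptive fields align better in that frame."""
    coords = [locs - fx if frame == "eye" else locs for fx in fixes]
    common = np.arange(-12, 13, 6, dtype=float)  # overlap of all coordinate ranges
    resampled = [np.interp(common, c, r) for c, r in zip(coords, curves)]
    corrs = [np.corrcoef(resampled[i], resampled[j])[0, 1]
             for i in range(len(resampled)) for j in range(i + 1, len(resampled))]
    return float(np.mean(corrs))

# An eye-centered neuron aligns better in eye coordinates, and vice versa;
# a hybrid neuron would align well in neither frame.
print(alignment(eye_centered, targets, fixations, "eye") >
      alignment(eye_centered, targets, fixations, "head"))   # True
print(alignment(head_centered, targets, fixations, "head") >
      alignment(head_centered, targets, fixations, "eye"))   # True
```

A hybrid cell, as reported in M/LIP, would score low in both frames, which is why the study needed a graded classification rather than a binary one.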

2019 ◽  
Author(s):  
Valeria C. Caruso ◽  
Daniel S. Pages ◽  
Marc A. Sommer ◽  
Jennifer M. Groh

ABSTRACT Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected brain areas involved in generating saccades, namely the frontal eye fields (FEF), lateral and medial parietal cortex (M/LIP), and superior colliculus (SC). We recorded from single neurons in head-restrained monkeys performing auditory- and visually-guided saccades from variable initial fixation locations, and evaluated whether their receptive fields were better described as eye-centered, head-centered, or hybrid (i.e. not anchored uniquely to head- or eye-orientation). We found a progression of reference frames across areas and across time, with considerable hybrid-ness and persistent differences between modalities during most epochs/brain regions. For both modalities, the SC was more eye-centered than the FEF, which in turn was more eye-centered than the predominantly hybrid M/LIP. In all three areas and temporal epochs from stimulus onset to movement, visual signals were more eye-centered than auditory signals. In the SC and FEF, auditory signals became more eye-centered at the time of the saccade than they were initially after stimulus onset, but only in the SC at the time of the saccade did the auditory signals become predominantly eye-centered. The results indicate that visual and auditory signals both undergo transformations, ultimately reaching the same final reference frame but via different dynamics across brain regions and time.

NEW AND NOTEWORTHY Models for visual-auditory integration posit that visual signals are eye-centered throughout the brain, while auditory signals are converted from head-centered to eye-centered coordinates. 
We show instead that both modalities largely employ hybrid reference frames, neither fully head- nor eye-centered. Across three hubs of the oculomotor network (Intraparietal Cortex, Frontal Eye Field, and Superior Colliculus), visual and auditory signals evolve from hybrid to a common eye-centered format via different dynamics across brain areas and time.


2017 ◽  
Author(s):  
V. C. Caruso ◽  
D. S. Pages ◽  
M. A. Sommer ◽  
J. M. Groh

ABSTRACT We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. We assessed how this neural computation unfolds across three interconnected structures: frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior colliculus (SC). Single unit activity was assessed in head-restrained monkeys performing visually-guided saccades from different initial fixations. As previously shown, the receptive fields of most LIP/MIP neurons shifted to novel positions on the retina for each eye position, and these locations were not clearly related to each other in either eye- or head-centered coordinates (hybrid coordinates). In contrast, the receptive fields of most SC neurons were stable in eye-centered coordinates. In FEF, visual signals were intermediate between those patterns: around 60% were eye-centered, whereas the remainder showed changes in receptive field location, boundaries, or responsiveness that rendered the response patterns hybrid or occasionally head-centered. These results suggest that FEF may act as a transitional step in an evolution of coordinates between LIP/MIP and SC. The persistence across cortical areas of hybrid representations that do not provide unequivocal location labels in a consistent reference frame has implications for how these representations must be read out.

NEW & NOTEWORTHY How we perceive the world as stable using mobile retinas is poorly understood. We compared the stability of visual receptive fields across different fixation positions in three visuomotor regions. Irregular changes in receptive field position were ubiquitous in intraparietal cortex, evident but less common in the frontal eye fields, and negligible in the superior colliculus (SC), where receptive fields shifted reliably across fixations. Only the SC provides a stable labelled-line code for stimuli across saccades.


2018 ◽  
Vol 119 (4) ◽  
pp. 1411-1421 ◽  
Author(s):  
Valeria C. Caruso ◽  
Daniel S. Pages ◽  
Marc A. Sommer ◽  
Jennifer M. Groh

We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. In this study, we assessed how this neural computation unfolds across three interconnected structures: frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior colliculus (SC). Single-unit activity was assessed in head-restrained monkeys performing visually guided saccades from different initial fixations. As previously shown, the receptive fields of most LIP/MIP neurons shifted to novel positions on the retina for each eye position, and these locations were not clearly related to each other in either eye- or head-centered coordinates (defined as hybrid coordinates). In contrast, the receptive fields of most SC neurons were stable in eye-centered coordinates. In FEF, visual signals were intermediate between those patterns: around 60% were eye-centered, whereas the remainder showed changes in receptive field location, boundaries, or responsiveness that rendered the response patterns hybrid or occasionally head-centered. These results suggest that FEF may act as a transitional step in an evolution of coordinates between LIP/MIP and SC. The persistence across cortical areas of mixed representations that do not provide unequivocal location labels in a consistent reference frame has implications for how these representations must be read out. NEW & NOTEWORTHY How we perceive the world as stable using mobile retinas is poorly understood. We compared the stability of visual receptive fields across different fixation positions in three visuomotor regions. Irregular changes in receptive field position were ubiquitous in intraparietal cortex, evident but less common in the frontal eye fields, and negligible in the superior colliculus (SC), where receptive fields shifted reliably across fixations. Only the SC provides a stable labeled-line code for stimuli across saccades.
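A common way to quantify whether a receptive field is stable in eye-centered coordinates is to track its center (in head/space coordinates) across fixations and regress it against eye position: a slope near 1 means the field moves with the eyes (eye-centered), near 0 means it stays put in space (head-centered), and intermediate slopes suggest a hybrid frame. A minimal sketch with hypothetical RF centers, not the study's actual analysis:

```python
import numpy as np

# Hypothetical fixation positions (deg) and measured RF centers (deg, head-centered).
fixations = np.array([-12.0, 0.0, 12.0])

def shift_slope(rf_centers, fixations):
    """Slope of RF center vs. eye position: ~1 -> eye-centered,
    ~0 -> head-centered, intermediate -> hybrid frame."""
    slope, _ = np.polyfit(fixations, rf_centers, 1)
    return float(slope)

eye_centered_centers  = np.array([-4.0, 8.0, 20.0])  # moves 1 deg per deg of eye
head_centered_centers = np.array([8.0, 8.0, 8.0])    # ignores eye position
hybrid_centers        = np.array([2.0, 8.0, 14.0])   # moves ~0.5 deg per deg

print(shift_slope(eye_centered_centers, fixations))   # ~1.0
print(shift_slope(head_centered_centers, fixations))  # ~0.0
print(shift_slope(hybrid_centers, fixations))         # ~0.5
```

On this view, the LIP/MIP hybrid cells described above would show irregular, intermediate slopes, while most SC cells would sit near 1.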


1987 ◽  
Vol 57 (1) ◽  
pp. 35-55 ◽  
Author(s):  
M. F. Jay ◽  
D. L. Sparks

Based on the findings of the preceding paper, it is known that auditory and visual signals have been translated into common coordinates at the level of the superior colliculus (SC) and share a motor circuit involved in the generation of saccadic eye movements. It is not known, however, whether the translation of sensory signals into motor coordinates occurs prior to or within the SC. Nor is it known in what coordinates auditory signals observed in the SC are encoded. The present experiment tested two alternative hypotheses concerning the frame of reference of auditory signals found in the deeper layers of the SC. The hypothesis that auditory signals are encoded in head coordinates predicts that, with the head stationary, the response of auditory neurons will not be affected by variations in eye position but will be determined by the location of the sound source. The hypothesis that auditory responses encode the trajectory of the eye movement required to look to the target (motor error) predicts that the response of auditory cells will depend on both the position of the sound source and the position of the eyes in the orbit. Extracellular single-unit recordings were obtained from neurons in the SC while monkeys made delayed saccades to auditory or visual targets in a darkened room. The coordinates of auditory signals were studied by plotting auditory receptive fields while the animal fixated one of three targets placed 24 degrees apart along the horizontal plane. For 99 of 121 SC cells, the spatial location of the auditory receptive field was significantly altered by the position of the eyes in the orbit. In contrast, the responses of five sound-sensitive cells isolated in the inferior colliculus were not affected by variations in eye position. 
The possibility that systematic variations in the position of the pinnae associated with different fixation positions could account for these findings was controlled for by plotting auditory receptive fields while the pinnae were mechanically restrained. Under these conditions, the position of the eyes in the orbit still had a significant effect on the responsiveness of collicular neurons to auditory stimuli. The average magnitude of the shift of the auditory receptive field with changes in eye position (12.9 degrees) did not correspond to the magnitude of the shift in eye position (24 degrees). Alternative explanations for this finding were considered. One possibility is that, within the SC, there is a gradual transition from auditory signals in head coordinates to signals in motor error coordinates.(ABSTRACT TRUNCATED AT 400 WORDS)
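The partial shift reported above can be expressed as a simple ratio locating the representation between a purely head-centered frame (ratio 0) and a purely eye-centered/motor-error frame (ratio 1). The two numbers come directly from the abstract; the interpretation of the ratio as a frame index is a standard convention, not the authors' wording:

```python
# Mean auditory RF shift and eye displacement reported in the abstract.
mean_rf_shift = 12.9      # deg: average receptive-field shift
eye_displacement = 24.0   # deg: fixation targets placed 24 deg apart

# 0 -> head-centered, 1 -> eye-centered/motor-error, intermediate -> hybrid.
shift_ratio = mean_rf_shift / eye_displacement
print(round(shift_ratio, 2))  # 0.54
```

A ratio of roughly 0.54 is consistent with the paper's suggestion of a gradual transition within the SC from head coordinates toward motor-error coordinates.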


2017 ◽  
Author(s):  
Roel M. Willems ◽  
Franziska Hartung

Behavioral evidence suggests that engaging with fiction is positively correlated with social abilities. The rationale behind this link is that engaging with fictional narratives offers a ‘training modus’ for mentalizing and empathizing. We investigated how the amount of reading that participants report doing in their daily lives relates to connections between brain areas while they listen to literary narratives. Participants (N=57) listened to two literary narratives while brain activation was measured with fMRI. We computed time-course correlations between brain regions and compared the correlation values from listening to narratives with those from listening to reversed speech. The between-region correlations were then related to the amount of fiction that participants read in their daily lives. Our results show that the amount of fiction reading is related to functional connectivity in areas known to be involved in language and mentalizing. This suggests that reading fiction influences social cognition as well as language skills.
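The time-course correlation analysis described above can be sketched with synthetic data: functional connectivity between two regions of interest is simply the Pearson correlation of their BOLD time courses, computed separately for the narrative and reversed-speech conditions. The signals below are simulated (the shared driving signal during the narrative is an assumption of this sketch, not data from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy BOLD time courses for two ROIs over 200 volumes. During the narrative
# the ROIs share a common driving signal; during reversed speech they
# fluctuate independently (a modeling assumption for illustration only).
n = 200
shared = rng.standard_normal(n)
roi_a_story = shared + 0.5 * rng.standard_normal(n)
roi_b_story = shared + 0.5 * rng.standard_normal(n)
roi_a_rev = rng.standard_normal(n)
roi_b_rev = rng.standard_normal(n)

def connectivity(x, y):
    """Functional connectivity as the Pearson correlation of two time courses."""
    return float(np.corrcoef(x, y)[0, 1])

story_fc = connectivity(roi_a_story, roi_b_story)
reversed_fc = connectivity(roi_a_rev, roi_b_rev)
print(story_fc > reversed_fc)  # shared signal yields higher coupling
```

In the study, such between-region correlation values (narrative vs. reversed speech) were then related across participants to self-reported reading habits.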


2010 ◽  
Vol 22 (5) ◽  
pp. 888-902 ◽  
Author(s):  
Marco Tamietto ◽  
Franco Cauda ◽  
Luca Latini Corazzini ◽  
Silvia Savazzi ◽  
Carlo A. Marzi ◽  
...  

Following destruction or deafferentation of primary visual cortex (area V1, striate cortex), clinical blindness ensues, but residual visual functions may, nevertheless, persist without perceptual consciousness (a condition termed blindsight). The study of patients with such lesions thus offers a unique opportunity to investigate what visual capacities are mediated by the extrastriate pathways that bypass V1. Here we provide evidence for a crucial role of the collicular–extrastriate pathway in nonconscious visuomotor integration by showing that, in the absence of V1, the superior colliculus (SC) is essential to translate visual signals that cannot be consciously perceived into motor outputs. We found that a gray stimulus presented in the blind field of a patient with unilateral V1 loss, although not consciously seen, can influence his behavioral and pupillary responses to consciously perceived stimuli in the intact field (implicit bilateral summation). Notably, this effect was accompanied by selective activations in the SC and in occipito-temporal extrastriate areas. However, when instead of gray stimuli we presented purple stimuli, which predominantly draw on S-cones and are thus invisible to the SC, any evidence of implicit visuomotor integration disappeared and activations in the SC dropped significantly. The present findings show that the SC acts as an interface between sensory and motor processing in the human brain, thereby providing a contribution to visually guided behavior that may remain functionally and anatomically segregated from the geniculo-striate pathway and entirely outside conscious visual experience.


1994 ◽  
Vol 71 (3) ◽  
pp. 1250-1253 ◽  
Author(s):  
G. S. Russo ◽  
C. J. Bruce

1. We studied neuronal activity in the monkey's frontal eye field (FEF) in conjunction with saccades directed to auditory targets. 2. All FEF neurons with movement activity preceding saccades to visual targets also were active preceding saccades to auditory targets, even when such saccades were made in the dark. Movement cells generally had comparable bursts for aurally and visually guided saccades; visuomovement cells often had weaker bursts in conjunction with aurally guided saccades. 3. When these cells were tested from different initial fixation directions, movement fields associated with aurally guided saccades, like fields mapped with visual targets, were a function of saccade dimensions, and not the speaker's spatial location. Thus, even though sound location cues are chiefly craniotopic, the crucial factor for an FEF discharge before aurally guided saccades was the location of the auditory target relative to the current direction of gaze. 4. Intracortical microstimulation at the sites of these cells evoked constant-vector saccades, and not goal-directed saccades. The direction and size of electrically elicited saccades generally matched the cell's movement field for aurally guided saccades. 5. Thus FEF activity appears to have a role in aurally guided as well as visually guided saccades. Moreover, visual and auditory target representations, although initially obtained in different coordinate systems, appear to converge to a common movement vector representation at the FEF stage of saccadic processing that is appropriate for transmittal to saccade-related burst neurons in the superior colliculus and pons.


2005 ◽  
Vol 94 (6) ◽  
pp. 4156-4167 ◽  
Author(s):  
Daniel Zaksas ◽  
Tatiana Pasternak

Neurons in cortical area MT have localized receptive fields (RF) representing the contralateral hemifield and play an important role in processing visual motion. We recorded the activity of these neurons during a behavioral task in which two monkeys were required to discriminate and remember visual motion presented in the ipsilateral hemifield. During the task, the monkeys viewed two stimuli, sample and test, separated by a brief delay and reported whether they contained motion in the same or in opposite directions. Fifty to 70% of MT neurons were activated by the motion stimuli presented in the ipsilateral hemifield at locations far removed from their classical receptive fields. These responses were in the form of excitation or suppression and were delayed relative to conventional MT responses. Both excitatory and suppressive responses were direction selective, but the nature and the time course of their directionality differed from the conventional excitatory responses recorded with stimuli in the RF. Direction selectivity of the excitatory remote response was transient and early, whereas the suppressive response developed later and persisted after stimulus offset. The presence or absence of these unusual responses on error trials, as well as their magnitude, was affected by the behavioral significance of stimuli used in the task. We hypothesize that these responses represent top-down signals from brain region(s) accessing information about stimuli in the entire visual field and about the behavioral state of the animal. The recruitment of neurons in the opposite hemisphere during processing of behaviorally relevant visual signals reveals a mechanism by which sensory processing can be affected by cognitive task demands.
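The direction selectivity of these excitatory and suppressive responses can be quantified with a standard direction-selectivity index, DSI = (R_pref − R_null) / (R_pref + R_null), where R_pref and R_null are the responses to motion in the preferred and opposite directions. This is a conventional metric, not necessarily the exact measure used in the study, and the firing rates below are hypothetical:

```python
def direction_selectivity_index(r_pref, r_null):
    """DSI in [0, 1] for non-negative rates: 0 = unselective, 1 = responds
    only to the preferred direction."""
    return (r_pref - r_null) / (r_pref + r_null)

# Hypothetical firing rates (spikes/s): strongly vs. weakly selective cells.
print(direction_selectivity_index(60.0, 20.0))  # 0.5
print(direction_selectivity_index(35.0, 30.0))  # ~0.08
```

Applied to remote responses, the index would be computed within the early transient window for the excitatory component and the later, post-stimulus window for the suppressive one, since the abstract notes their directionality follows different time courses.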


2020 ◽  
Vol 21 (12) ◽  
pp. 4503
Author(s):  
Sabah Nisar ◽  
Ajaz A. Bhat ◽  
Sheema Hashem ◽  
Najeeb Syed ◽  
Santosh K. Yadav ◽  
...  

Post-traumatic stress disorder (PTSD) is a highly disabling condition, increasingly recognized not only as a disorder of mental health and a social burden but also as an anxiety disorder characterized by fear, stress, and negative alterations in mood. PTSD is associated with structural, metabolic, and molecular changes in several brain regions and their neural circuitry. Brain areas implicated in the traumatic stress response include the amygdala, hippocampus, and prefrontal cortex, which play an essential role in memory function. Abnormalities in these brain areas are hypothesized to underlie symptoms of PTSD and other stress-related psychiatric disorders. Conventional methods of studying PTSD have proven to be insufficient for diagnosis, measurement of treatment efficacy, and monitoring disease progression, and currently, there is no diagnostic biomarker available for PTSD. A deep understanding of cutting-edge neuroimaging genetic approaches is necessary for the development of novel therapeutics and biomarkers to better diagnose and treat the disorder. A current goal is to understand the gene pathways that are associated with PTSD, and how those genes act on the fear/stress circuitry to mediate risk vs. resilience for PTSD. This review article explains the rationale and practical utility of neuroimaging genetics in PTSD and how the resulting information can aid the diagnosis and clinical management of patients with PTSD.


1991 ◽  
Vol 66 (2) ◽  
pp. 559-579 ◽  
Author(s):  
J. D. Schall

1. The purpose of this study was to analyze the response properties of neurons in the frontal eye fields (FEF) of rhesus monkeys (Macaca mulatta) and to compare and contrast the various functional classes with those recorded in the supplementary eye fields (SEF) of the same animals performing the same go/no-go visual tracking task. Three hundred ten cells recorded in FEF provided the data for this investigation. 2. Visual cells in FEF responded to the stimuli that guided the eye movements. The visual cells in FEF responded with a slightly shorter latency and were more consistent and phasic in their activation than their counterparts in SEF. The receptive fields tended to emphasize the contralateral hemifield to the same extent as those observed in SEF visual cells. 3. Preparatory set cells began to discharge after the presentation of the target and ceased firing before the saccade, after the go/no-go cue was given. These neurons comprised a smaller proportion in FEF than in SEF. In contrast to their counterparts in SEF, the preparatory set cells in FEF did not respond preferentially in relation to contralateral movements, even though most responded preferentially for movements in one particular direction. The time course of the discharge of the FEF set cells was similar to that of their SEF counterparts, except that they reached their peak level of activation sooner. The few preparatory set cells in FEF tested with both auditory and visual stimuli tended to respond preferentially to the visual targets, whereas, in contrast, most set cells in SEF were bimodal. 4. Sensory-movement cells represented the largest population of cells recorded in FEF, responding in relation to both the presentation of the targets and the execution of the saccade. 
Although some of these sensory-movement cells resembled their counterparts in SEF by exhibiting a sustained elevation of activity, most of the FEF sensory-movement cells gave two discrete bursts, one after the presentation of the target and another before and during the saccade. Like their counterparts in SEF, the sensory-movement cells tended to be tuned for saccades into the contralateral hemifield, but this tendency was more pronounced in FEF than in SEF. The FEF sensory-movement cells discharged more briskly, with a shorter latency relative to the presentation of the target, than their counterparts in SEF. In addition, the FEF sensory-movement neurons reached their peak activation sooner than SEF sensory-movement neurons. Most FEF sensory-movement cells exhibited different patterns of activation in response to visual and auditory targets.(ABSTRACT TRUNCATED AT 400 WORDS)

