gaze shifts
Recently Published Documents

TOTAL DOCUMENTS: 268 (five years: 26)
H-INDEX: 44 (five years: 2)

eLife, 2021, Vol 10
Author(s): Sebastian H Zahler, David E Taylor, Joey Y Wong, Julia M Adams, Evan H Feinberg

Animals investigate their environments by directing their gaze towards salient stimuli. In the prevailing view, mouse gaze shifts entail head rotations followed by brainstem-mediated eye movements, including saccades to reset the eyes. These 'recentering' saccades are attributed to head movement-related vestibular cues. However, microstimulating mouse superior colliculus (SC) elicits directed head and eye movements resembling the SC-dependent sensory-guided gaze shifts of other species, suggesting that mouse gaze shifts may be more flexible than previously recognized. We investigated this possibility by tracking eye and attempted head movements in a head-fixed preparation that eliminates head movement-related sensory cues. We found that tactile stimuli evoke directionally biased saccades coincident with attempted head rotations. Differences in saccade endpoints across stimuli are associated with distinct stimulus-dependent relationships between initial eye position and saccade direction and amplitude. Optogenetic perturbations revealed that the SC drives these gaze shifts. Thus, head-fixed mice make sensory-guided, SC-dependent gaze shifts involving coincident, directionally biased saccades and attempted head movements. Our findings uncover flexibility in mouse gaze shifts and provide a foundation for studying head-eye coupling.


2021, Vol 79, pp. 102853
Author(s): Cédrick T. Bonnet, Déborah Dubrulle, José A. Barela, Luc Defebvre, Arnaud Delval

2021, Vol 12
Author(s): Maurits Adam, Christian Gumbsch, Martin V. Butz, Birgit Elsner

During the observation of goal-directed actions, infants usually predict the goal at an earlier age when the agent is familiar (e.g., a human hand) than when it is unfamiliar (e.g., a mechanical claw). These findings implicate a crucial role of the developing agentive self in infants' processing of others' action goals. Recent theoretical accounts suggest that predictive gaze behavior relies on an interplay between infants' agentive experience (top-down processes) and perceptual information about the agent and the action event (bottom-up information; e.g., agency cues). The present study examined 7-, 11-, and 18-month-old infants' predictive gaze behavior for a grasping action performed by an unfamiliar tool, depending on infants' age-related action knowledge about tool use and the display of an agency cue, namely the production of a salient action effect. The results are in line with the notion of a systematic interplay between experience-based top-down processes and cue-based bottom-up information: regardless of the salient action effect, predictive gaze shifts did not occur in the 7-month-olds (the least experienced age group) but did occur in the 18-month-olds (the most experienced age group). In the 11-month-olds, however, predictive gaze shifts occurred only when a salient action effect was presented. This sheds new light on how the developing agentive self, in interplay with available agency cues, also supports infants' action-goal prediction for observed tool-use actions.


Author(s): Eckart Zimmermann

On average, we redirect our gaze at a frequency of about 3 Hz. In real life, gaze shifts consist of combined eye and head movements. Much research has focused on how the accuracy of eye movements is monitored and calibrated. By contrast, little is known about how head movements remain accurate. I wondered whether serial dependencies between artificially induced errors in head movement targeting and the immediately following head movement might recalibrate movement accuracy. I also asked whether head movement targeting errors would influence visual localization. To this end, participants wore a head-mounted display and performed head movements to targets, which were displaced as soon as the start of the head movement was detected. I found that target displacements influenced head movement amplitudes in the same trial, indicating that participants could adjust their movement online to reach the new target location. However, I also found serial dependencies between the target displacement in trial n-1 and the head movement amplitude in the following trial n. I did not find serial dependencies between target displacements and visuomotor localization. The results reveal that serial dependencies recalibrate head movement accuracy.
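The analysis described above amounts to regressing each trial's head movement amplitude on both the same trial's target displacement (online correction) and the previous trial's displacement (serial dependence). A minimal Python sketch on simulated data; the gains, noise level, and target amplitude are invented for illustration and do not reproduce the study's actual paradigm or statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a head-movement targeting experiment (all values hypothetical).
# Each trial: a 20-deg target step, displaced mid-movement by a random jump.
n_trials = 500
displacement = rng.choice([-3.0, 0.0, 3.0], size=n_trials)  # deg, trial n

# Online correction: amplitude partially absorbs the same-trial displacement.
# Serial dependence: the previous trial's displacement also shifts amplitude.
online_gain = 0.8
serial_gain = 0.15
prev_disp = np.roll(displacement, 1)
prev_disp[0] = 0.0  # no predecessor for the first trial
amplitude = (20.0 + online_gain * displacement
             + serial_gain * prev_disp
             + rng.normal(0.0, 0.5, n_trials))  # motor noise

# Estimate both effects jointly with ordinary least squares.
X = np.column_stack([np.ones(n_trials), displacement, prev_disp])
coef, *_ = np.linalg.lstsq(X, amplitude, rcond=None)
intercept, b_online, b_serial = coef
print(b_online, b_serial)
```

With enough trials the recovered coefficients approximate the simulated gains; on real data the same regression separates within-trial corrections from trial-to-trial recalibration.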


Cognition, 2021, Vol 211, pp. 104648
Author(s): Dekel Abeles, Shlomit Yuval-Greenberg

2021, Vol 11 (1)
Author(s): Arvind Chandna, Jeremy Badler, Devashish Singh, Scott Watamaniuk, Stephen Heinen

Abstract: To clearly view approaching objects, the eyes rotate inward (vergence) and the intraocular lenses focus (accommodation). Current ocular control models assume both eyes are driven by unitary vergence and unitary accommodation commands that causally interact. The models typically describe discrete gaze shifts to non-accommodative targets performed under laboratory conditions. We probed these unitary signals using a physical stimulus moving in depth on the midline while recording vergence and accommodation simultaneously from both eyes in normal observers. Under monocular viewing, retinal disparity is removed, leaving only monocular cues for interpreting the object's motion in depth. The viewing eye always followed the target's motion. However, the occluded eye did not follow the target and, surprisingly, rotated out of phase with it. In contrast, accommodation in both eyes was synchronized with the target under monocular viewing. The results challenge existing theories of a unitary vergence command and of a causal accommodation-vergence linkage.


2021, Vol 9 (1)
Author(s): F. Robain, N. Kojovic, S. Solazzo, B. Glaser, M. Franchini, ...

Abstract
Background: Typical development of socio-communicative skills relies on keen observation of others. It thus follows that decreased social attention negatively impacts the subsequent development of socio-communicative abilities in children with autism spectrum disorders (ASD). In addition, studies indicate that social attention is modulated by context and that greater social difficulties are observed in more socially demanding situations. Our study aims to investigate the effect of social complexity on preschoolers' visual exploration of others' actions.
Methods: To investigate the impact of social complexity, we used an eye-tracking paradigm with 26 typically developing preschoolers (TD, age = 3.60 ± 1.55) and 37 preschoolers with ASD (age = 3.55 ± 1.21). Participants were shown videos of two children engaging in socially simple play (parallel) versus socially complex play (interactive). We subsequently quantified the time spent on and fixation duration towards faces, objects, bodies, and the background, as well as the number of spontaneous gaze shifts between socially relevant areas of interest.
Results: In the ASD group, we observed decreased time spent on faces. Social complexity (interactive play) elicited changes in visual exploration patterns in both groups. From the parallel to the interactive condition, we observed a shift towards socially relevant parts of the scene, a decrease in fixation duration, and an increase in spontaneous gaze shifts between faces and objects, though fewer in the ASD group.
Limitations: Our results need to be interpreted cautiously due to relatively small sample sizes, and they may generalize only to male preschoolers, given our male-only sample and reported phenotypic differences between males and females.
Conclusion: Our results suggest that, as in TD children though to a lesser extent, visual exploration patterns in ASD are modulated by context. Children with ASD who were less sensitive to context modulation showed decreased socio-communicative skills or higher levels of symptoms. Our findings support the use of naturalistic designs to capture socio-communicative deficits in ASD.
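For illustration, the eye-tracking measures used above (time spent per area of interest and spontaneous gaze shifts between faces and objects) can be computed from a stream of per-sample AOI labels. A minimal Python sketch with invented sample data and an assumed 60 Hz sampling rate; the study's actual AOI definitions and preprocessing are not reproduced here:

```python
from itertools import groupby

# Hypothetical per-sample AOI labels from an eye tracker: one label per
# gaze sample indicating which area of interest was fixated.
SAMPLE_RATE = 60.0  # Hz (assumed)
samples = (["face"] * 30 + ["object"] * 45 + ["face"] * 15 +
           ["background"] * 12 + ["body"] * 20 + ["object"] * 28)

# Time spent per AOI: sample count divided by the sampling rate.
counts = {}
for aoi in samples:
    counts[aoi] = counts.get(aoi, 0) + 1
time_on = {aoi: n / SAMPLE_RATE for aoi, n in counts.items()}  # seconds

# Collapse consecutive samples into visits, then count spontaneous gaze
# shifts between the socially relevant AOIs (face <-> object, either way).
visits = [aoi for aoi, _ in groupby(samples)]
face_object_shifts = sum(
    1 for a, b in zip(visits, visits[1:])
    if {a, b} == {"face", "object"}
)
print(time_on, face_object_shifts)
```

On real recordings one would also apply a minimum-fixation-duration threshold before counting visits, so that single noisy samples do not register as gaze shifts.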


2021
Author(s): Takaya Ogasawara, Fatih Sogukpinar, Kaining Zhang, Yang-Yang Feng, Julia Pai, ...

Humans and other primates interact with the world by observing and exploring visual objects. In particular, they often seek out opportunities to view novel objects that they have never seen before, even when those objects have no extrinsic primary reward value. However, despite the importance of novel visual objects in our daily life, we currently lack an understanding of how primate brain circuits control the motivation to seek out novelty. We found that novelty-seeking is regulated by a small, understudied subcortical region, the zona incerta (ZI). In a task in which monkeys made eye movements to familiar objects to obtain the opportunity to view novel objects, many ZI neurons were preferentially activated by predictions of future novel objects and displayed burst excitations before gaze shifts to gain access to novel objects. Low-intensity electrical stimulation of the ZI facilitated gaze shifts, while inactivation of the ZI reduced novelty-seeking. Surprisingly, additional experiments showed that this ZI-dependent novelty-seeking behavior is not regulated by the canonical neural circuitry for reward-seeking. The habenula-dopamine pathway, known to reflect reward predictions that control reward-seeking, was relatively inactive during novelty-seeking behavior in which novelty had no extrinsic reward value. Instead, high-channel-count electrophysiological experiments and anatomical tracing identified a prominent source of control signals for novelty-seeking in the anterior ventral medial temporal cortex (AVMTC), a brain region known to be crucially involved in visual processing and object memory. In addition to their well-known function of signaling the novelty or familiarity of objects in the current environment, AVMTC neurons reflected predictions of future novel objects, akin to the way neurons in reward circuitry predict future rewards in order to control reward-seeking. Our data uncover a network of primate brain areas that regulate novelty-seeking. The behavioral and neural distinctions between novelty-seeking and reward processing highlight how the brain can accomplish behavioral flexibility, providing a mechanism to explore novel objects.


2021
Author(s): Christian Gumbsch, Maurits Adam, Birgit Elsner, Martin V. Butz

From about six months of age onwards, infants start to reliably fixate the goal of an observed action, such as a grasp, before the action is complete. The available research has identified a variety of factors that influence such goal-anticipatory gaze shifts, including experience with the shown action events and familiarity with the observed agents. However, the underlying cognitive processes are still heavily debated. We propose that our minds (i) tend to structure sensorimotor dynamics into probabilistic, generative event- and event-boundary-predictive models and (ii) choose actions with the objective of minimizing predicted uncertainty. We implement this proposition by means of event-predictive learning and active inference. The implemented learning mechanism induces an inductive, event-predictive bias, thus developing schematic encodings of experienced events and event boundaries. The implemented active inference principle chooses actions by aiming at minimizing expected future uncertainty. We train our system on multiple object-manipulation events. As a result, the generation of goal-anticipatory gaze shifts emerges while learning about object manipulations: the model starts fixating the inferred goal already at the start of an observed event once it has sampled some experience with possible events and when a familiar agent (i.e., a hand) is involved. Meanwhile, the model keeps reactively tracking an unfamiliar agent (i.e., a mechanical claw) performing the same movement. We conclude that event-predictive learning combined with active inference may be critical for eliciting infant action-goal prediction.
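The action-selection principle described above (choosing fixations that minimize expected future uncertainty) can be illustrated with a toy calculation. The sketch below is not the authors' model: it uses an invented three-object belief and assumes, for simplicity, that fixating a location fully resolves whether it is the goal:

```python
import math

def entropy(p):
    # Shannon entropy (nats) of a discrete distribution.
    return -sum(q * math.log(q) for q in p if q > 0)

# Hypothetical belief over three candidate goal objects of an observed grasp.
belief = [0.7, 0.2, 0.1]

def expected_entropy_after_fixating(i, p):
    # Outcome 1: location i is the goal (prob p[i]) -> belief collapses,
    # entropy 0. Outcome 2: it is not (prob 1 - p[i]) -> renormalize the rest.
    rest = [q for j, q in enumerate(p) if j != i]
    total = sum(rest)
    posterior = [q / total for q in rest]
    return (1 - p[i]) * entropy(posterior)

scores = [expected_entropy_after_fixating(i, belief) for i in range(3)]
# Active inference picks the fixation minimizing expected uncertainty.
best = min(range(3), key=lambda i: scores[i])
print(best, scores)
```

Under these assumptions the fixation with the lowest expected posterior entropy is the most probable goal, which mirrors the model's tendency to fixate the inferred goal early in the event.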


2021, Vol 278, pp. 280-287
Author(s): Johan Lundin Kleberg, Jens Högström, Karin Sundström, Andreas Frick, Eva Serlachius
