Gaze Behavior When Reaching to Remembered Targets

2008 ◽  
Vol 100 (3) ◽  
pp. 1533-1543 ◽  
Author(s):  
J. Randall Flanagan ◽  
Yasuo Terao ◽  
Roland S. Johansson

People naturally direct their gaze to visible hand movement goals. Doing so improves reach accuracy through use of signals related to gaze position and visual feedback of the hand. Here, we studied where people naturally look when acting on remembered target locations. Four targets were presented on a screen, in peripheral vision, while participants fixated a central cross (encoding phase). Four seconds later, participants used a pen to mark the remembered locations while free to look wherever they wished (recall phase). Visual references, including the screen and the cross, were present throughout. During recall, participants neither looked at the marked locations nor suppressed eye movements. Instead, gaze behavior was erratic and consisted of gaze shifts loosely coupled in time and space with hand movements. To examine whether eye and hand movements during encoding affected gaze behavior during recall, in additional encoding conditions, participants marked the visible targets with either free gaze or central cross fixation, or just looked at the targets. All encoding conditions yielded similar erratic gaze behavior during recall. Furthermore, encoding mode did not influence recall performance, suggesting that participants, during recall, did not exploit sensorimotor memories related to hand and gaze movements during encoding. Finally, we recorded a similarly loose coupling between hand and eye movements during an object manipulation task performed in darkness after participants had viewed the task environment. We conclude that acting on remembered versus visible targets can engage fundamentally different control strategies, with gaze largely decoupled from movement goals during memory-guided actions.

2019 ◽  
Vol 121 (5) ◽  
pp. 1967-1976 ◽  
Author(s):  
Niels Gouirand ◽  
James Mathew ◽  
Eli Brenner ◽  
Frederic R. Danion

Adapting hand movements to changes in our body or the environment is essential for skilled motor behavior. Although eye movements are known to assist hand movement control, how eye movements might contribute to the adaptation of hand movements remains largely unexplored. To determine to what extent eye movements contribute to visuomotor adaptation of hand tracking, participants used a joystick to move a cursor and track a visual target that followed an unpredictable trajectory. During blocks of trials, participants were either allowed to look wherever they liked or required to fixate a cross at the center of the screen. Eye movements were tracked to ensure gaze fixation as well as to examine free gaze behavior. The cursor initially responded normally to the joystick, but after several trials, the direction in which it responded was rotated by 90°. Although fixating the eyes had a detrimental influence on hand tracking performance, participants exhibited a rather similar time course of adaptation to rotated visual feedback in the gaze-fixed and gaze-free conditions. More importantly, there was extensive transfer of adaptation between the gaze-fixed and gaze-free conditions. We conclude that although eye movements are relevant for the online control of hand tracking, they do not play an important role in the visuomotor adaptation of such tracking. These results suggest that participants do not adapt by changing the mapping between eye and hand movements, but rather by changing the mapping between hand movements and the cursor’s motion independently of eye movements. NEW & NOTEWORTHY Eye movements assist hand movements in everyday activities, but their contribution to visuomotor adaptation remains largely unknown. We compared adaptation of hand tracking under free gaze and fixed gaze.
Although our results confirm that following the target with the eyes increases the accuracy of hand movements, they unexpectedly demonstrate that gaze fixation does not hinder adaptation. These results suggest that eye movements have distinct contributions for online control and visuomotor adaptation of hand movements.
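The 90° visuomotor rotation used in this study can be illustrated as a simple remapping of joystick input to cursor motion. A minimal sketch, not the authors' code; the function name and axis conventions are assumptions made for illustration:

```python
import math

def rotate_mapping(jx, jy, angle_deg=90.0):
    """Rotate a 2-D joystick input vector by angle_deg before it drives the cursor.

    With angle_deg=0 the cursor responds normally; with angle_deg=90 the
    direction of cursor motion is rotated relative to the joystick input.
    """
    a = math.radians(angle_deg)
    cx = math.cos(a) * jx - math.sin(a) * jy
    cy = math.sin(a) * jx + math.cos(a) * jy
    return cx, cy

# Under the perturbed mapping, a rightward joystick deflection
# drives the cursor upward:
cursor_vx, cursor_vy = rotate_mapping(1.0, 0.0)  # ≈ (0.0, 1.0)
```

Adapting to such a perturbation means learning a new hand-to-cursor mapping; the study's transfer result suggests this learned mapping does not depend on where the eyes were looking.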


2020 ◽  
Vol 7 (1) ◽  
Author(s):  
John-Ross Rizzo ◽  
Mahya Beheshti ◽  
Tahereh Naeimi ◽  
Farnia Feiz ◽  
Girish Fatterpekar ◽  
...  

Background: Eye–hand coordination (EHC) is a sophisticated act that requires interconnected processes governing synchronization of ocular and manual motor systems. Precise, timely, and skillful movements such as reaching for and grasping small objects depend on the acquisition of high-quality visual information about the environment and simultaneous eye and hand control. Multiple areas in the brainstem and cerebellum, as well as some frontal and parietal structures, have critical roles in the control of eye movements and their coordination with the head. Although both cortex and cerebellum contribute critical elements to normal eye-hand function, differences in these contributions suggest that there may be separable deficits following injury.

Method: As a preliminary assessment for this perspective, we compared eye and hand-movement control in a patient with cortical stroke relative to a patient with cerebellar stroke.

Result: We found the onset of eye and hand movements to be temporally decoupled, with significant decoupling variance in the patient with cerebellar stroke. In contrast, the patient with cortical stroke displayed increased hand spatial errors and less significant temporal decoupling variance. Increased decoupling variance in the patient with cerebellar stroke was primarily due to unstable timing of rapid eye movements (saccades).

Conclusion: These findings highlight a perspective in which facets of eye-hand dyscoordination depend on lesion location and may or may not cooperate to varying degrees. Broadly speaking, the results corroborate the general notion that the cerebellum is instrumental to the process of temporal prediction for eye and hand movements, while the cortex is instrumental to the process of spatial prediction, both of which are critical aspects of functional movement control.


Author(s):  
Philipp Kreyenmeier ◽  
Heiner Deubel ◽  
Nina M. Hanning

Attention shifts that precede goal-directed eye and hand movements are regarded as markers of motor target selection. Whether effectors compete for a single, shared attentional resource during simultaneous eye-hand movements or whether attentional resources can be allocated independently towards multiple target locations is controversially debated. Independent, effector-specific target selection mechanisms underlying parallel allocation of visuospatial attention to saccade and reach targets would predict an increase of the overall attention capacity with the number of active effectors. We test this hypothesis in a modified Theory of Visual Attention (TVA; Bundesen, 1990) paradigm. Participants reported briefly presented letters during eye, hand, or combined eye-hand movement preparation to centrally cued locations. Modeling the data according to TVA allowed us to assess both the overall attention capacity and the deployment of visual attention to individual locations in the visual workspace. In two experiments, we show that attention is predominantly allocated to the motor targets, without pronounced competition between effectors. The parallel benefits at eye and hand targets, however, have concomitant costs at non-motor locations, and the overall attention capacity does not increase with the simultaneous recruitment of both effector systems. Moreover, premotor shifts of attention dominate over voluntary deployment of processing resources, yielding severe impairments of voluntary attention allocation. We conclude that attention shifts to multiple effector targets without mutual competition, given that sufficient processing resources can be withdrawn from movement-irrelevant locations.
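For readers unfamiliar with TVA, the model's core rate equation (Bundesen, 1990) divides a fixed total processing capacity among display elements in proportion to their attentional weights. A minimal sketch in Python, with hypothetical sensory-evidence (eta), bias (beta), and weight (w) values chosen only for illustration:

```python
def tva_rates(eta, beta, w):
    """TVA rate equation: v(x) = eta(x) * beta * w(x) / sum_z w(z).

    eta: sensory evidence that element x has the reported feature,
    beta: perceptual decision bias,
    w: attentional weight of each location.
    """
    total_w = sum(w.values())
    return {x: eta[x] * beta * w[x] / total_w for x in eta}

# Hypothetical values: the two motor targets (eye, hand) receive larger
# attentional weights than a movement-irrelevant location.
eta = {"eye_target": 20.0, "hand_target": 20.0, "irrelevant": 20.0}
w = {"eye_target": 0.4, "hand_target": 0.4, "irrelevant": 0.2}
rates = tva_rates(eta, beta=0.9, w=w)

# Weights only redistribute capacity: total capacity C = sum of rates stays
# fixed (here 20.0 * 0.9 = 18.0), so boosting both motor targets necessarily
# withdraws processing rate from the irrelevant location.
C = sum(rates.values())
```

Under this formulation, parallel benefits at the eye and hand targets come at the cost of non-motor locations while C remains constant, which is the pattern the study reports.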


2006 ◽  
Vol 96 (3) ◽  
pp. 1358-1369 ◽  
Author(s):  
Gerben Rotman ◽  
Nikolaus F. Troje ◽  
Roland S. Johansson ◽  
J. Randall Flanagan

We previously showed that, when observers watch an actor performing a predictable block-stacking task, the coordination between the observer's gaze and the actor's hand is similar to the coordination between the actor's gaze and hand. Both the observer and the actor direct gaze to forthcoming grasp and block landing sites and shift their gaze to the next grasp or landing site at around the time the hand contacts the block or the block contacts the landing site. Here we compare observers' gaze behavior in a block manipulation task when the observers did and when they did not know, in advance, which of two blocks the actor would pick up first. In both cases, observers managed to fixate the target ahead of the actor's hand and showed proactive gaze behavior. However, these target fixations occurred later, relative to the actor's movement, when observers did not know the target block in advance. In perceptual tests, in which observers watched animations of the actor reaching partway to the target and had to guess which block was the target, we found that the time at which observers were able to correctly do so was very similar to the time at which they would make saccades to the target block. Overall, our results indicate that observers use gaze in a fashion that is appropriate for hand movement planning and control. This in turn suggests that they implement representations of the manual actions required in the task and representations that direct task-specific eye movements.


2017 ◽  
Vol 118 (1) ◽  
pp. 404-415 ◽  
Author(s):  
Philipp Kreyenmeier ◽  
Jolande Fooken ◽  
Miriam Spering

In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object (“ball”) with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated “hit zone.” In two experiments (n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points. NEW & NOTEWORTHY In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared.
Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points.
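The trial-by-trial relation reported above is a standard Pearson correlation computed across trials. A minimal sketch with fabricated illustrative numbers, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired per-trial measures, e.g. eye and
    hand interception timing errors on the same trials."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Hypothetical timing errors (ms) on five trials. A strongly positive r means
# trials on which the eye arrived late also tended to have a late hand
# interception, regardless of the context condition.
eye_err = [12.0, -5.0, 30.0, 8.0, -14.0]
hand_err = [10.0, -2.0, 25.0, 5.0, -10.0]
r = pearson_r(eye_err, hand_err)
```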


2007 ◽  
Vol 97 (1) ◽  
pp. 761-771 ◽  
Author(s):  
Uwe J. Ilg ◽  
Stefan Schumann

The contributions of the middle superior temporal area (MST) in the posterior parietal cortex of rhesus monkeys to the generation of smooth-pursuit eye movements as well as the contributions to motion perception are well established. Here, we present the first experimental evidence that this area also contributes to the generation of goal-directed hand movements toward a moving target. This evidence is based on the outcome of intracortical microstimulation experiments and transient lesions by small injections of muscimol at identified sites within the lateral part of area MST (MST-l). When microstimulation was applied during the execution of smooth-pursuit eye movements, postsaccadic eye velocity in the direction of the preferred direction of the stimulated site increased significantly (in 93 of 136 sites tested). When microstimulation was applied during a hand movement trial, the hand movement was displaced significantly in the same direction (in 28 of 39 sites tested). When we lesioned area MST-l transiently by injections of muscimol, steady-state eye velocity was exclusively reduced for ipsiversive smooth-pursuit eye movements. In contrast, hand movements were displaced toward the contralateral side, irrespective of the direction of the moving target. Our results provide evidence that area MST-l is involved in the processing of moving targets and plays a role in the execution of smooth-pursuit eye movements as well as visually guided hand movements.


Leonardo ◽  
2001 ◽  
Vol 34 (1) ◽  
pp. 35-40 ◽  
Author(s):  
R.C. Miall ◽  
John Tchalenko

The mental processes that allow an artist to transform visual images (e.g., those of his model) into a picture on the canvas are not easily studied. The authors report work measuring the eye and hand movements of a single artist, chosen for his detailed and realistic portraits produced from life. His eye fixations when painting or drawing were of twice the duration of those when he was not painting and also quite different from those of novice artists. His eye-hand coordination pattern also showed differences from that of novices, being more temporally consistent. This preliminary work suggests that detailed and quantitative analysis of a working artist is feasible and will illuminate the process of artistic creation.


2019 ◽  
Vol 12 (2) ◽  
Author(s):  
Rebecca Martina Foerster

When performing manual actions, eye movements precede hand movements to target locations: before we grasp an object, we look at it. Eye-hand guidance is even preserved when visual targets are unavailable, e.g., when grasping behind an occlusion. This “looking-at-nothing” behavior might be functional, e.g., as a “deictic pointer” for manual control or as a memory-retrieval cue, or it might be a by-product of automatization. Here, we studied whether looking at empty locations before acting on them benefits sensorimotor performance. In five experiments, participants completed a click sequence on eight visual targets for 0-100 trials while they either had to fixate the screen center or could move their eyes freely. During 50-100 consecutive trials, participants then clicked the same sequence on a blank screen with free or fixed gaze. During both phases, participants looked at target locations whenever gaze shifts were allowed. With visual targets, target fixations led to faster, more precise clicking, fewer errors, and sparser cursor paths than central fixation. Without visual information, a small free-gaze benefit could sometimes be observed, and it reflected a memory rather than a motor-calculation benefit. Interestingly, central fixation during learning forced early explicit encoding, causing a strong benefit for acting on remembered targets later, independent of whether the eyes moved then.


2003 ◽  
Vol 9 (1) ◽  
pp. 44-54 ◽  
Author(s):  
P Feys ◽  
W F Helsen ◽  
A Lavrysen ◽  
B Nuttin ◽  
P Ketelaer

Accurate goal-directed movements toward a visual target require precise coordination of both the oculomotor and limb motor systems. Intention tremor and eye movement deficits are frequently observed in multiple sclerosis (MS). The goal of this study was to examine the characteristics of intention tremor and simultaneously produced eye movements during rapid goal-directed movements. Eye and hand movements were synchronously measured in 16 MS patients with intention tremor and 16 control subjects. Manual performance of the patient group was characterized by delayed onset, slower execution, and aiming inaccuracies. In line with the clinically defined picture of intention tremor, differences between patients and control subjects were most pronounced toward the end of the movement. Dependent variables were markedly greater in MS patients compared with control subjects and correlated well with clinical outcome measures. The application of an inertial load to the limb did not show any effect on intention tremor. In addition to impaired limb coordination, evidence was found that eye movements, too, were abnormal in patients compared with control subjects. Moreover, eye and hand movement deficits seemed to be closely related, suggesting a common underlying command structure. Inaccurate eye movements were likely to hamper accurate motor performance of the hand.


2020 ◽  
Vol 123 (4) ◽  
pp. 1439-1447
Author(s):  
Jolande Fooken ◽  
Miriam Spering

Real-world tasks, such as avoiding obstacles, require a sequence of interdependent choices to reach accurate motor actions. Yet, most studies on primate decision making involve simple one-step choices. Here we analyze motor actions to investigate how sensorimotor decisions develop over time. In a go/no-go interception task, human observers (n = 42) judged whether a briefly presented moving target would pass (interceptive hand movement required) or miss (no hand movement required) a strike box while their eye and hand movements were recorded. Go/no-go decision formation had to occur within the first few hundred milliseconds to allow time-critical interception. We found that the earliest time point at which eye movements started to differentiate actions (go versus no-go) preceded hand movement onset. Moreover, eye movements were related to different stages of decision making. Whereas higher eye velocity during smooth pursuit initiation was related to more accurate interception decisions (whether or not to act), faster pursuit maintenance was associated with more accurate timing decisions (when to act). These results indicate that pursuit initiation and maintenance are continuously linked to ongoing sensorimotor decision formation. NEW & NOTEWORTHY Here we show that eye movements are a continuous indicator of decision processes underlying go/no-go actions. We link different stages of decision formation to distinct oculomotor events during open- and closed-loop smooth pursuit. Critically, the earliest time point at which eye movements differentiate actions preceded hand movement onset, suggesting shared sensorimotor processing for eye and hand movements. These results emphasize the potential of studying eye movements as a readout of cognitive processes.
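One generic way to estimate the earliest time at which eye movements differentiate go from no-go trials is to scan the mean eye-velocity traces for a sustained difference. A simplified threshold-based sketch with fabricated traces; this is not the authors' actual analysis:

```python
def earliest_divergence(go_traces, nogo_traces, dt_ms, threshold):
    """Return the first time (ms) at which the mean eye-velocity traces of
    go and no-go trials differ by more than `threshold`, or None if never.

    go_traces / nogo_traces: lists of equal-length per-trial velocity samples.
    A real analysis would use a statistical test across trials (e.g., a
    cluster-based permutation test) rather than a fixed threshold.
    """
    n_samples = len(go_traces[0])
    for t in range(n_samples):
        go_mean = sum(tr[t] for tr in go_traces) / len(go_traces)
        nogo_mean = sum(tr[t] for tr in nogo_traces) / len(nogo_traces)
        if abs(go_mean - nogo_mean) > threshold:
            return t * dt_ms
    return None

# Fabricated traces sampled every 10 ms: eye velocity ramps up on go trials
# (pursuit of the target) but stays near zero on no-go trials.
go = [[0.0, 1.0, 4.0, 9.0], [0.0, 1.5, 5.0, 10.0]]
nogo = [[0.0, 0.5, 1.0, 1.0], [0.0, 0.5, 0.5, 1.5]]
t_div = earliest_divergence(go, nogo, dt_ms=10, threshold=2.0)
```

Comparing such a divergence time against hand movement onset is the kind of contrast that shows eye movements differentiating the action before the hand starts to move.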

