Eye-hand coordination during flexible manual interception of an abruptly appearing, moving target

2018 ◽  
Vol 119 (1) ◽  
pp. 221-234 ◽  
Author(s):  
Yuhui Li ◽  
Yong Wang ◽  
He Cui

As a vital skill in an evolving world, interception of moving objects relies on accurate prediction of target motion. In natural circumstances, active gaze shifts often accompany hand movements when exploring targets of interest, but how eye and hand movements are coordinated during manual interception, and how they depend on visual prediction, remain unclear. Here, we trained gaze-unrestrained monkeys to manually intercept targets appearing at random locations and circularly moving with random speeds. We found that well-trained animals were able to intercept the targets with adequate compensation for both sensory transmission and motor delays. Before interception, the animals' gaze followed the targets with adequate compensation for the sensory delay, but not for the extra target displacement occurring during the eye movements. Both hand and eye movements were modulated by target kinematics, and their reaction times were correlated. Moreover, retinal errors and reaching errors were correlated across different stages of reach execution. Our results reveal eye-hand coordination during manual interception, yet the eye and hand movements may show different levels of prediction depending on the task context.

NEW & NOTEWORTHY Here we studied the eye-hand coordination of monkeys during flexible manual interception of a moving target. Eye movements were untrained and not explicitly associated with reward. We found that the initial saccades toward the moving target adequately compensated for sensory transmission delays, but not for extra target displacement, whereas the reaching arm movements fully compensated for sensorimotor delays, suggesting that the mode of eye-hand coordination strongly depends on behavioral context.
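The delay compensation described above can be made concrete with a toy calculation: for a target moving on a circle at angular speed ω, compensating a combined sensorimotor delay τ amounts to aiming at the extrapolated angle θ + ωτ rather than the currently sensed angle θ. A minimal sketch, with all numbers illustrative rather than taken from the study:

```python
import math

def predicted_target_position(theta, omega, tau, radius):
    """Extrapolate a circularly moving target's position ahead by delay tau.

    theta:  currently sensed target angle (rad)
    omega:  angular speed (rad/s), sign gives direction
    tau:    sensorimotor delay to compensate (s)
    radius: radius of the circular path
    """
    theta_future = theta + omega * tau
    return (radius * math.cos(theta_future),
            radius * math.sin(theta_future))

# Illustrative numbers only: target at 0 rad, moving at pi rad/s,
# compensating a 150 ms visuomotor delay on a 10 cm circle.
x, y = predicted_target_position(0.0, math.pi, 0.15, 10.0)
print(f"aim point: ({x:.2f}, {y:.2f}) cm")
```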

2000 ◽  
Vol 24 (2) ◽  
pp. 335-338 ◽  
Author(s):  
Matthieu Lenoir ◽  
Luc Crevits ◽  
Maarten Goethals ◽  
Peter Duyck ◽  
Joanne Wildenbeest ◽  
...  

2020 ◽  
Vol 7 (1) ◽  
Author(s):  
John-Ross Rizzo ◽  
Mahya Beheshti ◽  
Tahereh Naeimi ◽  
Farnia Feiz ◽  
Girish Fatterpekar ◽  
...  

Background: Eye–hand coordination (EHC) is a sophisticated act that requires interconnected processes governing the synchronization of ocular and manual motor systems. Precise, timely, and skillful movements, such as reaching for and grasping small objects, depend on the acquisition of high-quality visual information about the environment and simultaneous eye and hand control. Multiple areas in the brainstem and cerebellum, as well as some frontal and parietal structures, have critical roles in the control of eye movements and their coordination with the head. Although both cortex and cerebellum contribute critical elements to normal eye-hand function, differences in these contributions suggest that there may be separable deficits following injury.

Method: As a preliminary assessment of this perspective, we compared eye- and hand-movement control in a patient with cortical stroke relative to a patient with cerebellar stroke.

Result: We found the onset of eye and hand movements to be temporally decoupled, with significant decoupling variance in the patient with cerebellar stroke. In contrast, the patient with cortical stroke displayed increased hand spatial errors and less significant temporal decoupling variance. The increased decoupling variance in the patient with cerebellar stroke was primarily due to unstable timing of rapid eye movements (saccades).

Conclusion: These findings highlight a perspective in which facets of eye-hand dyscoordination depend on lesion location and may or may not cooperate to varying degrees. Broadly speaking, the results corroborate the general notion that the cerebellum is instrumental to the process of temporal prediction for eye and hand movements, while the cortex is instrumental to the process of spatial prediction, both of which are critical aspects of functional movement control.
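One simple way to operationalize the temporal decoupling reported here is the trial-by-trial difference between eye and hand movement onsets, whose variance captures unstable relative timing. The sketch below assumes that reading; the paper's exact metric may differ, and the onset times are invented:

```python
import numpy as np

def decoupling_stats(eye_onsets, hand_onsets):
    """Per-trial eye-hand onset differences and their variability.

    eye_onsets, hand_onsets: movement onset times (s) per trial.
    Returns the mean decoupling and its variance; a high variance
    would reflect unstable relative timing, as in the cerebellar case.
    """
    d = np.asarray(hand_onsets) - np.asarray(eye_onsets)
    return d.mean(), d.var(ddof=1)

# Illustrative onset times (s), not patient data.
eye = [0.18, 0.21, 0.35, 0.15, 0.28]
hand = [0.40, 0.42, 0.41, 0.43, 0.39]
mean_d, var_d = decoupling_stats(eye, hand)
print(f"mean decoupling {mean_d:.3f} s, variance {var_d:.4f} s^2")
```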


2007 ◽  
Vol 97 (1) ◽  
pp. 761-771 ◽  
Author(s):  
Uwe J. Ilg ◽  
Stefan Schumann

The contributions of the middle superior temporal area (MST) in the posterior parietal cortex of rhesus monkeys to the generation of smooth-pursuit eye movements, as well as to motion perception, are well established. Here, we present the first experimental evidence that this area also contributes to the generation of goal-directed hand movements toward a moving target. This evidence is based on the outcome of intracortical microstimulation experiments and transient lesions induced by small injections of muscimol at identified sites within the lateral part of area MST (MST-l). When microstimulation was applied during the execution of smooth-pursuit eye movements, postsaccadic eye velocity increased significantly in the preferred direction of the stimulated site (in 93 of 136 sites tested). When microstimulation was applied during a hand-movement trial, the hand movement was displaced significantly in the same direction (in 28 of 39 sites tested). When we lesioned area MST-l transiently by injections of muscimol, steady-state eye velocity was reduced exclusively for ipsiversive smooth-pursuit eye movements. In contrast, hand movements were displaced toward the contralateral side, irrespective of the direction of the moving target. Our results provide evidence that area MST-l is involved in the processing of moving targets and plays a role in the execution of smooth-pursuit eye movements as well as visually guided hand movements.


PLoS ONE ◽  
2020 ◽  
Vol 15 (11) ◽  
pp. e0242818
Author(s):  
Jordan Navarro ◽  
Emma Hernout ◽  
François Osiurak ◽  
Emanuelle Reynaud

Eye and hand movements are known to be coordinated during a variety of tasks. While steering a vehicle, gaze has been observed to be tightly linked with steering wheel angle changes over time, with the eyes leading the hands. In this experiment, participants were asked to drive a winding road composed of bends with systematically manipulated radii of curvature, under regular and automatic steering conditions. With automatic steering, the vehicle followed the road, but the steering wheel and the participants' hands did not move. Despite the absence of physical eye-hand coordination in that condition, gaze remained coordinated with the steering actions the hands would have had to produce, just as under regular steering. This result provides convincing evidence that eye movements do more than just guide the hands. In addition, eye-hand coordination was found to be intermittent as well as context- and person-dependent.
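The eyes-lead-the-hands relationship is commonly quantified with a lagged cross-correlation between the gaze and steering-wheel signals; the lag that maximizes the correlation estimates how far gaze leads. A minimal sketch of that analysis (not the authors' code; the signals and numbers are illustrative):

```python
import numpy as np

def gaze_lead(gaze, wheel, fs):
    """Estimate how far gaze leads the steering signal, in seconds.

    gaze, wheel: equally sampled time series (e.g., horizontal gaze
    angle and steering wheel angle); fs: sampling rate in Hz.
    Returns the lag maximizing the cross-correlation; a positive
    value means gaze changes precede wheel changes.
    """
    g = (gaze - np.mean(gaze)) / np.std(gaze)
    w = (wheel - np.mean(wheel)) / np.std(wheel)
    xcorr = np.correlate(w, g, mode="full")
    lags = np.arange(-len(g) + 1, len(g))
    return lags[np.argmax(xcorr)] / fs

# Illustrative signals: the wheel is a delayed copy of gaze (0.5 s lead).
fs = 50
t = np.arange(0, 20, 1 / fs)
gaze = np.sin(0.4 * np.pi * t)
wheel = np.roll(gaze, int(0.5 * fs))
print(f"estimated gaze lead: {gaze_lead(gaze, wheel, fs):.2f} s")
```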


2012 ◽  
Vol 17 (4) ◽  
pp. 257-265 ◽  
Author(s):  
Carmen Munk ◽  
Günter Daniel Rey ◽  
Anna Katharina Diergarten ◽  
Gerhild Nieding ◽  
Wolfgang Schneider ◽  
...  

An eye-tracker experiment investigated 4-, 6-, and 8-year-old children's cognitive processing of film cuts. Nine short film sequences with or without editing errors were presented to 79 children. Eye movements up to 400 ms after the targeted film cuts were measured and analyzed using a new calculation formula based on Manhattan metrics. No age effects were found for jump cuts (i.e., small movement discontinuities in a film). However, disturbances resulting from reversed-angle shots (i.e., a switch of the left-right position of actors in successive shots) led to increased reaction times in 6- and 8-year-old children, whereas children of all age groups had difficulties coping with narrative discontinuity (i.e., disruption of the canonical chronological sequence of film actions). Furthermore, 4-year-old children showed a greater number of overall eye movements than 6- and 8-year-old children. This indicates that some viewing skills develop between 4 and 6 years of age. The results of the study provide evidence of a crucial time span for the acquisition of television-based media literacy between 4 and 8 years of age.
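The paper's exact formula is not reproduced here, but the underlying Manhattan (city-block) metric is simply |Δx| + |Δy| between successive gaze positions. A minimal sketch under that assumption, with hypothetical gaze samples:

```python
def manhattan_gaze_path(samples):
    """Total Manhattan (city-block) gaze displacement over samples.

    samples: list of (x, y) gaze positions in screen coordinates.
    The distance between successive positions is |dx| + |dy|; this
    shows only the underlying metric, not the paper's full formula.
    """
    total = 0.0
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        total += abs(x1 - x0) + abs(y1 - y0)
    return total

# Illustrative gaze samples (pixels) in the 400 ms after a cut.
print(manhattan_gaze_path([(512, 384), (600, 390), (580, 420)]))
```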


Perception ◽  
10.1068/p3066 ◽  
2000 ◽  
Vol 29 (6) ◽  
pp. 675-692 ◽  
Author(s):  
Beena Khurana ◽  
Katsumi Watanabe ◽  
Romi Nijhawan

Objects flashed in alignment with moving objects appear to lag behind [Nijhawan, 1994 Nature (London) 370 256–257]. Could this ‘flash-lag’ effect be due to attentional delays in bringing flashed items to perceptual awareness [Titchener, 1908/1973 Lectures on the Elementary Psychology of Feeling and Attention first published 1908 (New York: Macmillan); reprinted 1973 (New York: Arno Press)]? We overtly manipulated attentional allocation in three experiments to address the following questions: Is the flash-lag effect affected when attention is (a) focused on a single event in the presence of multiple events, (b) distributed over multiple events, and (c) diverted from the flashed object? To address the first two questions, five rings, moving along a circular path, were presented while observers attentively tracked one or multiple rings under four conditions: the ring in which the disk was flashed was (i) known or (ii) unknown (randomly selected from the set of five); the location of the flashed disk was (i) known or (ii) unknown (randomly selected from ten locations). The third question was investigated by using two moving objects in a cost–benefit cueing paradigm. An arrow cued, with 70% or 80% validity, the position of the flashed object. Observers performed two tasks: (a) reacted as quickly as possible to flash onset; (b) reported the flash-lag effect. We obtained a significant and unaltered flash-lag effect under all the attentional conditions we employed. Furthermore, though reaction times were significantly shorter for validly cued flashes, the flash-lag effect remained uninfluenced by cue validity, indicating that quicker responses to validly cued locations may be due to the shortening of post-perceptual delays in motor responses rather than perceptual facilitation. We conclude that the computations that give rise to the flash-lag effect are independent of attentional deployment.


2015 ◽  
Vol 734 ◽  
pp. 203-206
Author(s):  
En Zeng Dong ◽  
Sheng Xu Yan ◽  
Kui Xiang Wei

In order to enhance the rapidity and accuracy of moving-target detection and tracking, and to improve the speed of the algorithm on the DSP (digital signal processor), an active visual tracking system was designed based on a Gaussian mixture background model and the Meanshift algorithm on the DM6437. The system uses the VLIB library developed by TI: moving objects are detected with the Gaussian mixture background model, and the target is tracked in RGB space with the Meanshift algorithm based on color features. Finally, the system was tested on the hardware platform and verified to be fast and accurate.
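A PC-side approximation of this pipeline can be sketched with OpenCV standing in for TI's VLIB: a Gaussian mixture background model detects the moving object, and Meanshift on a color histogram then tracks it. The sketch uses the common hue-histogram variant rather than the RGB-space tracking described above, and the video file name is hypothetical:

```python
import cv2

cap = cv2.VideoCapture("input.avi")  # hypothetical test video
bg = cv2.createBackgroundSubtractorMOG2()

track_window = None
roi_hist = None
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if track_window is None:
        # Detection stage: the largest foreground blob becomes the
        # target (a real system would let the model warm up first).
        fg = bg.apply(frame)
        contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
            track_window = (x, y, w, h)
            # Color model of the target region (hue histogram).
            hsv = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
            roi_hist = cv2.calcHist([hsv], [0], None, [180], [0, 180])
            cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    else:
        # Tracking stage: Meanshift on the back-projected color model.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        _, track_window = cv2.meanShift(back_proj, track_window, term_crit)
        x, y, w, h = track_window
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cap.release()
```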


2002 ◽  
Vol 13 (2) ◽  
pp. 125-129 ◽  
Author(s):  
Hirokazu Ogawa ◽  
Yuji Takeda ◽  
Akihiro Yagi

Inhibitory tagging is a process that prevents focal attention from revisiting previously checked items in inefficient searches, facilitating search performance. Recent studies suggested that inhibitory tagging is object rather than location based, but it was unclear whether inhibitory tagging operates on moving objects. The present study investigated the tagging effect on moving objects. Participants were asked to search for a moving target among randomly and independently moving distractors. After either efficient or inefficient search, participants performed a probe detection task that measured the inhibitory effect on search items. The inhibitory effect on distractors was observed only after inefficient searches. The present results support the concept of object-based inhibitory tagging.


2021 ◽  
pp. 2150048
Author(s):  
Hamidreza Namazi ◽  
Avinash Menon ◽  
Ondrej Krejcar

Our eyes constantly explore the surrounding environment, and the brain controls the eyes' activities through the nervous system. Analyzing the correlation between the activities of the eyes and the brain is therefore an important area of research in vision science. This paper evaluates the coupling between the reactions of the eyes and the brain in response to different moving visual stimuli. Since both eye movements and EEG signals (as an indicator of brain activity) carry information, we employed Shannon entropy to decode the coupling between them. Ten subjects looked at four moving objects (dynamic visual stimuli) with different information contents while we recorded their EEG signals and eye movements. The results demonstrated that the changes in the information contents of eye movements and EEG signals are strongly correlated ([Formula: see text]), indicating a strong correlation between brain and eye activities. This analysis could be extended to evaluate the correlation between the brain and the activities of other organs.
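One plausible reading of the entropy analysis: bin each signal, estimate Shannon entropy over successive windows, and correlate the resulting entropy time courses of the eye and EEG signals. The sketch below follows that reading with surrogate data; the paper's exact pipeline and window choices are not specified here:

```python
import numpy as np
from scipy.stats import pearsonr

def shannon_entropy(x, bins=16):
    """Shannon entropy (bits) of a 1-D signal via a histogram estimate."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def windowed_entropy(signal, win):
    """Entropy in consecutive non-overlapping windows of length win."""
    n = len(signal) // win
    return np.array([shannon_entropy(signal[i*win:(i+1)*win])
                     for i in range(n)])

# Surrogate signals standing in for gaze position and EEG; a shared
# drive makes their information contents co-vary, as in the study.
rng = np.random.default_rng(0)
drive = rng.standard_normal(8000).cumsum()
gaze = drive + rng.standard_normal(8000)
eeg = 0.5 * drive + rng.standard_normal(8000)

r, p = pearsonr(windowed_entropy(gaze, 400), windowed_entropy(eeg, 400))
print(f"entropy-course correlation r = {r:.2f} (p = {p:.3g})")
```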

