Intercepting moving objects: do eye movements matter?

Author(s): Eli Brenner ◽ Jeroen B. J. Smeets

2018 ◽ Vol 119 (1) ◽ pp. 221-234
Author(s): Yuhui Li ◽ Yong Wang ◽ He Cui

As a vital skill in an evolving world, interception of moving objects relies on accurate prediction of target motion. In natural circumstances, active gaze shifts often accompany hand movements when exploring targets of interest, but how eye and hand movements are coordinated during manual interception, and how they depend on visual prediction, remain unclear. Here, we trained gaze-unrestrained monkeys to manually intercept targets that appeared at random locations and moved circularly at random speeds. We found that well-trained animals intercepted the targets with adequate compensation for both sensory transmission and motor delays. Before interception, the animals' gaze followed the targets with adequate compensation for the sensory delay, but not for the extra target displacement that occurred during the eye movements. Both hand and eye movements were modulated by target kinematics, and their reaction times were correlated. Moreover, retinal errors and reaching errors were correlated across different stages of reach execution. Our results reveal close eye-hand coordination during manual interception, with the eye and hand movements showing different degrees of prediction depending on task context.

NEW & NOTEWORTHY Here we studied the eye-hand coordination of monkeys during flexible manual interception of a moving target. Eye movements were untrained and not explicitly associated with reward. We found that the initial saccades toward the moving target adequately compensated for sensory transmission delays, but not for extra target displacement, whereas the reaching arm movements fully compensated for sensorimotor delays, suggesting that the mode of eye-hand coordination strongly depends on behavioral context.
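The delay compensation described here can be pictured as extrapolating the target's circular motion over the sensorimotor latency. Below is a minimal sketch under that reading; the pure-extrapolation rule, parameter names, and numbers are illustrative assumptions, not the monkeys' measured strategy.

```python
import numpy as np

def aim_point(theta_now, omega, radius, delay):
    """Aim at where a circularly moving target will be after `delay` seconds,
    i.e., full compensation for sensorimotor delay by extrapolating the
    target's angular motion. A toy rule, not the animals' actual strategy."""
    theta_future = theta_now + omega * delay
    return radius * np.cos(theta_future), radius * np.sin(theta_future)

# Target at 90 deg moving at 180 deg/s: compensating for a 150 ms delay
# means aiming ~27 deg ahead of the target's currently seen position.
print(aim_point(theta_now=np.pi / 2, omega=np.pi, radius=10.0, delay=0.15))
```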


2021 ◽ pp. 2150048
Author(s): Hamidreza Namazi ◽ Avinash Menon ◽ Ondrej Krejcar

Our eyes constantly explore the surrounding environment, and the brain controls their activity through the nervous system. Analyzing the correlation between the activities of the eyes and the brain is therefore an important area of research in vision science. This paper evaluates the coupling between the reactions of the eyes and the brain in response to different moving visual stimuli. Since both eye movements and EEG signals (as an indicator of brain activity) carry information, we employed Shannon entropy to quantify the coupling between them. Ten subjects looked at four moving objects (dynamic visual stimuli) with different information contents while we recorded their EEG signals and eye movements. The results demonstrated that the changes in the information contents of eye movements and EEG signals are strongly correlated ([Formula: see text]), indicating a strong correlation between brain and eye activities. This analysis could be extended to evaluate the correlation between the activities of other organs and the brain.
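As a rough illustration of the entropy-based analysis described above, Shannon entropy can be computed from each signal's amplitude histogram and then correlated across stimuli. The binning, trial structure, and variable names below are assumptions for the sketch, not the authors' actual pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

def shannon_entropy(signal, n_bins=32):
    """Shannon entropy (bits) of a 1-D signal's amplitude distribution."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# Hypothetical recordings: one eye-position trace and one EEG trace per
# moving stimulus (shapes and random data are placeholders).
rng = np.random.default_rng(0)
eye_trials = [rng.normal(size=5000) for _ in range(4)]
eeg_trials = [rng.normal(size=5000) for _ in range(4)]

eye_entropy = [shannon_entropy(x) for x in eye_trials]
eeg_entropy = [shannon_entropy(x) for x in eeg_trials]

r, p_value = pearsonr(eye_entropy, eeg_entropy)
print(f"correlation of entropies across stimuli: r = {r:.2f}")
```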


Author(s):  
Robert J. K. Jacob

The problem of human-computer interaction can be viewed as two powerful information processors (human and computer) attempting to communicate with each other via a narrow-bandwidth, highly constrained interface (Tufte, 1989). To address it, we seek faster, more natural, and more convenient means for users and computers to exchange information. The user's side is constrained by the nature of human communication organs and abilities; the computer's is constrained only by the input/output devices and interaction techniques we can invent. Current technology has been stronger in the computer-to-user direction than the user-to-computer, hence today's user-computer dialogues are rather one-sided, with the bandwidth from the computer to the user far greater than that from user to computer. Using eye movements as a user-to-computer communication medium can help redress this imbalance.

This chapter describes the relevant characteristics of the human eye, eye-tracking technology, how to design interaction techniques that incorporate eye movements into the user-computer dialogue in a convenient and natural way, and the relationship between eye-movement interfaces and virtual environments.

As with other areas of research and design in human-computer interaction, it is helpful to build on the equipment and skills humans have acquired through evolution and experience and to search for ways to apply them to communicating with a computer. Direct manipulation interfaces have enjoyed great success largely because they draw on analogies to existing human skills (pointing, grabbing, moving objects in space), rather than trained behaviors. Similarly, we try to make use of natural eye movements in designing interaction techniques for the eye. Because eye movements are so different from conventional computer inputs, our overall approach is, wherever possible, to obtain information from a user's natural eye movements while viewing the screen, rather than requiring the user to make specific trained eye movements to actuate the system. This requires careful attention to issues of human design, as does any successful work in virtual environments. The goal is for human-computer interaction to start with studies of the characteristics of human communication channels and skills and then develop devices, interaction techniques, and interfaces that communicate effectively to and from those channels.
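One classic way to act on natural gaze without requiring trained eye gestures (consistent with, though not spelled out in, this excerpt) is dwell-time selection: a screen target is activated once it has been fixated continuously for a threshold duration. A toy sketch, with made-up targets and threshold:

```python
DWELL_THRESHOLD = 0.6  # continuous fixation time (s) treated as a selection

def hit(target, x, y):
    """True if gaze point (x, y) falls inside a circular screen target."""
    tx, ty, r = target
    return (x - tx) ** 2 + (y - ty) ** 2 <= r ** 2

def dwell_select(gaze_samples, targets, threshold=DWELL_THRESHOLD):
    """Return the first target fixated continuously for `threshold` seconds.

    gaze_samples: time-ordered (timestamp_s, x, y) tuples from an eye tracker.
    targets: dict mapping name -> (x, y, radius) screen regions.
    """
    current, since = None, None
    for t, x, y in gaze_samples:
        name = next((n for n, tgt in targets.items() if hit(tgt, x, y)), None)
        if name != current:
            current, since = name, t      # gaze moved to a new region
        elif name is not None and t - since >= threshold:
            return name                   # dwell threshold reached
    return None

# 60 Hz samples resting on the "open" button long enough to trigger it.
targets = {"open": (100, 100, 40), "close": (300, 100, 40)}
samples = [(i / 60, 102, 98) for i in range(60)]
print(dwell_select(samples, targets))  # -> "open"
```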


2018
Author(s): Adam P. Morris ◽ Bart Krekelberg

Summary: Humans and other primates rely on eye movements to explore visual scenes and to track moving objects. As a result, the image that is projected onto the retina, and propagated throughout the visual cortical hierarchy, is almost constantly changing and makes little sense without taking into account the momentary direction of gaze. How is this achieved in the visual system? Here we show that in primary visual cortex (V1), the earliest stage of cortical vision, neural representations carry an embedded "eye tracker" that signals the direction of gaze associated with each image. Using chronically implanted multi-electrode arrays, we recorded the activity of neurons in V1 during tasks requiring fast (exploratory) and slow (pursuit) eye movements. Neurons were stimulated with flickering, full-field luminance noise at all times. As in previous studies [1-4], we observed neurons that were sensitive to gaze direction during fixation, despite comparable stimulation of their receptive fields. We trained a decoder to translate neural activity into metric estimates of (stationary) gaze direction. This decoded signal tracked the eye accurately not only during fixation but also during fast and slow eye movements, even though the decoder had not been exposed to data from these behavioural states. Moreover, this signal lagged the real eye by approximately the time it takes new visual information to travel from the retina to cortex. Using simulations, we show that this V1 eye position signal could be used to take into account the sensory consequences of eye movements and map the fleeting positions of objects on the retina onto their stable positions in the world.
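The excerpt does not specify the decoder's form, but a simple linear readout trained on fixation data conveys the idea: learn a mapping from population firing rates to gaze direction, then apply it to activity recorded during eye movements. Everything below (ridge regression, array shapes, synthetic data) is an illustrative assumption, not the study's actual decoder.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical training data: firing rates of n_neurons on each fixation
# trial, paired with the measured gaze direction (azimuth, elevation).
rng = np.random.default_rng(1)
n_trials, n_neurons = 400, 96
rates = rng.poisson(5.0, size=(n_trials, n_neurons)).astype(float)
true_weights = rng.normal(size=(n_neurons, 2))
gaze = rates @ true_weights + rng.normal(scale=2.0, size=(n_trials, 2))

decoder = Ridge(alpha=1.0).fit(rates, gaze)  # train on fixation data only

# Apply the fixation-trained decoder to activity recorded during pursuit;
# in the study, the decoded trace tracked the eye with a ~retina-to-V1 lag.
pursuit_rates = rng.poisson(5.0, size=(50, n_neurons)).astype(float)
decoded_gaze = decoder.predict(pursuit_rates)
print(decoded_gaze[:3])
```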


2021
Author(s): Philipp Kreyenmeier ◽ Luca Kaemmer ◽ Jolande Fooken ◽ Miriam Spering

Objects in our visual environment often move unpredictably and can suddenly speed up or slow down. The ability to account for acceleration when interacting with moving objects can be critical for survival. Here, we investigate how human observers track an accelerating target with their eyes and predict its time of reappearance after a temporal occlusion by making an interceptive hand movement. The target was initially visible and accelerated for a brief period before occlusion. We tested how observers integrated target motion information by comparing three alternative models that predicted time-to-contact (TTC) based on (1) the final target velocity sample before occlusion, (2) the average target velocity before occlusion, or (3) target acceleration. We show that visually guided smooth pursuit eye movements reliably reflect target acceleration prior to occlusion. However, systematic saccade and manual interception timing errors reveal an inability to take acceleration into account when predicting TTC. Interception timing is best described by the final velocity model, which extrapolates the last available velocity sample before occlusion. These findings provide compelling evidence for differential acceleration integration mechanisms in vision-guided eye movements and prediction-guided interception, and a mechanistic explanation for the function and failure of interactions with accelerating objects.
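The three candidate models make concrete, divergent TTC predictions for an accelerating target. A small sketch of how those predictions could be computed; the parameter names and the constant-acceleration occlusion geometry are assumptions of the sketch, not the paper's notation.

```python
import numpy as np

def ttc_predictions(v0, accel, t_visible, d_occluded):
    """Predicted time-to-contact under the three candidate models.

    v0: target speed at motion onset; accel: constant acceleration during
    the visible interval; t_visible: duration of visible motion;
    d_occluded: distance the target covers while occluded.
    """
    v_final = v0 + accel * t_visible       # last visible velocity sample
    v_mean = v0 + 0.5 * accel * t_visible  # average visible velocity

    ttc_final = d_occluded / v_final       # model 1: final velocity
    ttc_mean = d_occluded / v_mean         # model 2: average velocity

    # Model 3: extrapolate with acceleration, solving
    # d = v_final * t + 0.5 * accel * t^2 for the positive root t.
    if accel == 0:
        ttc_accel = d_occluded / v_final
    else:
        ttc_accel = (-v_final + np.sqrt(v_final**2 + 2 * accel * d_occluded)) / accel

    return ttc_final, ttc_mean, ttc_accel

# For an accelerating target, the final-velocity model predicts a longer TTC
# than full acceleration extrapolation, so timing based on it arrives late.
print(ttc_predictions(v0=5.0, accel=4.0, t_visible=0.5, d_occluded=7.0))
```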


2016 ◽ Vol 9 (2)
Author(s): Kim Wende ◽ Laetitia Theunissen ◽ Marcus Missal

Causality is a unique feature of human perception. We present here a behavioral investigation of the influence of physical causality on visual pursuit of object collisions. Pursuit and saccadic eye movements of human subjects were recorded during ocular pursuit of two concurrently launched targets: one that moved according to the laws of Newtonian mechanics (the causal target) and one that moved in a physically implausible direction (the non-causal target). We found that anticipation of the collision evoked early smooth pursuit decelerations. Saccades to non-causal targets were hypermetric and had longer latencies than saccades to causal targets. In conclusion, both before and after a collision of two moving objects, the oculomotor system implicitly predicts upcoming physically plausible target trajectories.


2000 ◽ Vol 59 (2) ◽ pp. 108-114
Author(s): Kazuo Koga

Evidence is presented that eye movements strongly modulate the perceived motion of an object in an induced motion situation. We investigated whether pursuit eye movements affect motion perception, particularly the perception of target velocity, under the following stimulus conditions: (1) laterally moving objects on a computer display, (2) recurrent simple target motion, and (3) a unilaterally scrolling grid. The observers' eye movements were recorded and, at the same time, their velocity judgments were registered and analyzed in synchronization with the eye movement data. In most cases, when pursuit eye movements were synchronized with the movement of the target, the target was judged to be slow or motionless. An explanation of the results is presented based on two sources of motion information: (1) a displacement detector operating in retinal coordinates, and (2) a proprioceptive sensing unit associated with the eye movements. The veridicality of the velocity judgments was determined by the complexity of the processes that integrate the signals from the two channels.
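The two-source account at the end of this abstract resembles the classic extraretinal-signal model of motion perception, in which perceived target velocity is the sum of retinal slip and a (possibly under-gained) eye-velocity signal. A toy sketch under that reading; the gain parameter and function names are assumptions of the sketch:

```python
def perceived_velocity(retinal_slip, eye_velocity, gain=1.0):
    """Two-channel account: perceived target velocity is the sum of retinal
    image motion and an extraretinal eye-velocity signal. A gain below 1 on
    the extraretinal channel makes a perfectly pursued target (zero retinal
    slip) appear slowed or even stationary, as reported above."""
    return retinal_slip + gain * eye_velocity

# Perfect pursuit of a 10 deg/s target: no retinal slip, so an under-gained
# eye-velocity signal yields an underestimated perceived speed (7 deg/s).
print(perceived_velocity(retinal_slip=0.0, eye_velocity=10.0, gain=0.7))
```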

