Performance of a Computer System for Recording and Analysing Eye Gaze Position Using an Infrared Light Device

2003 ◽  
Vol 18 (1) ◽  
pp. 39-44 ◽  
Author(s):  
Z. Ramdane-Cherif ◽  
A. Naït-Ali ◽  
J. F. Motsch ◽  
M. O. Krebs
2020 ◽  
Author(s):  
Woochul Choi ◽  
Hyeonsu Lee ◽  
Se-Bum Paik

Abstract
Bistable perception is characterized by periodic alternation between two different perceptual interpretations, a mechanism that remains poorly understood. Here, we show that perceptual decisions in bistable perception are strongly correlated with slow rhythmic eye motion, whose frequency varies across individuals. From eye gaze trajectory measurements during three types of bistable tasks, we found that each subject’s gaze position oscillates slowly (below 1 Hz) and that this frequency matches that of bistable perceptual alternation. Notably, the eyes move in opposite directions before the two opposite perceptual decisions, which enables prediction of the timing and direction of perceptual alternation from eye motion. We also found that the correlation between eye movement and perceptual decision is maintained when the alternation frequency is varied by intentionally switching or retaining the perceived state. These results suggest that periodic bistable perception is phase-locked to rhythmic eye motion.
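The matching of a slow (<1 Hz) gaze oscillation to the perceptual alternation rate can be illustrated with a simple spectral analysis. The following sketch is a hypothetical helper, not the paper's actual pipeline: it finds the dominant sub-1-Hz peak in a gaze-position trace.

```python
import numpy as np

def dominant_gaze_frequency(gaze_x, fs):
    """Return the strongest spectral peak below 1 Hz in a gaze trace.
    Hypothetical analysis helper; the paper's exact pipeline is not given."""
    x = np.asarray(gaze_x) - np.mean(gaze_x)      # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs > 0.05) & (freqs < 1.0)         # slow rhythmic band reported
    return freqs[band][np.argmax(spectrum[band])]

# Synthetic check: a 0.4 Hz oscillation sampled at 60 Hz for 60 s
rng = np.random.default_rng(0)
fs = 60.0
t = np.arange(0, 60, 1.0 / fs)
trace = np.sin(2 * np.pi * 0.4 * t) + 0.1 * rng.standard_normal(t.size)
print(round(dominant_gaze_frequency(trace, fs), 2))  # → 0.4
```

Comparing this estimate against the subject's measured alternation frequency would reproduce the kind of per-individual matching the abstract describes.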


2019 ◽  
Vol 11 (7) ◽  
pp. 143
Author(s):  
Tanaka ◽  
Takenouchi ◽  
Ogawa ◽  
Yoshikawa ◽  
Nishio ◽  
...  

In semi-autonomous robot conferencing, not only does the operator control the robot, but the robot itself also moves autonomously and can thus modify the operator’s movements (e.g., by adding social behaviors). However, the sense of agency, i.e., the degree to which the operator feels that the robot’s movement is their own, would decrease if the operator became conscious of the discrepancy between teleoperation and autonomous behavior. In this study, we developed an interface to control the robot’s head using an eye tracker. When the robot autonomously moves its eye-gaze position, the interface guides the operator’s eye movement toward this autonomous movement. The experiment showed that our interface can maintain the sense of agency, because it creates the illusion that the robot’s autonomous behavior is directed by the operator’s eye movement. This study reports the conditions under which this illusion arises in semi-autonomous robot conferencing.
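One minimal way to picture a control scheme that mixes teleoperation with autonomous gaze is a linear blend between the two targets. The function below is an illustrative assumption, not the interface described in the paper:

```python
def blended_gaze(operator_xy, autonomous_xy, alpha):
    """Blend the operator's measured gaze with the robot's autonomous gaze
    target. alpha = 0 is pure teleoperation; alpha = 1 is fully autonomous.
    The parameter and the linear blend are illustrative assumptions."""
    ox, oy = operator_xy
    ax, ay = autonomous_xy
    return ((1 - alpha) * ox + alpha * ax,
            (1 - alpha) * oy + alpha * ay)

print(blended_gaze((0.0, 0.0), (10.0, 10.0), 0.5))  # → (5.0, 5.0)
```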


2019 ◽  
Vol 42 (6) ◽  
pp. e18-e19
Author(s):  
Etty Bitton ◽  
Clémentine Branger
Keyword(s):  
Eye Gaze ◽  

2015 ◽  
Vol 15 (12) ◽  
pp. 206
Author(s):  
Celia Gagliardi ◽  
Arash Yazdanbakhsh
Keyword(s):  
Eye Gaze ◽  

Sensors ◽  
2020 ◽  
Vol 20 (7) ◽  
pp. 1917
Author(s):  
Ko-Feng Lee ◽  
Yen-Lin Chen ◽  
Chao-Wei Yu ◽  
Kai-Yi Chin ◽  
Chen-Han Wu

In this study, a head-mounted device was developed to track eye gaze and estimate the gaze point on the user’s visual plane. To provide a cost-effective vision-tracking solution, the head-mounted device combines a small endoscope camera, an infrared light source, and a mobile phone; the device is also fabricated via 3D printing to reduce costs. Based on the proposed image pre-processing techniques, the system can efficiently extract and estimate the pupil ellipse from the camera module’s images. A 3D eye model was also developed to effectively locate eye gaze points from the extracted eye images. In the experiments, the proposed system achieved average accuracy, precision, and recall rates of over 97%, demonstrating its efficiency. The system can be widely applied in Internet of Things, virtual reality, assistive-device, and human-computer interaction applications.
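The pupil-ellipse extraction step can be sketched with image moments: under IR illumination the pupil appears as the darkest blob, and its centroid and second moments give an ellipse center, axes, and orientation. This is a stand-in for the paper's pre-processing, whose details are not given in the abstract:

```python
import numpy as np

def pupil_ellipse(gray, dark_thresh=50):
    """Fit an ellipse to the darkest blob in an IR eye image via image
    moments. A stand-in for the paper's pupil-extraction stage, whose
    exact pre-processing is not specified in the abstract."""
    ys, xs = np.nonzero(gray < dark_thresh)       # candidate pupil pixels
    cx, cy = xs.mean(), ys.mean()                 # centroid = ellipse center
    cov = np.cov(np.stack([xs, ys]))              # second moments
    evals, evecs = np.linalg.eigh(cov)
    axes = 2.0 * np.sqrt(evals)                   # approximate semi-axes
    angle = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))
    return (cx, cy), axes, angle

# Synthetic eye image: bright background, dark circular pupil at (60, 40)
img = np.full((100, 100), 255, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
img[(xx - 60) ** 2 + (yy - 40) ** 2 < 100] = 0    # radius-10 "pupil"
center, axes, angle = pupil_ellipse(img)
print(tuple(round(c, 1) for c in center))  # → (60.0, 40.0)
```

In a real system the thresholded blob would first be cleaned of corneal glints and eyelid shadows before the moment computation.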


2010 ◽  
Vol 22 (03) ◽  
pp. 185-192 ◽  
Author(s):  
Jin-Yu Chu ◽  
Jian-De Sun ◽  
Xiao-Hui Yang ◽  
Ju Liu ◽  
Wei Liu

Gaze tracking has become an active research field in recent years, for handicapped persons as well as general users, and a precise mapping method plays an important role in such systems. In this paper, a novel infrared gaze tracking system based on nonuniform interpolation is proposed. In this system, the eye images to be analyzed are captured under two infrared light sources with a charge-coupled device (CCD) camera, and users do not need to wear any device. First, the integral projection algorithm and Canny edge detection are applied to extract pupil boundary points from the captured eye images, and the pupil center is then computed using an efficient and accurate ellipse-fitting algorithm. Finally, to estimate where the user is looking, a novel mapping method based on a nonuniform interpolation algorithm is proposed. This mapping method requires neither a complicated geometric eyeball model nor a nonlinear mapping between pupil-center coordinates and computer monitor screen coordinates. Experimental results show that the proposed mapping method is simple, fast, and accurate. Moreover, the system performs remote eye gaze tracking: users do not need to wear any device, which makes it more comfortable to use.
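A generic way to realize such a calibration-based, model-free mapping is inverse-distance-weighted interpolation over recorded calibration samples. The sketch below is an assumption for illustration; the paper's own interpolation formula is not given in the abstract:

```python
import numpy as np

def map_gaze(pupil_xy, calib_pupil, calib_screen, power=2):
    """Map a pupil-center coordinate to screen coordinates by
    inverse-distance-weighted interpolation over calibration samples.
    Generic nonuniform-interpolation sketch, not the paper's formula."""
    d = np.linalg.norm(calib_pupil - pupil_xy, axis=1)
    if np.any(d < 1e-9):                          # exact calibration hit
        return calib_screen[np.argmin(d)].astype(float)
    w = 1.0 / d ** power                          # closer points dominate
    return (w[:, None] * calib_screen).sum(axis=0) / w.sum()

# Four-point calibration: pupil-space unit square -> 100x100 screen
calib_pupil = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
calib_screen = np.array([[0.0, 0.0], [100.0, 0.0],
                         [0.0, 100.0], [100.0, 100.0]])
print(map_gaze(np.array([0.5, 0.5]), calib_pupil, calib_screen))  # → [50. 50.]
```

Because the mapping is built directly from calibration samples, no eyeball geometry or explicit nonlinear model is needed, which matches the motivation stated in the abstract.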

