Eye-Tracking – Visuelle Blickrichtungserfassung im dreidimensionalen Raum (Eye-Tracking – Visual Gaze Estimation in 3D Space)

2008 ◽  
Vol 75 (7-8/2008) ◽  
Author(s):  
Mohammad Yasser Al Nahlaoui
Vision ◽  
2018 ◽  
Vol 2 (3) ◽  
pp. 35 ◽  
Author(s):  
Braiden Brousseau ◽  
Jonathan Rose ◽  
Moshe Eizenman

The most accurate remote Point of Gaze (PoG) estimation methods that allow free head movements use infrared light sources and cameras together with gaze estimation models. Current gaze estimation models were developed for desktop eye-tracking systems and assume that the relative roll between the system and the subjects’ eyes (the ’R-Roll’) is roughly constant during use. This assumption does not hold for hand-held mobile-device-based eye-tracking systems. We present an analysis showing that the accuracy of estimating the PoG on screens of hand-held mobile devices depends on the magnitude of the R-Roll angle and on the angular offset between the visual and optical axes of the individual viewer. We also describe a new method to determine the PoG that compensates for the effects of R-Roll on PoG accuracy. Experimental results on a prototype infrared smartphone show that for an R-Roll angle of 90°, the new method achieves an accuracy of approximately 1°, while a gaze estimation method that assumes a constant R-Roll angle achieves an accuracy of 3.5°. The manner in which the experimental PoG estimation errors increased with the R-Roll angle was consistent with the analysis. The method presented in this paper can significantly improve the performance of eye-tracking systems on hand-held mobile devices.
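The core of the compensation idea can be sketched as follows: the angular offset between a viewer's visual and optical axes is fixed in the eye's frame, so when the device rolls relative to the eye, that offset must be rotated by the measured R-Roll angle before it is applied to the on-screen gaze estimate. This is a minimal planar sketch under simplifying assumptions; the function and variable names are illustrative and are not the paper's actual model.

```python
import numpy as np

def compensate_r_roll(optical_axis_pog_xy, kappa_offset_xy, r_roll_deg):
    """Apply the subject-specific visual/optical axis offset after rotating
    it by the measured R-Roll angle (hypothetical planar simplification)."""
    theta = np.deg2rad(r_roll_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return np.asarray(optical_axis_pog_xy) + rot @ np.asarray(kappa_offset_xy)

# With zero roll the offset is applied unchanged:
pog_upright = compensate_r_roll([0.0, 0.0], [1.0, 0.0], 0.0)
# At 90° of R-Roll, the same offset points along +y instead of +x,
# which is why a constant-roll model accrues error as the device rotates.
pog_rolled = compensate_r_roll([0.0, 0.0], [1.0, 0.0], 90.0)
```

A model that ignores R-Roll effectively always uses the upright offset, so its error grows with the roll angle, consistent with the abstract's reported 1° vs. 3.5° comparison at 90°.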


Author(s):  
Shujian Yu ◽  
Weihua Ou ◽  
Xinge You ◽  
Xiubao Jiang ◽  
Yun Zhu ◽  
...  
Keyword(s):  

Endoscopy ◽  
2018 ◽  
Vol 50 (07) ◽  
pp. 701-707 ◽  
Author(s):  
Mariam Lami ◽  
Harsimrat Singh ◽  
James Dilley ◽  
Hajra Ashraf ◽  
Matthew Edmondon ◽  
...  

Abstract
Background The adenoma detection rate (ADR) is an important quality indicator in colonoscopy. The aim of this study was to evaluate the changes in visual gaze patterns (VGPs) with increasing polyp detection rate (PDR), a surrogate marker of ADR.
Methods 18 endoscopists participated in the study. VGPs were measured using eye-tracking technology during the withdrawal phase of colonoscopy. VGPs were characterized using two analyses, screen-based and anatomy-based. Eye-tracking parameters were used to characterize performance, which was further substantiated using hidden Markov model (HMM) analysis.
Results Subjects with higher PDRs spent more time viewing the outer ring of the 3 × 3 grid for both analyses (screen-based: r = 0.56, P = 0.02; anatomy: r = 0.62, P < 0.01). Fixation distribution to the “bottom U” of the screen in screen-based analysis was positively correlated with PDR (r = 0.62, P = 0.01). HMM demarcated the VGPs into three PDR groups.
Conclusion This study defined distinct VGPs that are associated with expert behavior. These data may allow introduction of visual gaze training within structured training programs, and have implications for adoption in higher-level assessment.
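The screen-based analysis above partitions the display into a 3 × 3 grid and correlates the share of fixation time spent in the outer ring with PDR. A minimal sketch of that metric is shown below; the helper name, the uniform grid, and the duration-weighting are illustrative assumptions, not the study's actual code.

```python
import numpy as np

def outer_ring_fraction(fix_x, fix_y, durations, width, height):
    """Fraction of total fixation duration falling in the outer ring of a
    3x3 screen grid, i.e. every cell except the center (hypothetical helper)."""
    col = np.clip((np.asarray(fix_x, float) / width * 3).astype(int), 0, 2)
    row = np.clip((np.asarray(fix_y, float) / height * 3).astype(int), 0, 2)
    dur = np.asarray(durations, float)
    outer = (row != 1) | (col != 1)  # anything not in the central cell
    return dur[outer].sum() / dur.sum()

# One 2-second fixation in the center cell, one 1-second fixation in a corner:
f = outer_ring_fraction([1.5, 0.5], [1.5, 0.5], [2.0, 1.0], width=3, height=3)
```

A per-subject value like `f` could then be correlated with that subject's PDR, in the spirit of the reported r = 0.56 screen-based result.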


Author(s):  
Stefan Kohlbecher ◽  
Stanislavs Bardins ◽  
Klaus Bartl ◽  
Erich Schneider ◽  
Tony Poitschke ◽  
...  

IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 104207-104215 ◽  
Author(s):  
Meng Liu ◽  
Youfu Li ◽  
Hai Liu

Sensors ◽  
2020 ◽  
Vol 20 (2) ◽  
pp. 543 ◽  
Author(s):  
Braiden Brousseau ◽  
Jonathan Rose ◽  
Moshe Eizenman

This paper describes a low-cost, robust, and accurate remote eye-tracking system that uses an industrial prototype smartphone with integrated infrared illumination and camera. Numerous studies have demonstrated the beneficial use of eye-tracking in domains such as neurological and neuropsychiatric testing, advertising evaluation, pilot training, and automotive safety. Remote eye-tracking on a smartphone could enable significant growth in the deployment of applications in these domains. Our system uses a 3D gaze-estimation model that enables accurate point-of-gaze (PoG) estimation with free head and device motion. To accurately determine the input eye features (pupil center and corneal reflections), the system uses Convolutional Neural Networks (CNNs) together with a novel center-of-mass output layer. The use of CNNs improves the system’s robustness to the significant variability in the appearance of eye-images found in handheld eye trackers. The system was tested with 8 subjects with the device free to move in their hands and produced a gaze bias of 0.72°. Our hybrid approach, which uses artificial illumination, a 3D gaze-estimation model, and a CNN feature extractor, achieved an accuracy that is significantly (400%) better than that of current eye-tracking systems on smartphones that use natural illumination and machine-learning techniques to estimate the PoG.
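A center-of-mass output layer of the kind named above is commonly realized as a spatial softmax: the network emits a heatmap per feature (e.g. pupil center), and the layer returns the probability-weighted expected coordinates, which keeps the readout differentiable. The sketch below illustrates that general idea with numpy; it is an assumption-laden illustration, not the authors' actual architecture.

```python
import numpy as np

def center_of_mass_readout(heatmap):
    """Spatial softmax over a 2D feature heatmap, followed by the expected
    (x, y) coordinate, a differentiable 'center-of-mass' readout."""
    h, w = heatmap.shape
    p = np.exp(heatmap - heatmap.max())  # stabilized softmax
    p /= p.sum()
    ys, xs = np.mgrid[0:h, 0:w]
    return float((p * xs).sum()), float((p * ys).sum())

# A heatmap with one strong activation yields coordinates at that peak:
hm = np.full((8, 8), -50.0)
hm[2, 6] = 0.0
x, y = center_of_mass_readout(hm)
```

Compared with a hard argmax, this readout degrades gracefully when the peak is blurred, which suits the appearance variability of handheld eye images mentioned in the abstract.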


2015 ◽  
Vol 118 (2) ◽  
pp. 194-216 ◽  
Author(s):  
Kenneth A. Funes-Mora ◽  
Jean-Marc Odobez
Keyword(s):  
