A real-time robust eye tracking system for autostereoscopic displays using stereo cameras

Author(s):  
Chan-Hung Su ◽  
Yong-Sheng Chen ◽  
Yi-Ping Hung ◽  
Chu-Song Chen ◽  
Jiun-Hung Chen
2021 ◽  
pp. 112972982098736
Author(s):  
Kaji Tatsuru ◽  
Yano Keisuke ◽  
Onishi Shun ◽  
Matsui Mayu ◽  
Nagano Ayaka ◽  
...  

Purpose: Real-time ultrasound (RTUS)-guided central venipuncture using the short-axis approach is complicated, and operators are likely to lose sight of the needle tip. We therefore used an eye tracking system to evaluate differences in eye gaze between medical students and experienced participants. Methods: Ten medical students (MS group), five residents (R group) and six pediatric surgery fellows (F group) performed a short-axis RTUS-guided venipuncture simulation using a modified vessel training system. Eye gaze was captured and recorded by an eye tracking system (Tobii Eye Tracker 4C). The evaluation endpoints were the task completion time, the total time and number of occurrences of the eye tracking marker outside the US monitor, and the success rate of venipuncture. Results: There were no significant differences in the task completion time or in the total time the tracking marker spent outside the US monitor. The number of occurrences of the eye tracking marker outside the US monitor was significantly higher in the MS group than in the F group (MS group: 9.5 ± 3.4, R group: 6.0 ± 2.9, F group: 5.2 ± 1.6; p = 0.04). The success rate of venipuncture tended to be better in the R group than in the F group. Conclusion: More experienced operators let their eyes fall outside the US monitor fewer times than less experienced ones, and eye gaze was associated with the success rate of RTUS-guided venipuncture. Repeated training with attention to eye gaze seems pivotal for mastering RTUS-guided venipuncture.
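The three-group comparison reported above (9.5 ± 3.4 vs 6.0 ± 2.9 vs 5.2 ± 1.6; p = 0.04) is the kind of result a one-way ANOVA would produce, though the abstract does not name the statistical test used. The plain-Python sketch below shows how such an F statistic is computed; the function name and the toy data are illustrative assumptions, not the study's data or method.

```python
def one_way_anova_f(groups):
    """Compute the one-way ANOVA F statistic for a list of groups.

    A minimal sketch, assuming an ANOVA-style comparison; the abstract
    does not state which test produced p = 0.04.
    """
    k = len(groups)                       # number of groups
    n = sum(len(g) for g in groups)       # total sample size
    grand_mean = sum(sum(g) for g in groups) / n

    # Between-group sum of squares: how far each group mean sits
    # from the grand mean, weighted by group size.
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    # Within-group sum of squares: spread of observations around
    # their own group mean.
    ss_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    )

    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)
```

For example, `one_way_anova_f([[1, 2, 3], [4, 5, 6]])` returns 13.5; in practice the F statistic would be converted to a p-value against the F distribution with (k − 1, n − k) degrees of freedom.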


2015 ◽  
Vol 42 (5) ◽  
pp. 2194-2202 ◽  
Author(s):  
Riccardo Via ◽  
Aurora Fassi ◽  
Giovanni Fattori ◽  
Giulia Fontana ◽  
Andrea Pella ◽  
...  

Author(s):  
Jens-Patrick Langstrand ◽  
Hoa T. Nguyen ◽  
Michael Hildebrandt

Synopticon is a software platform that fuses data from position tracking, eye tracking, and physiological sensors. Synopticon was developed to produce real-time digital representations of users. These "digital twins" can be visualized, or used by other algorithms to detect the behavioural, cognitive, or emotional state of the user. Synopticon provides 3D modelling tools based on position tracking data to define areas of interest (AOI) in the environment. By projecting the combined eye- and position-tracking data into the 3D model, Synopticon can automatically detect when a user is looking at an AOI, generate real-time heat maps, and compile statistical information. The demonstration will show how to set up and calibrate a combined position tracking and eye tracking system, and explain how Synopticon addresses some of the limitations of current eye tracking technology.
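The AOI detection the abstract describes, projecting a gaze ray into the 3D model and checking whether it meets an AOI, amounts to a ray/volume intersection test. The slab-method ray/axis-aligned-box sketch below is a generic illustration of that idea, not Synopticon's actual implementation; the function name and the box-shaped AOI are assumptions.

```python
def gaze_hits_aoi(origin, direction, box_min, box_max):
    """Slab-method ray/AABB test: does a gaze ray starting at `origin`
    and pointing along `direction` enter the axis-aligned box (the AOI)?

    A minimal sketch of AOI hit-testing; real systems may use meshes
    or other AOI shapes rather than boxes.
    """
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            # Ray is parallel to this slab: it must already lie inside.
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        # Shrink the parameter interval where the ray is inside all slabs.
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far:
            return False
    # The intersection must lie ahead of the gaze origin.
    return t_far >= 0.0
```

Running this test per frame against each AOI, and accumulating the hit times, is one straightforward way to derive dwell statistics and heat maps from fused position-and-gaze data.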

