Iris Center Corneal Reflection Method for Gaze Tracking Using Visible Light

2011 ◽  
Vol 58 (2) ◽  
pp. 411-419 ◽  
Author(s):  
Jose Sigut ◽  
Sid-Ahmed Sidha
2021 ◽  
Vol 11 (2) ◽  
pp. 851
Author(s):  
Wei-Liang Ou ◽  
Tzu-Ling Kuo ◽  
Chin-Chieh Chang ◽  
Chih-Peng Fan

In this study, a pupil tracking methodology based on deep-learning technology is developed for visible-light wearable eye trackers. By applying object detection based on the You Only Look Once (YOLO) model, the proposed pupil tracking method can effectively estimate and predict the center of the pupil under visible light. When the developed YOLOv3-tiny-based model is tested for pupil tracking performance, the detection accuracy reaches 80% and the recall is close to 83%. In addition, the average visible-light pupil tracking errors of the proposed YOLO-based deep-learning design are smaller than 2 pixels in the training mode and 5 pixels in the cross-person test, much smaller than those of the previous ellipse-fitting design without deep learning under the same visible-light conditions. After combination with a calibration process, the average gaze tracking errors of the proposed YOLOv3-tiny-based pupil tracking models are smaller than 2.9 and 3.5 degrees in the training and testing modes, respectively, and the proposed visible-light wearable gaze tracking system runs at up to 20 frames per second (FPS) on a GPU-based embedded software platform.
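To make the detection step above concrete, the short Python sketch below shows how a YOLO-style bounding box around the pupil could be reduced to a pupil-center estimate and scored as a pixel error. The bounding boxes and ground-truth centers are hypothetical stand-ins for YOLOv3-tiny output, not the authors' data or code.

```python
# Minimal sketch: turn a detector's pupil bounding box into a center estimate
# and measure the pixel error, as reported in the abstract above.
import numpy as np

def box_center(box):
    """Return the (x, y) center of an (x_min, y_min, x_max, y_max) box."""
    x_min, y_min, x_max, y_max = box
    return np.array([(x_min + x_max) / 2.0, (y_min + y_max) / 2.0])

def pixel_error(pred_box, gt_center):
    """Euclidean distance in pixels between predicted and true pupil centers."""
    return float(np.linalg.norm(box_center(pred_box) - np.asarray(gt_center)))

# Hypothetical detections on two eye-camera frames: (predicted box, true center).
detections = [((310, 215, 342, 247), (325.0, 230.0)),
              ((298, 220, 330, 254), (315.0, 236.0))]

errors = [pixel_error(box, gt) for box, gt in detections]
print("mean pupil-center error: %.2f px" % np.mean(errors))
```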


2016 ◽  
Vol 2016 ◽  
pp. 1-14 ◽  
Author(s):  
Onur Ferhat ◽  
Fernando Vilariño

Despite the availability of accurate, commercial gaze tracker devices working with infrared (IR) technology, visible light gaze tracking constitutes an interesting alternative by allowing scalability and removing hardware requirements. In recent years, this field has seen research demonstrating performance comparable to the IR alternatives. In this work, we survey previous work on remote, visible light gaze trackers and analyze the explored techniques from various perspectives such as calibration strategies, head pose invariance, and gaze estimation techniques. We also provide information on related aspects of research, such as public datasets to test against, open source projects to build upon, and gaze tracking services to use directly in applications. With all this information, we aim to provide contemporary and future researchers with a map detailing previously explored ideas and the required tools.


2013 ◽  
Vol 655-657 ◽  
pp. 1066-1076 ◽  
Author(s):  
Bo Zhu ◽  
Peng Yun Zhang ◽  
Jian Nan Chi ◽  
Tian Xia Zhang

A new gaze tracking method for a single-camera gaze tracking system is proposed. The method can be divided into face and eye localization, eye feature detection and gaze parameter extraction, and ELM-based gaze point estimation. For face and eye localization, a detection method that combines a skin color model with the Adaboost method is used for fast face detection. For eye feature and gaze parameter extraction, several image processing methods are used to detect eye features such as the iris center and the inner eye corner; the gaze parameter is then obtained as the vector from the iris center to the eye corner. Finally, an ELM (extreme learning machine) based method is proposed to estimate the gaze point on the screen by establishing the mapping between the gaze parameter and the gaze point. The experimental results illustrate that the proposed method is effective for gaze estimation in a single-camera gaze tracking system.
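To illustrate the last stage, the sketch below implements a basic extreme learning machine regressor that maps the iris-center-to-eye-corner vector to a 2-D screen point, following the general ELM recipe (fixed random hidden layer, least-squares output weights). The calibration data are synthetic and the class is illustrative, not the paper's implementation.

```python
# Minimal ELM sketch: map a 2-D gaze parameter (iris-to-corner vector)
# to a 2-D screen gaze point using randomly projected hidden units.
import numpy as np

rng = np.random.default_rng(0)

class ELMRegressor:
    def __init__(self, n_hidden=50):
        self.n_hidden = n_hidden

    def fit(self, X, Y):
        # Random input weights and biases stay fixed; only output weights are learned.
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)       # hidden-layer activations
        self.beta = np.linalg.pinv(H) @ Y      # least-squares output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Synthetic calibration samples: gaze parameter (dx, dy) -> screen point (u, v).
X = rng.uniform(-20, 20, size=(100, 2))                  # iris-to-corner vectors (px)
Y = X @ np.array([[30.0, 2.0], [1.5, 25.0]]) + 500.0     # made-up screen mapping

elm = ELMRegressor().fit(X, Y)
print(elm.predict(X[:3]))                                # predicted screen points
```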


2013 ◽  
Vol 59 (2) ◽  
pp. 415-421 ◽  
Author(s):  
Seung-Jin Baek ◽  
Kang-A Choi ◽  
Chunfei Ma ◽  
Young-Hyun Kim ◽  
Sung-Jea Ko

Author(s):  
Chi Jian-nan ◽  
Zhang Peng-yi ◽  
Zheng Si-yi ◽  
Zhang Chuang ◽  
Huang Ying

2018 ◽  
Vol 11 (4) ◽  
Author(s):  
Feng Xiao ◽  
Dandan Zheng ◽  
Kejie Huang ◽  
Yue Qiu ◽  
Haibin Shen

Gaze tracking is a human-computer interaction technology that has been widely studied in academia and industry. However, constrained by the performance of specific sensors and algorithms, it has not yet been widely adopted. This paper proposes a single-camera gaze tracking system under natural light to improve its versatility. The iris center and anchor point are the most crucial factors for the accuracy of the system. The iris center is accurately detected by a simple active contour (snakuscule), initialized using prior knowledge of eye anatomical dimensions. A novel anchor point is then computed from stable facial landmarks. Next, second-order mapping functions use the eye vectors and the head pose to estimate the points of regard. Finally, the gaze errors are reduced by applying a weight coefficient to the points of regard of the left and right eyes. The iris center localization achieves an accuracy of 98.87% on the GI4E database at a normalized error below 0.05. The accuracy of the gaze tracking method is superior to state-of-the-art appearance-based and feature-based methods on the EYEDIAP database.
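The second-order mapping step can be sketched as a quadratic polynomial fit from the eye vector to the point of regard, followed by a weighted fusion of the two eyes. The calibration data, the weight value, and the omission of head-pose terms below are simplifying assumptions for brevity, not the paper's exact formulation.

```python
# Minimal sketch: second-order (quadratic) mapping from an eye vector to a
# point of regard, plus weighted fusion of the left- and right-eye estimates.
import numpy as np

def quad_features(v):
    """Second-order polynomial terms of a 2-D eye vector (x, y)."""
    x, y = v
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_mapping(eye_vecs, targets):
    """Least-squares fit of the quadratic mapping on calibration samples."""
    Phi = np.array([quad_features(v) for v in eye_vecs])
    coeffs, *_ = np.linalg.lstsq(Phi, targets, rcond=None)
    return coeffs                                  # shape (6, 2)

rng = np.random.default_rng(1)
eye_vecs = rng.uniform(-10, 10, size=(40, 2))      # synthetic calibration vectors
targets = eye_vecs @ np.array([[50.0, 3.0], [2.0, 45.0]]) + 640.0  # screen points

coeffs = fit_mapping(eye_vecs, targets)

def point_of_regard(v, coeffs):
    return quad_features(v) @ coeffs

# Weighted fusion of the left/right eye estimates (weight w is a tunable choice).
w = 0.5
por_left = point_of_regard((3.0, -1.0), coeffs)
por_right = point_of_regard((2.5, -0.8), coeffs)
print(w * por_left + (1.0 - w) * por_right)
```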

