A novel 2-D mapping-based remote eye gaze tracking method using two IR light sources

Author(s):  
Yong-Goo Shin ◽  
Kang-A Choi ◽  
Sung-Tae Kim ◽  
Cheol-Hwan Yoo ◽  
Sung-Jea Ko
2020 ◽  
Vol 1518 ◽  
pp. 012020
Author(s):  
Shengfu Lu ◽  
Richeng Li ◽  
Jinan Jiao ◽  
Jiaming Kang ◽  
Nana Zhao ◽  
...  

2021 ◽  
Vol 2120 (1) ◽  
pp. 012030
Author(s):  
J K Tan ◽  
W J Chew ◽  
S K Phang

Abstract The field of Human-Computer Interaction (HCI) has developed tremendously over the past decade. Smartphones and modern computers, which use touch, voice, and typing as input, are now the norm in society. To further broaden the variety of interaction, the human eyes are a good candidate for another form of HCI. The eyes convey a great deal of useful information, and various eye gaze tracking methods and algorithms have therefore been implemented across multiple sectors. However, some eye-tracking methods require infrared rays to be projected into the user's eye, which could potentially cause enzyme denaturation under extreme exposure. Therefore, to avoid the potential harm of infrared-based eye tracking, this paper proposes an image-based eye tracking system using the Viola-Jones algorithm and the Circular Hough Transform (CHT). The proposed method uses visible light instead of infrared rays to control the mouse pointer with the user's eye gaze. This research aims to enable people with hand disabilities to interact with computers using their eye gaze.
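The abstract names the Circular Hough Transform as the pupil-detection step. As a rough illustration of the idea (not the authors' implementation), the following sketch accumulates votes for circle centers of a known radius over a binary edge map; in a real pipeline the edge map would come from an eye region found by a Viola-Jones face/eye detector, and several radii would be searched:

```python
import numpy as np

def hough_circle(edges, radius):
    """Circular Hough Transform for one fixed radius.
    edges: 2-D boolean array of edge pixels.
    Returns the (row, col) of the most-voted circle center."""
    h, w = edges.shape
    acc = np.zeros((h, w), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        # each edge pixel votes for every center lying `radius` away from it
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return np.unravel_index(np.argmax(acc), acc.shape)
```

A practical system would use an optimized library routine (e.g. OpenCV's `HoughCircles`) rather than this explicit accumulator, but the voting scheme is the same.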


2008 ◽  
Vol 20 (5) ◽  
pp. 319-337 ◽  
Author(s):  
Eui Chul Lee ◽  
Kang Ryoung Park

2015 ◽  
Vol 63 (4) ◽  
pp. 879-886 ◽  
Author(s):  
A. Wojciechowski ◽  
K. Fornalczyk

Abstract Eye-gaze tracking is an aspect of human-computer interaction that is still growing in popularity. Tracking the human gaze point can help control user interfaces and may help evaluate graphical user interfaces. At the same time, professional eye-trackers are very expensive and thus unavailable to most user interface researchers and small companies. This paper presents a very effective, low-cost, computer-vision-based, interactive eye-gaze tracking method. In contrast to other authors' results, the method achieves very high precision (about 1.5 deg horizontally and 2.5 deg vertically) at 20 fps, using a simple HD web camera under reasonable environmental restrictions. The paper describes the algorithms used in the eye-gaze tracking method and the results of experimental tests, covering both static absolute point-of-interest estimation and dynamic gaze-controlled cursor steering.
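The accuracy figures above are given in degrees of visual angle. To relate such figures to on-screen error, one converts pixel error to visual angle using the screen's pixel density and the viewing distance; a minimal helper (the distance and density values in the usage note are illustrative assumptions, not from the paper):

```python
import math

def angular_error_deg(err_px, px_per_mm, distance_mm):
    """Convert an on-screen gaze error in pixels to visual angle in degrees,
    given the display pixel density and the user's viewing distance.
    Assumes the error is measured near the line of sight."""
    err_mm = err_px / px_per_mm
    return math.degrees(math.atan2(err_mm, distance_mm))
```

For instance, at an assumed viewing distance of 600 mm, a 1.5 deg error corresponds to roughly 16 mm on the screen surface.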


2010 ◽  
Vol 22 (03) ◽  
pp. 185-192 ◽  
Author(s):  
Jin-Yu Chu ◽  
Jian-De Sun ◽  
Xiao-Hui Yang ◽  
Ju Liu ◽  
Wei Liu

Gaze tracking systems have become an active research field in recent years, for handicapped persons as well as the general population. A precise mapping method plays an important role in such a system. In this paper, a novel infrared gaze tracking system based on nonuniform interpolation is proposed. In this system, the eye images to be analyzed are captured under two infrared light sources with a charge-coupled device (CCD) camera, and the user does not need to wear any device. First, the integral projection algorithm and Canny edge detection are applied to extract the pupil boundary points from the captured eye images, and the pupil center is then computed using an efficient and accurate ellipse-fitting algorithm. Finally, to estimate where the user is looking, a novel mapping method based on nonuniform interpolation is proposed. This mapping method does not need to take into account a complicated geometric eyeball model or the nonlinear mapping between pupil center coordinates and monitor screen coordinates. Experimental results show that the proposed mapping method is simple, fast, and accurate. Moreover, the system is a remote eye gaze tracking system: the user does not need to wear any device, which makes the system more comfortable to use.
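The pupil-center step above fits an ellipse to the extracted boundary points. A common way to do this (a generic least-squares conic fit, not necessarily the specific algorithm of the paper) is to fit the conic A x² + B xy + C y² + D x + E y = 1 and recover the center as the point where both partial derivatives of the conic vanish:

```python
import numpy as np

def fit_ellipse_center(pts):
    """Least-squares conic fit A x^2 + B xy + C y^2 + D x + E y = 1
    to boundary points pts (N x 2 array); returns the ellipse center,
    i.e. the point where the conic's gradient is zero."""
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x * x, x * y, y * y, x, y])
    A, B, C, D, E = np.linalg.lstsq(M, np.ones(len(pts)), rcond=None)[0]
    # gradient = 0  ->  [[2A, B], [B, 2C]] @ [xc, yc] = [-D, -E]
    return np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])
```

The fitted center would then be fed, together with calibration data, into the paper's nonuniform-interpolation mapping to obtain screen coordinates.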


2016 ◽  
Vol 55 (8) ◽  
pp. 083108
Author(s):  
Sung-Tae Kim ◽  
Kang-A Choi ◽  
Yong-Goo Shin ◽  
Mun-Cheon Kang ◽  
Sung-Jea Ko
