GazePointer: A real time mouse pointer control implementation based on eye gaze tracking

INMIC, 2013
Author(s): Muhammad Usman Ghani, Sarah Chaudhry, Maryam Sohail, Muhammad Nafees Geelani

NeuroImage, 2020, Vol 216, pp. 116617
Author(s): Hyun-Chul Kim, Sangsoo Jin, Sungman Jo, Jong-Hwan Lee

2016, Vol 24, pp. 5162-5172
Author(s): Nesrin AYDIN ATASOY, Abdullah ÇAVUŞOĞLU, Ferhat ATASOY

2021, Vol 2120 (1), pp. 012030
Author(s): J K Tan, W J Chew, S K Phang

Abstract The field of Human-Computer Interaction (HCI) has developed tremendously over the past decade. Smartphones and modern computers, which rely on touch, voice, and typing for input, are already the norm in society. To further widen the variety of interaction, the human eye is a good candidate for another form of HCI. The information carried by the human eyes is extremely useful, and various eye gaze tracking methods and algorithms have therefore been implemented across multiple sectors. However, some eye-tracking methods require infrared rays to be projected into the user's eye, which could potentially cause enzyme denaturation under extreme exposure. To avoid the potential harm of infrared-based eye tracking, this paper proposes an image-based eye tracking system using the Viola-Jones algorithm and the Circular Hough Transform (CHT). The proposed method uses visible light instead of infrared rays to control the mouse pointer with the user's eye gaze. This research aims to implement the proposed algorithm so that people with hand disabilities can interact with computers using their eye gaze.
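The pipeline described in the abstract can be pictured as three stages: Viola-Jones detection of the eye region, a Circular Hough Transform to locate the iris, and a mapping from the iris position to pointer coordinates. The sketch below is a minimal illustration of that idea, assuming OpenCV's Haar cascade and HoughCircles implementations and the pyautogui library for moving the pointer; the thresholds and the naive linear mapping are illustrative assumptions, not the authors' implementation.

```python
# Minimal visible-light gaze-to-pointer sketch (illustrative, not the paper's code):
# 1) Viola-Jones (Haar cascade) finds an eye region in the webcam frame,
# 2) a Circular Hough Transform finds the iris centre inside that region,
# 3) the iris position is mapped linearly to screen coordinates.
import cv2
import pyautogui  # assumed here only for moving the mouse pointer

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")  # Viola-Jones eye detector
screen_w, screen_h = pyautogui.size()

cap = cv2.VideoCapture(0)  # ordinary webcam, i.e. visible light only
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Step 1: Viola-Jones eye detection on the grayscale frame.
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in eyes[:1]:  # use the first detected eye region
        roi = cv2.medianBlur(gray[y:y + h, x:x + w], 5)

        # Step 2: Circular Hough Transform to locate the iris in the ROI.
        circles = cv2.HoughCircles(
            roi, cv2.HOUGH_GRADIENT, dp=1, minDist=w,
            param1=100, param2=20, minRadius=w // 8, maxRadius=w // 3)
        if circles is None:
            continue
        cx, cy, _ = circles[0][0]

        # Step 3: naive linear mapping of the iris position within the ROI
        # to screen coordinates (a real system would add a calibration step).
        pyautogui.moveTo(int(cx / w * screen_w), int(cy / h * screen_h))

    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```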

