Building a Low-Cost Eye-Tracking System

2012 ◽  
Vol 263-266 ◽  
pp. 2399-2402
Author(s):  
Chi Wu Huang ◽  
Zong Sian Jiang ◽  
Wei Fan Kao ◽  
Yen Lin Huang

This paper presents the development of a low-cost eye-tracking system built by modifying a commercial off-the-shelf camera and integrating it with properly tuned open-source drivers and user-defined application programs. The system configuration is proposed, and gaze tracking approximated by a least-squares polynomial mapping is described. Comparisons with other low-cost systems as well as a commercial system are provided. Our system achieved the highest image-capture rate, 180 frames per second, and performed favorably in the ISO 9241-Part 9 test in terms of response time and correct response rate. We are currently developing an application to assess gaze-tracking accuracy; real-time gaze tracking and head-movement estimation remain issues for future work.
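The least-squares polynomial mapping mentioned above can be sketched as follows. The second-order feature set and the synthetic nine-point calibration grid are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def poly_features(v):
    # Second-order polynomial terms of the eye-feature vector (vx, vy).
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])

def fit_gaze_mapping(eye_vectors, screen_points):
    # Solve min ||A C - S||^2 for both screen coordinates at once.
    A = np.array([poly_features(v) for v in eye_vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points), rcond=None)
    return coeffs  # shape (6, 2): one column per screen coordinate

def estimate_gaze(coeffs, v):
    return poly_features(v) @ coeffs

# Synthetic 9-point calibration whose ground-truth mapping is quadratic,
# so the least-squares fit can recover it exactly.
rng = np.random.default_rng(0)
eye = rng.uniform(-1.0, 1.0, size=(9, 2))
screen = np.array([[10 + 5 * vx + 2 * vx * vy, 20 + 4 * vy + vy**2]
                   for vx, vy in eye])
C = fit_gaze_mapping(eye, screen)
pred = estimate_gaze(C, eye[0])
```

In practice the calibration targets are known screen points the user fixates in turn, and the fitted coefficients are then applied to live eye-feature vectors.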

Author(s):  
ARANTXA VILLANUEVA ◽  
RAFAEL CABEZA ◽  
SONIA PORTA

In past years, research in eye-tracking development and applications has attracted much attention, and interacting with a computer using gaze information alone is becoming increasingly feasible. Efforts in eye tracking cover a broad spectrum of fields, and mathematical modeling of the system is an important aspect of this research. Expressions relating the elements and variables of the gaze tracker make it possible to establish geometric relations and to uncover symmetrical behaviors of the human eye when looking at a screen. To this end, a deep knowledge of projective geometry as well as eye physiology and kinematics is essential. This paper presents a model for a bright-pupil tracker fully based on realistic parameters describing the system elements. The system so modeled is superior to one obtained with generic linear or quadratic expressions. Moreover, knowledge of the model's symmetry leads to simpler and more effective calibration strategies: just two calibration points are needed to fit the optical axis, and only three to adjust the visual axis. By considerably reducing the time spent by systems that employ more calibration points, this yields a more attractive model.


2010 ◽  
Vol 36 (8) ◽  
pp. 1051-1061 ◽  
Author(s):  
Chuang ZHANG ◽  
Jian-Nan CHI ◽  
Zhao-Hui ZHANG ◽  
Zhi-Liang WANG

Vision ◽  
2021 ◽  
Vol 5 (3) ◽  
pp. 39
Author(s):  
Julie Royo ◽  
Fabrice Arcizet ◽  
Patrick Cavanagh ◽  
Pierre Pouget

We introduce a blind spot method to create image changes contingent on eye movements. One challenge of eye-movement research is triggering display changes contingent on gaze. The eye-tracking system must capture an image of the eye, detect and track the pupil and corneal reflections to estimate gaze position, and then transfer these data to the computer that updates the display. All of these steps introduce delays that are often difficult to predict. To avoid these issues, we describe a simple blind spot method that generates gaze-contingent display manipulations without any eye-tracking system or display controls.
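What makes such a method possible is that the blind spot sits at a roughly fixed position relative to fixation (about 15 degrees temporal and slightly below the horizontal meridian), so a stimulus placed there is invisible to the fixating eye. The sketch below uses typical textbook values and screen coordinates with y growing downward; these numbers are assumptions for illustration, and individual calibration would be needed in practice:

```python
def blind_spot_position(fix_x, fix_y, px_per_deg, right_eye=True):
    # Optic-disc location relative to fixation: ~15 deg temporal and
    # ~1.5 deg below the horizontal meridian (typical values only).
    # For the right eye, "temporal" is to the right of fixation.
    dx = 15.0 * px_per_deg * (1.0 if right_eye else -1.0)
    dy = 1.5 * px_per_deg  # screen y grows downward
    return fix_x + dx, fix_y + dy

# Fixation at screen center of a 1920x1080 display, ~40 px per degree.
x, y = blind_spot_position(960.0, 540.0, px_per_deg=40.0)
```

A probe drawn at that position can then be changed freely: as long as the viewer holds fixation, the change is retinally invisible, with no tracker latency involved.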


2022 ◽  
Vol 132 ◽  
pp. 01017
Author(s):  
Sangjip Ha ◽  
Eun-ju Yi ◽  
In-jin Yoo ◽  
Do-Hyung Park

This study applies eye tracking to robot appearance, one of the trends in social-robot design research. We propose a research model covering the entire path from consumers' gaze responses to their perceived beliefs and, further, their attitudes toward social robots. Specifically, the eye-tracking indicators used in this study are fixation, first visit, total viewed stay time, and number of revisits, and the areas of interest (AOIs) are the face, eyes, lips, and full body of a social robot. First, we examine which elements of the social robot's design the consumer's gaze dwells on, and how the gaze on each element affects consumer beliefs; the beliefs considered are the social robot's emotional expression, humanness, and facial prominence. Second, we explore whether consumer attitudes form through two major channels: one path in which the beliefs formed through gaze influence attitude, and another in which the gaze response directly influences attitude. This study makes a theoretical contribution by analyzing the formation of consumer attitudes from multiple angles, linking gaze-tracking responses with consumer perception. It is also expected to make a practical contribution by suggesting specific design insights that can serve as a reference for designing social robots.
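A minimal sketch of how such AOI indicators might be computed from a list of fixations. The `Fixation` structure, rectangular AOI, and metric definitions here are common conventions assumed for illustration, not the study's exact operationalizations:

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    t_start: float  # seconds from trial onset
    t_end: float
    x: float        # screen coordinates
    y: float

def aoi_metrics(fixations, aoi):
    # aoi: (x_min, y_min, x_max, y_max) rectangle on the stimulus.
    x0, y0, x1, y1 = aoi
    inside = [f for f in fixations if x0 <= f.x <= x1 and y0 <= f.y <= y1]
    if not inside:
        return {"fixation_count": 0, "first_visit": None,
                "total_dwell": 0.0, "revisits": 0}
    # Each run of consecutive in-AOI fixations counts as one visit.
    visits, prev_in = 0, False
    for f in fixations:
        now_in = x0 <= f.x <= x1 and y0 <= f.y <= y1
        if now_in and not prev_in:
            visits += 1
        prev_in = now_in
    return {
        "fixation_count": len(inside),
        "first_visit": inside[0].t_start,  # time of first AOI fixation
        "total_dwell": sum(f.t_end - f.t_start for f in inside),
        "revisits": max(visits - 1, 0),    # visits after the first one
    }

# Gaze enters the "face" AOI, leaves, and returns once.
fx = [Fixation(0.0, 0.2, 5, 5), Fixation(0.2, 0.5, 50, 50),
      Fixation(0.5, 0.9, 6, 6), Fixation(0.9, 1.1, 7, 7)]
face_metrics = aoi_metrics(fx, (0, 0, 10, 10))
```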


Sensors ◽  
2020 ◽  
Vol 20 (2) ◽  
pp. 543 ◽  
Author(s):  
Braiden Brousseau ◽  
Jonathan Rose ◽  
Moshe Eizenman

This paper describes a low-cost, robust, and accurate remote eye-tracking system that uses an industrial prototype smartphone with integrated infrared illumination and camera. Numerous studies have demonstrated the beneficial use of eye-tracking in domains such as neurological and neuropsychiatric testing, advertising evaluation, pilot training, and automotive safety. Remote eye-tracking on a smartphone could enable significant growth in the deployment of applications in these domains. Our system uses a 3D gaze-estimation model that enables accurate point-of-gaze (PoG) estimation with free head and device motion. To accurately determine the input eye features (pupil center and corneal reflections), the system uses Convolutional Neural Networks (CNNs) together with a novel center-of-mass output layer. The use of CNNs improves the system's robustness to the significant variability in the appearance of eye images found in handheld eye trackers. The system was tested with 8 subjects with the device free to move in their hands and produced a gaze bias of 0.72°. Our hybrid approach, which uses artificial illumination, a 3D gaze-estimation model, and a CNN feature extractor, achieved an accuracy that is significantly (400%) better than that of current eye-tracking systems on smartphones that use natural illumination and machine-learning techniques to estimate the PoG.
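One common way to realize a differentiable center-of-mass output layer is a soft-argmax over a predicted heatmap: normalize the map into a probability distribution and take the expected coordinate under it. The plain-NumPy sketch below illustrates that general technique, not the paper's exact layer:

```python
import numpy as np

def center_of_mass(heatmap):
    # Softmax-normalize the map, then return the expected (x, y)
    # coordinate under that distribution. Unlike a hard argmax, this
    # is differentiable and yields sub-pixel feature locations.
    h, w = heatmap.shape
    p = np.exp(heatmap - heatmap.max())
    p /= p.sum()
    ys, xs = np.mgrid[0:h, 0:w]
    return float((p * xs).sum()), float((p * ys).sum())

# A sharp peak at column 12, row 7 should map to coordinates near (12, 7).
hm = np.full((24, 32), -10.0)
hm[7, 12] = 10.0
cx, cy = center_of_mass(hm)
```

In a CNN this computation sits after the last convolutional layer, so the pupil-center or corneal-reflection coordinate can be trained end-to-end with a simple coordinate-regression loss.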


2021 ◽  
Vol 2120 (1) ◽  
pp. 012030
Author(s):  
J K Tan ◽  
W J Chew ◽  
S K Phang

Abstract The field of Human-Computer Interaction (HCI) has developed tremendously over the past decade. Smartphones and modern computers, which use touch, voice, and typing as input, are already the norm in society. To further increase the variety of interaction, the human eyes are a good candidate for another form of HCI. The information the human eyes convey is extremely useful; hence, various methods and algorithms for eye-gaze tracking have been implemented in multiple sectors. However, some eye-tracking methods require infrared rays to be projected into the user's eye, which under extreme exposure could potentially cause enzyme denaturation. Therefore, to avoid the potential harm of eye-tracking methods that utilize infrared rays, this paper proposes an image-based eye-tracking system using the Viola-Jones algorithm and the Circular Hough Transform (CHT) algorithm. The proposed method uses visible light instead of infrared rays to control the mouse pointer with the user's eye gaze. This research aims to implement the proposed algorithm so that people with hand disabilities can interact with computers using their eye gaze.


2020 ◽  
Vol 12 (2) ◽  
pp. 43
Author(s):  
Mateusz Pomianek ◽  
Marek Piszczek ◽  
Marcin Maciejewski ◽  
Piotr Krukowski

This paper describes research on the stability of a MEMS mirror for use in eye-tracking systems. MEMS mirrors are the main element in scanning methods, one of the approaches to eye tracking. By changing the mirror pitch, the system can scan the area of the eye with a laser and collect the reflected signal. However, this method assumes that the inclinations are constant in each period; instability in the pitch causes errors. The aim of this work is to examine the error level caused by pitch instability at different operating points.
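To illustrate why pitch instability matters: a mirror-tilt error of Δθ deflects the reflected laser beam by 2Δθ, so at working distance d the laser spot on the eye shifts by roughly d·tan(2Δθ). The numbers below are illustrative, not measurements from the paper:

```python
import math

def beam_offset(pitch_error_deg, distance_mm):
    # Reflection doubles the mirror's angular error; project it onto
    # the target plane at the given working distance.
    return distance_mm * math.tan(2.0 * math.radians(pitch_error_deg))

# Hypothetical 0.05 degree pitch jitter at a 30 mm working distance.
offset_mm = beam_offset(0.05, 30.0)  # spot displacement in mm
```

Because the error scales with distance and doubles at reflection, even sub-tenth-degree jitter can displace the scanned spot by a non-negligible fraction of the pupil diameter.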

