GAZE TRACKING SYSTEM MODEL BASED ON PHYSICAL PARAMETERS

Author(s):  
ARANTXA VILLANUEVA ◽  
RAFAEL CABEZA ◽  
SONIA PORTA

In recent years, research in eye tracking development and applications has attracted much attention, and interacting with a computer using gaze information alone is becoming increasingly feasible. Efforts in eye tracking cover a broad spectrum of fields, with mathematical modeling of the system being an important aspect of this research. Expressions relating the elements and variables of the gaze tracker make it possible to establish geometric relations and to uncover symmetric behaviors of the human eye when looking at a screen. To this end, a deep knowledge of projective geometry as well as of eye physiology and kinematics is essential. This paper presents a model for a bright-pupil tracker fully based on realistic parameters describing the system elements. The resulting model is superior to those obtained with generic linear or quadratic expressions. Moreover, knowledge of the model's symmetry leads to simpler and more effective calibration strategies: only two calibration points are needed to fit the optical axis, and only three to adjust the visual axis. This considerably reduces the calibration time required by systems employing more points, making the model more attractive.

2012 ◽  
Vol 263-266 ◽  
pp. 2399-2402
Author(s):  
Chi Wu Huang ◽  
Zong Sian Jiang ◽  
Wei Fan Kao ◽  
Yen Lin Huang

This paper presents the development of a low-cost eye-tracking system built by modifying a commercial off-the-shelf camera and integrating it with properly tuned open-source drivers and user-defined application programs. The system configuration is proposed, and gaze tracking approximated by least-squares polynomial mapping is described. Comparisons with other low-cost systems as well as a commercial system are provided. Our system achieved the highest image capture rate, 180 frames per second, and the ISO 9241-Part 9 test performance favored our system in terms of response time and correct response rate. Currently, we are developing a gaze-tracking accuracy application. Real-time gaze tracking and head-movement estimation remain issues for future work.
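The least-squares polynomial mapping mentioned above can be sketched as follows. The second-order term set ([1, x, y, xy, x², y²]) is a common choice in the gaze-mapping literature and an assumption here, since the paper does not list its exact terms:

```python
import numpy as np

def fit_polynomial_mapping(pupil_xy, screen_xy):
    """Fit a second-order polynomial mapping from pupil coordinates
    (e.g. pupil-glint vectors) to screen coordinates via least squares."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    # Design matrix with quadratic terms: [1, x, y, xy, x^2, y^2]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs

def map_gaze(pupil_xy, coeffs):
    """Apply the fitted mapping to new pupil coordinates."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    return A @ coeffs
```

In practice the fit is computed once from calibration points (the user looks at known screen targets) and then applied per frame.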


2010 ◽  
Vol 36 (8) ◽  
pp. 1051-1061 ◽  
Author(s):  
Chuang ZHANG ◽  
Jian-Nan CHI ◽  
Zhao-Hui ZHANG ◽  
Zhi-Liang WANG

Vision ◽  
2021 ◽  
Vol 5 (3) ◽  
pp. 39
Author(s):  
Julie Royo ◽  
Fabrice Arcizet ◽  
Patrick Cavanagh ◽  
Pierre Pouget

We introduce a blind spot method to create image changes contingent on eye movements. One challenge of eye movement research is triggering display changes contingent on gaze: the eye-tracking system must capture an image of the eye, detect and track the pupil and corneal reflections to estimate the gaze position, and then transfer these data to the computer that updates the display. All of these steps introduce delays that are often difficult to predict. To avoid these issues, we describe a simple blind spot method that generates gaze-contingent display manipulations without any eye-tracking system or display controls.
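As an illustration of the geometry such a method relies on, the sketch below tests whether a screen point falls inside the blind spot of the right eye. The blind spot centre (~15.5° temporal, ~1.5° below fixation) and radius (~2.5°) are typical textbook values assumed here for illustration, not parameters taken from the paper:

```python
import math

def in_blind_spot(point_xy_cm, fixation_xy_cm, viewing_distance_cm,
                  center_deg=(15.5, -1.5), radius_deg=2.5):
    """Return True if a screen point falls inside the (assumed) blind spot
    of the right eye, modelled as a disc ~15.5 deg temporal and ~1.5 deg
    below fixation with ~2.5 deg radius. Monocular viewing is assumed."""
    dx = point_xy_cm[0] - fixation_xy_cm[0]
    dy = point_xy_cm[1] - fixation_xy_cm[1]
    # Convert screen offsets to visual angles (azimuth, elevation)
    az = math.degrees(math.atan2(dx, viewing_distance_cm))
    el = math.degrees(math.atan2(dy, viewing_distance_cm))
    return math.hypot(az - center_deg[0], el - center_deg[1]) <= radius_deg
```

A stimulus placed so that this test returns True is invisible until an eye movement moves it out of the blind spot, which is what makes the display manipulation gaze-contingent without any tracker.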


2022 ◽  
Vol 132 ◽  
pp. 01017
Author(s):  
Sangjip Ha ◽  
Eun-ju Yi ◽  
In-jin Yoo ◽  
Do-Hyung Park

This study uses eye tracking to examine the appearance of a robot, one of the trends in social robot design research. We propose a research model covering the entire path from consumers' gaze responses to their perceived beliefs and, further, their attitudes toward social robots. Specifically, the eye-tracking indicators used in this study are Fixation, First Visit, Total Viewed Stay Time, and Number of Revisits, and the areas of interest are the face, eyes, lips, and full body of a social robot. First, we examine which elements of the social robot design the consumer's gaze dwells on, and how the gaze on each element affects consumer beliefs, considered here as the social robot's emotional expression, humanness, and facial prominence. Second, we explore whether consumer attitudes form through two major channels: one path in which the beliefs formed through gaze influence attitude, and another in which the gaze response directly influences attitude. This study makes a theoretical contribution by analyzing the path of consumer attitude formation from multiple angles, linking gaze-tracking responses with consumer perception. It is also expected to make a practical contribution by suggesting specific design insights that can serve as a reference for designing social robots.
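Indicators like those named above (Fixation, First Visit, Total Viewed Stay Time, Number of Revisits) can be aggregated per area of interest along these lines; the fixation-tuple format is a hypothetical one chosen for illustration, not the study's actual data format:

```python
from collections import defaultdict

def aoi_metrics(fixations):
    """Aggregate per-AOI gaze indicators from a time-ordered fixation list.

    `fixations` is a list of (aoi_name, start_ms, duration_ms) tuples;
    the AOI names (face, eyes, lips, full-body) follow the study design.
    """
    metrics = defaultdict(lambda: {"fixation_count": 0, "first_visit_ms": None,
                                   "total_dwell_ms": 0, "revisits": 0})
    last_aoi = None
    for aoi, start_ms, duration_ms in fixations:
        m = metrics[aoi]
        m["fixation_count"] += 1
        m["total_dwell_ms"] += duration_ms
        if m["first_visit_ms"] is None:
            m["first_visit_ms"] = start_ms          # first visit to this AOI
        elif aoi != last_aoi:
            m["revisits"] += 1                      # returned after looking elsewhere
        last_aoi = aoi
    return dict(metrics)
```

Consecutive fixations inside the same AOI are counted as one continuous visit, so only a return after looking at a different AOI increments the revisit count.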


Author(s):  
Evelyn P. Rozanski ◽  
Keith S. Karn ◽  
Anne R. Haake ◽  
Anthony M. Vigliotti ◽  
Jeff B. Pelz

Identifying problems and generating recommendations for product user interface redesign are primary goals of usability testing. Typical methods seem inadequate for the deep understanding of usability problems needed for developing effective solutions. Sporadically over the past 50 years, usability teams have tracked user eye movements to achieve this deeper understanding, but high cost and complexity have prevented the widespread use of this technology. We investigated whether simplified eye tracking techniques, in combination with traditional usability testing methods, could enhance problem discovery and understanding. These techniques included: using a video-based eye tracking system, tracking only a few participants, and encoding gaze durations (not individual fixations) on only a few areas of interest. For each of three interface versions, we studied twelve participants with traditional usability testing techniques and eye tracked just two. Eye tracking yielded discovery of additional usability problems and detailed characterizations which led to more focused and appropriate solutions.


2021 ◽  
Vol 2120 (1) ◽  
pp. 012030
Author(s):  
J K Tan ◽  
W J Chew ◽  
S K Phang

Abstract The field of Human-Computer Interaction (HCI) has developed tremendously over the past decade. Smartphones and modern computers, which use touch, voice, and typing as input, are already the norm in society. To further increase the variety of interaction, the human eyes are a good candidate for another form of HCI. The information the human eyes convey is extremely useful; hence, various methods and algorithms for eye gaze tracking have been implemented in multiple sectors. However, some eye-tracking methods require infrared rays to be projected into the user's eye, which could potentially cause enzyme denaturation under extreme exposure. Therefore, to avoid the potential harm of infrared-based methods, this paper proposes an image-based eye-tracking system using the Viola-Jones algorithm and the Circular Hough Transform (CHT) algorithm. The proposed method uses visible light instead of infrared rays to control the mouse pointer with the user's eye gaze. This research aims to implement the proposed algorithm so that people with hand disabilities can interact with computers using their eye gaze.
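A minimal circular Hough transform of the kind this pipeline builds on can be sketched as follows (single fixed radius, pure NumPy). In a real system the eye region would first be localised with a Viola-Jones cascade and an edge map extracted before voting:

```python
import numpy as np

def hough_circle(edges, radius):
    """Minimal circular Hough transform for a single radius: each edge
    pixel votes for all circle centres it could lie on; the accumulator
    peak is the most likely centre (e.g. of the pupil/iris)."""
    h, w = edges.shape
    acc = np.zeros((h, w), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
    ys, xs = np.nonzero(edges)
    for theta in thetas:
        cy = np.rint(ys - radius * np.sin(theta)).astype(int)
        cx = np.rint(xs - radius * np.cos(theta)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)   # unbuffered vote accumulation
    return np.unravel_index(np.argmax(acc), acc.shape)  # (cy, cx) of best centre
```

Production code would typically search over a range of radii (or use an optimised routine such as OpenCV's `cv2.HoughCircles`) rather than a single fixed radius.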


2020 ◽  
Vol 12 (2) ◽  
pp. 43
Author(s):  
Mateusz Pomianek ◽  
Marek Piszczek ◽  
Marcin Maciejewski ◽  
Piotr Krukowski

This paper describes research on the stability of MEMS mirrors for use in eye tracking systems. MEMS mirrors are the main element in scanning methods, one of the approaches to eye tracking. By changing the mirror pitch, the system can scan the area of the eye with a laser and collect the reflected signal. However, this method rests on the assumption that the inclinations are constant in each period; any instability here causes errors. The aim of this work is to examine the error level caused by pitch instability at different operating points.
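The error mechanism studied above can be illustrated with a toy simulation. Gaussian jitter on the mirror pitch is an assumed noise model for illustration, not the authors' measurement setup; the beam deflection is taken as twice the mirror angle, as for a reflected ray:

```python
import numpy as np

def scan_position_error(pitch_deg, jitter_deg, distance_mm, n=10000, seed=0):
    """Estimate the RMS scan-position error on a target plane caused by
    random instability (jitter) of the MEMS mirror pitch angle.
    A reflected beam deviates by twice the mirror tilt angle."""
    rng = np.random.default_rng(seed)
    nominal = distance_mm * np.tan(np.radians(2 * pitch_deg))
    noisy_pitch = pitch_deg + rng.normal(0.0, jitter_deg, n)
    positions = distance_mm * np.tan(np.radians(2 * noisy_pitch))
    return np.sqrt(np.mean((positions - nominal) ** 2))
```

Because the tangent steepens with angle, the same angular jitter produces a larger positional error at larger nominal pitch, which is why the error level depends on the operating point.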


Author(s):  
Sinh Huynh ◽  
Rajesh Krishna Balan ◽  
JeongGil Ko

Gaze tracking is a key building block used in many mobile applications including entertainment, personal productivity, accessibility, medical diagnosis, and visual attention monitoring. In this paper, we present iMon, an appearance-based gaze tracking system that is both designed for use on mobile phones and significantly more accurate than prior state-of-the-art solutions. iMon achieves this by comprehensively considering the gaze estimation pipeline and overcoming three different sources of error. First, instead of assuming that the user's gaze is fixed to a single 2D coordinate, we construct each gaze label as a probabilistic 2D heatmap, overcoming errors caused by microsaccadic eye motions that make the exact gaze point uncertain. Second, we design an image enhancement model to refine visual details and remove motion blur from input eye images. Finally, we apply a calibration scheme to correct for differences between the perceived and actual gaze points caused by individual kappa angle differences. With all these improvements, iMon achieves a person-independent per-frame tracking error of 1.49 cm (on smartphones) and 1.94 cm (on tablets) when tested with the GazeCapture dataset, and 2.01 cm with the TabletGaze dataset, outperforming the previous state-of-the-art solutions by ~22% to 28%. By averaging multiple per-frame estimations that belong to the same fixation point and applying personal calibration, the tracking error is further reduced to 1.11 cm (smartphones) and 1.59 cm (tablets). Finally, we built an implementation that runs on an iPhone 12 Pro at up to 60 frames per second, making gaze-based control of applications possible.
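A probabilistic 2D heatmap gaze label of the kind described above can be sketched as a normalised Gaussian centred on the nominal gaze point; the grid size and spread below are illustrative assumptions, not iMon's actual parameters:

```python
import numpy as np

def gaze_heatmap(gaze_xy, grid_size=(32, 32), sigma=1.5):
    """Encode a gaze point as a 2D Gaussian probability map rather than a
    single coordinate, reflecting the positional uncertainty introduced
    by microsaccades. `gaze_xy` = (x, y) in grid cells; sigma controls
    the assumed spread."""
    h, w = grid_size
    ys, xs = np.mgrid[0:h, 0:w]
    gx, gy = gaze_xy
    heat = np.exp(-((xs - gx) ** 2 + (ys - gy) ** 2) / (2 * sigma ** 2))
    return heat / heat.sum()  # normalise to a probability distribution
```

Training against such soft labels (e.g. with a cross-entropy or KL objective) penalises predictions near the true point less than distant ones, unlike a hard single-coordinate regression target.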

