Ultrasound for Gaze Estimation—A Modeling and Empirical Study

Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4502
Author(s):  
Andre Golard ◽  
Sachin S. Talathi

Most eye tracking methods are light-based. As such, they can suffer from ambient light changes when used outdoors, especially for use cases where eye trackers are embedded in Augmented Reality glasses. It has been recently suggested that ultrasound could provide a low power, fast, light-insensitive alternative to camera-based sensors for eye tracking. Here, we report on our work on modeling ultrasound sensor integration into a glasses form factor AR device to evaluate the feasibility of estimating eye-gaze in various configurations. Next, we designed a benchtop experimental setup to collect empirical data on time of flight and amplitude signals for reflected ultrasound waves for a range of gaze angles of a model eye. We used this data as input for a low-complexity gradient-boosted tree machine learning regression model and demonstrate that we can effectively estimate gaze (gaze RMSE error of 0.965 ± 0.178 degrees with an adjusted R2 score of 90.2 ± 4.6).
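The regression stage described above (time-of-flight and amplitude features fed to a low-complexity gradient-boosted tree model) can be sketched as follows. The data, feature relationship, and hyperparameters below are illustrative assumptions, not the authors' benchtop dataset or configuration:

```python
# Sketch: gradient-boosted regression from ultrasound echo features to gaze angle.
# The synthetic feature-to-gaze relationship is an assumption for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
tof = rng.uniform(20e-6, 40e-6, n)   # time-of-flight of the echo (s), synthetic
amp = rng.uniform(0.1, 1.0, n)       # reflected-wave amplitude (a.u.), synthetic

# Toy assumption: gaze angle varies smoothly with both signals, plus noise.
gaze_deg = 30.0 * (tof - 30e-6) / 10e-6 + 5.0 * amp + rng.normal(0.0, 0.3, n)

X = np.column_stack([tof, amp])
X_tr, X_te, y_tr, y_te = train_test_split(X, gaze_deg, random_state=0)

model = GradientBoostingRegressor(n_estimators=200, max_depth=3, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
rmse = mean_squared_error(y_te, pred) ** 0.5   # gaze RMSE in degrees
r2 = r2_score(y_te, pred)
```

A shallow boosted ensemble like this keeps inference cheap, which matters for the low-power AR-glasses use case the abstract motivates.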

Author(s):  
I Ketut Gede Darma Putra ◽  
Agung Cahyawan ◽  
Yandi Perdana

2018 ◽  
Vol 14 (2) ◽  
pp. 153-173 ◽  
Author(s):  
Jumana Waleed ◽  
Taha Mohammed Hasan ◽  
Qutaiba Kadhim Abed

2015 ◽  
Vol 1 (6) ◽  
pp. 276
Author(s):  
Maria Rashid ◽  
Wardah Mehmood ◽  
Aliya Ashraf

Eye movement tracking is a method now used to identify usability problems in Human-Computer Interaction (HCI). We first present eye-tracking technology and its key elements, then evaluate user behavior when interacting through an eye-gaze interface. We survey the techniques in use, including electro-oculography, infrared oculography, video oculography, image-processing techniques, scrolling techniques, and different models, as well as the main approaches: shape-based methods, appearance-based methods, 2D and 3D model-based approaches, and software algorithms for pupil detection. We compare the surveyed methods on their geometric properties and reported accuracies, and conclude the study with some predictions about the future of eye-gaze interaction. We highlight techniques that use various eye properties, comprising nature, appearance, and gesture, or some combination of these, for eye tracking and detection. Results show that eye-gaze selection is a faster and better approach than mouse selection. Error rates across all subjects show no errors when selecting from main menus with either eye gaze or mouse, but a chance of errors when selecting from submenus with eye gaze; the head must therefore be kept steady in front of the eye-gaze monitor.


Author(s):  
Federico Cassioli ◽  
Laura Angioletti ◽  
Michela Balconi

Human–computer interaction (HCI) is particularly interesting because full-immersive technology may be approached differently by users, depending on the complexity of the interaction, users' personality traits, and the inclination of their motivational systems. This study therefore investigated the relationship between psychological factors and attention towards specific tech-interactions in a smart home system (SHS). The relation between personal psychological traits and eye-tracking metrics was investigated through self-report measures [locus of control (LoC), user experience (UX), behavioral inhibition system (BIS), and behavioral activation system (BAS)] and a wearable, wireless, near-infrared-illumination-based eye-tracking system applied to an Italian sample (n = 19). Participants were asked to activate and interact with five tech-interaction areas of differing complexity (entrance, kitchen, living room, bathroom, and bedroom) in the SHS while their eye-gaze behavior was recorded. Data showed significant differences between a simpler interaction (entrance) and a more complex one (living room) in terms of the number of fixations. Moreover, a slower time to first fixation was found in a multifaceted interaction (bathroom) compared to simpler ones (kitchen and living room). Additionally, in two interaction conditions (living room and bathroom), negative correlations were found between external LoC and fixation count, and between BAS reward responsiveness scores and fixation duration. The findings identify a two-way process in which both the complexity of the tech-interaction and subjects' personality traits substantially affect the user's visual exploration behavior. This research contributes to understanding user responsiveness, adding first insights that may help create more human-centered technology.
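The trait–metric correlation analysis reported above can be illustrated with a minimal sketch. The data below are synthetic and the effect size is an assumption for illustration, not the study's result:

```python
# Sketch: Pearson correlation between an external locus-of-control score and
# per-participant fixation count. All values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_participants = 19  # sample size matching the study

loc_external = rng.normal(50.0, 10.0, n_participants)        # trait scores, synthetic
# Assume fixation count decreases with external LoC (illustrative only).
fixation_count = 120.0 - 1.5 * loc_external + rng.normal(0.0, 8.0, n_participants)

# Pearson r between the trait score and the eye-tracking metric.
r = np.corrcoef(loc_external, fixation_count)[0, 1]
```

With real data, each row would be one participant's trait score paired with their fixation count in a given interaction condition (e.g. living room).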


Vision ◽  
2018 ◽  
Vol 2 (3) ◽  
pp. 35 ◽  
Author(s):  
Braiden Brousseau ◽  
Jonathan Rose ◽  
Moshe Eizenman

The most accurate remote Point of Gaze (PoG) estimation methods that allow free head movements use infrared light sources and cameras together with gaze estimation models. Current gaze estimation models were developed for desktop eye-tracking systems and assume that the relative roll between the system and the subject's eyes (the 'R-Roll') is roughly constant during use. This assumption does not hold for hand-held mobile-device-based eye-tracking systems. We present an analysis showing that the accuracy of estimating the PoG on the screens of hand-held mobile devices depends on the magnitude of the R-Roll angle and on the angular offset between the visual and optical axes of the individual viewer. We also describe a new method to determine the PoG that compensates for the effects of R-Roll on PoG accuracy. Experimental results on a prototype infrared smartphone show that for an R-Roll angle of 90°, the new method achieves an accuracy of approximately 1°, while a gaze estimation method that assumes a constant R-Roll angle achieves an accuracy of 3.5°. The manner in which the experimental PoG estimation errors increased with the R-Roll angle was consistent with the analysis. The method presented in this paper can significantly improve the performance of eye-tracking systems on hand-held mobile devices.
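The core of roll compensation can be sketched as a 2-D rotation of the on-screen gaze estimate back by the measured R-Roll angle. This plain rotation is an illustrative assumption: the paper's full method also accounts for the viewer-specific offset between the visual and optical axes, which a single rotation does not capture:

```python
# Sketch: undoing the effect of relative roll (R-Roll) on an on-screen gaze
# estimate via a 2-D rotation. Illustrative only; not the paper's full model.
import numpy as np

def compensate_roll(gaze_xy, r_roll_deg):
    """Rotate an on-screen gaze estimate by the measured R-Roll angle."""
    t = np.deg2rad(r_roll_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    return rot @ np.asarray(gaze_xy, dtype=float)

# A gaze estimate at (1, 0) under a 90-degree roll maps back to (0, 1).
corrected = compensate_roll([1.0, 0.0], 90.0)
```

A model that ignores R-Roll effectively applies this correction with a fixed angle, which is why its error grows as the true roll departs from the calibration pose.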

