Writing with Your Eye: A Dwell Time Free Writing System Adapted to the Nature of Human Eye Gaze

Author(s):  
Nikolaus Bee ◽  
Elisabeth André
2021 ◽  
Author(s):  
Fumihiro Kano ◽  
Takeshi Furuichi ◽  
Chie Hashimoto ◽  
Christopher Krupenye ◽  
Jesse G Leinwand ◽  
...  

The gaze-signaling hypothesis and the related cooperative-eye hypothesis posit that humans have evolved special external eye morphology, including exposed white sclera (the white of the eye), to enhance the visibility of eye-gaze direction and thereby facilitate conspecific communication through joint-attentional interaction and ostensive communication. However, recent quantitative studies have questioned these hypotheses, finding that humans are not necessarily unique in certain eye features compared with other great ape species. There is therefore an ongoing debate about whether the external eye features of humans are distinct from those of other apes, and how any such distinctive features contribute to the visibility of eye-gaze direction. This study leveraged updated image-analysis techniques to test the uniqueness of human eye features in facial images of great apes. Although many eye features were similar between humans and other species, a key difference was that humans have uniformly white sclera, which creates clear visibility of both the eye outline and the iris; these are the two essential features contributing to the visibility of eye-gaze direction. We then tested the robustness of these features against visual noise such as darkening and distancing, and found that both remain detectable in the human eye, whereas the eye outline becomes barely detectable in other species under these visually challenging conditions. Overall, we found that humans have distinctive external eye morphology among the great apes, which ensures the robustness of the eye-gaze signal across various visual conditions. Our results support, and also critically update, the central premises of the gaze-signaling hypothesis.
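
The study's actual image-analysis pipeline is not reproduced here. As a minimal sketch of the kind of robustness test the abstract describes, one could assume an eye-region photo with hand-labelled sclera and iris masks, simulate darkening and distancing as simple image degradations, and measure how sclera-iris contrast survives:

```python
# Minimal sketch only: the study's real pipeline is not described here. We
# assume an eye-region photo with hand-labelled boolean sclera/iris masks and
# ask how sclera-iris contrast survives the two degradations the abstract
# names: darkening and distancing (simulated as a downscale/upscale cycle).
import numpy as np
from PIL import Image

def degrade(img: Image.Image, darken: float, scale: float) -> Image.Image:
    """Darken by a factor in (0, 1] and lose fine detail as if viewed from
    farther away, by downscaling then restoring the original size."""
    arr = (np.asarray(img, dtype=float) * darken).astype(np.uint8)
    small = Image.fromarray(arr).resize(
        (max(1, int(img.width * scale)), max(1, int(img.height * scale))))
    return small.resize(img.size)

def region_luminance(img: Image.Image, mask: np.ndarray) -> float:
    """Mean grayscale luminance over a boolean region mask."""
    return float(np.asarray(img.convert("L"), dtype=float)[mask].mean())

def sclera_iris_contrast(img, sclera_mask, iris_mask) -> float:
    """Michelson contrast between sclera and iris; higher values mean the
    iris stays easy to pick out against the sclera."""
    ls = region_luminance(img, sclera_mask)
    li = region_luminance(img, iris_mask)
    return (ls - li) / (ls + li + 1e-9)

# Hypothetical usage: contrast before vs. after a dark, distant view.
# eye = Image.open("eye_region.png"); sclera, iris = load_masks(...)
# print(sclera_iris_contrast(eye, sclera, iris))
# print(sclera_iris_contrast(degrade(eye, darken=0.4, scale=0.25), sclera, iris))
```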


2021 ◽  
Vol 29 (1) ◽  
pp. 191-200
Author(s):  
Yuan JI ◽  
Yuan-sheng CHEN ◽  
Yuan-sheng SONG ◽  
Wen-dong CHEN ◽  
...  
Keyword(s):  
Eye Gaze ◽  

2019 ◽  
Vol 38 ◽  
pp. 100671 ◽  
Author(s):  
Jialiang Guo ◽  
Xiangsheng Luo ◽  
Encong Wang ◽  
Bingkun Li ◽  
Qinyuan Chang ◽  
...  

Author(s):  
Alexander L. Anwyl-Irvine ◽  
Thomas Armstrong ◽  
Edwin S. Dalmaijer

Psychological research is increasingly moving online, where web-based studies allow for data collection at scale. Behavioural researchers are well supported by existing tools for participant recruitment and for building and running experiments with decent timing. However, not all techniques are portable to the Internet: while eye tracking works in tightly controlled lab conditions, webcam-based eye tracking suffers from high attrition and poorer quality due to basic limitations such as webcam availability, poor image quality, and reflections on glasses and the cornea. Here we present MouseView.js, an alternative to eye tracking that can be employed in web-based research. Inspired by the visual system, MouseView.js blurs the display to mimic peripheral vision, but allows participants to move a sharp aperture that is roughly the size of the fovea. Like eye gaze, the aperture can be directed to fixate on stimuli of interest. We validated MouseView.js in an online replication (N = 165) of an established free-viewing task (N = 83 existing eye-tracking datasets), and in an in-lab direct comparison with eye tracking in the same participants (N = 50). MouseView.js proved as reliable as gaze, and produced the same pattern of dwell-time results. In addition, dwell-time differences from MouseView.js and from eye tracking correlated highly, and related to self-report measures in similar ways. The tool is open source, implemented in JavaScript, and usable as a standalone library or within Gorilla, jsPsych, and PsychoJS. In sum, MouseView.js is a freely available instrument for attention tracking that is both reliable and valid, and that can replace eye tracking in certain web-based psychological experiments.
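
MouseView.js itself is a JavaScript library, and its real API is not reproduced here. The following is a concept-only Python sketch of the technique the abstract describes, a pre-blurred display with a sharp, mouse-locked aperture; it assumes pygame and Pillow are available, and "stimulus.png" is a hypothetical file name:

```python
# Concept sketch of the MouseView.js idea, not its actual API: show a blurred
# copy of a stimulus everywhere except a sharp, fovea-sized aperture that
# follows the mouse. Assumes pygame and Pillow; "stimulus.png" is hypothetical.
import pygame
from PIL import Image, ImageFilter

APERTURE = 60  # aperture half-size in pixels; a stand-in for foveal extent

pil_img = Image.open("stimulus.png").convert("RGB")
blurred_pil = pil_img.filter(ImageFilter.GaussianBlur(radius=12))

pygame.init()
screen = pygame.display.set_mode(pil_img.size)
sharp = pygame.image.fromstring(pil_img.tobytes(), pil_img.size, "RGB")
blurred = pygame.image.fromstring(blurred_pil.tobytes(), pil_img.size, "RGB")

clock = pygame.time.Clock()
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    mx, my = pygame.mouse.get_pos()
    screen.blit(blurred, (0, 0))  # periphery: blurred everywhere
    # Sharp square aperture around the cursor (a circular aperture would
    # need a mask surface; a square keeps the sketch short).
    rect = pygame.Rect(mx - APERTURE, my - APERTURE, 2 * APERTURE, 2 * APERTURE)
    rect = rect.clip(screen.get_rect())
    screen.blit(sharp, rect.topleft, area=rect)
    pygame.display.flip()
    clock.tick(60)  # dwell time per region could be logged from (mx, my)
pygame.quit()
```

Logging the aperture position each frame would yield the dwell-time measure the abstract validates against eye tracking.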


2021 ◽  
Vol 14 (2) ◽  
Author(s):  
Xin Liu ◽  
Bin Zheng ◽  
Xiaoqin Duan ◽  
Wenjing He ◽  
Yuandong Li ◽  
...  

Eye tracking can help decode the intricate control mechanisms underlying human performance. In healthcare, physicians in training require extensive practice to improve their skills, and when a trainee encounters difficulty, they need expert feedback to improve their performance. Such personal feedback is time-consuming and subject to bias. In this study, we tracked the eye movements of trainees during simulated colonoscopy. We applied deep-learning algorithms to detect eye-tracking metrics at the moments of navigation lost (MNL), a signature sign of performance difficulty during colonoscopy. Basic human eye-gaze and pupil characteristics were learned and verified by deep convolutional generative adversarial networks (DCGANs); the generated data were fed to Long Short-Term Memory (LSTM) networks with three different data-feeding strategies to classify MNLs within the entire colonoscopic procedure. Outputs from deep learning were compared with an expert's judgment of the MNLs based on the colonoscopic videos. The best classification outcome was achieved when the human eye data were supplemented with 1,000 synthesized eye-data samples, optimizing accuracy (90%), sensitivity (90%), and specificity (88%). This study builds an important foundation for our work on developing a self-adaptive education system for training healthcare skills using simulation.
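
As a rough illustration of the classification stage only, here is a minimal PyTorch sketch of an LSTM that labels windows of gaze and pupil features as MNL or not. The feature set, window length, and layer sizes are assumptions, and the DCGAN stage that synthesizes the extra training windows is omitted:

```python
# Minimal sketch of the abstract's classification stage, under assumed
# details: an LSTM labels fixed-length windows of gaze/pupil features as MNL
# or normal navigation. Feature layout and sizes here are illustrative.
import torch
import torch.nn as nn

class MNLClassifier(nn.Module):
    def __init__(self, n_features=4, hidden=64):
        # n_features: e.g. gaze x, gaze y, pupil diameter, fixation flag
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # binary: MNL vs. normal navigation

    def forward(self, x):            # x: (batch, time_steps, n_features)
        _, (h_n, _) = self.lstm(x)   # final hidden state summarizes the window
        return self.head(h_n[-1])    # logit; apply sigmoid for a probability

model = MNLClassifier()
# Real windows would be concatenated with DCGAN-synthesized ones here,
# mirroring the paper's strategy of padding scarce human data.
windows = torch.randn(8, 120, 4)     # 120 samples ≈ a few seconds of gaze
probs = torch.sigmoid(model(windows))  # P(window contains an MNL)
```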


2019 ◽  
pp. 177
Author(s):  
Hanaa Mohsin Ahmed ◽  
Salma Hameedi Abdullah

2021 ◽  
pp. 1-20
Author(s):  
Jeevithashree DV ◽  
Puneet Jain ◽  
Abhishek Mukhopadhyay ◽  
Kamal Preet Singh Saluja ◽  
Pradipta Biswas

BACKGROUND: Users with Severe Speech and Motor Impairment (SSMI) often communicate through a communication chart, using eye gaze or limited hand movement, while caretakers interpret their communicative intent. Significant research has already been conducted on automating this communication electronically. Developing electronic user interfaces and interaction techniques for users with SSMI poses significant challenges: research on their ocular parameters has found that such users suffer from nystagmus and strabismus, which limits the number of elements usable on a computer screen. This paper presents an optimized eye-gaze-controlled virtual keyboard for English with an adaptive dwell-time feature for users with SSMI. OBJECTIVE: To present an optimized eye-gaze-controlled English virtual keyboard that follows both a static and a dynamic adaptation process. The virtual keyboard automatically adapts to reduce eye-gaze movement distance and the dwell time for selection, helping users with SSMI type better without the intervention of an assistant. METHODS: Before designing the virtual keyboard, we undertook a pilot study to identify the screen region most comfortable for SSMI users to operate. We then proposed an optimized two-level English virtual keyboard layout, derived through a genetic algorithm as the static adaptation process, followed by a dynamic adaptation process that tracks users' interaction and reduces dwell time with a Markov-model-based algorithm, as sketched below. Finally, we integrated the virtual keyboard into a web-based interactive dashboard that visualizes real-time Covid data. RESULTS: With the proposed layout, the average task completion time for users with SSMI was 39.44 seconds in the adaptive condition and 29.52 seconds in the non-adaptive condition. Overall typing speed was 16.9 letters per minute (lpm) for able-bodied users and 6.6 lpm for users with SSMI, without any word completion or prediction features. A case study with an elderly participant with SSMI found a typing speed of 2.70 words per minute (wpm) and 14.88 lpm after 6 months of practice. CONCLUSIONS: With the proposed English virtual keyboard layout, the adaptive system increased typing speed statistically significantly for able-bodied users compared with the non-adaptive version, while for 6 users with SSMI, task completion time was 8.8% lower in the adaptive version than in the non-adaptive one. Additionally, the proposed layout was successfully integrated into a web-based interactive visualization dashboard, making it accessible to users with SSMI.
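
The abstract does not spell out how the Markov model shortens dwell time; the following Python sketch is one plausible reading, in which a first-order model of key-to-key transitions lowers the dwell time of keys that are likely to follow the previous selection. All timings and thresholds are hypothetical:

```python
# Illustrative reading of the dynamic adaptation idea, not the paper's exact
# algorithm: a first-order Markov model of key-to-key transitions lowers the
# dwell time of keys likely to follow the previously selected key.
from collections import defaultdict

BASE_DWELL_MS = 1000   # assumed default dwell time for selection
MIN_DWELL_MS = 400     # assumed floor so selection never becomes accidental

class DwellAdapter:
    def __init__(self):
        # transition_counts[prev][nxt] = how often nxt followed prev
        self.transition_counts = defaultdict(lambda: defaultdict(int))
        self.prev_key = None

    def record_selection(self, key: str) -> None:
        """Update the transition model after each confirmed selection."""
        if self.prev_key is not None:
            self.transition_counts[self.prev_key][key] += 1
        self.prev_key = key

    def dwell_time(self, key: str) -> float:
        """Dwell time for `key`, shortened in proportion to its estimated
        transition probability from the previously selected key."""
        counts = self.transition_counts[self.prev_key]
        total = sum(counts.values())
        if total == 0:
            return BASE_DWELL_MS
        p = counts[key] / total
        return max(MIN_DWELL_MS, BASE_DWELL_MS * (1 - p))

adapter = DwellAdapter()
for k in "the_the_the":          # "_" stands in for the space key
    adapter.record_selection(k)
print(adapter.dwell_time("_"))   # 400.0: "_" always followed "e" so far
```

The same interaction log could also feed the static, genetic-algorithm stage, which rearranges the layout to shorten gaze travel between frequently consecutive keys.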

