Gaze-Based Assistive Technologies

2017 ◽  
pp. 44-66 ◽  
Author(s):  
Thies Pfeiffer

The eyes play an important role in both perception and communication. Technical interfaces that make use of their versatility can bring significant improvements to those who are unable to speak or to handle selection tasks by other means, such as with their hands, feet, noses, or tools held in the mouth. Using the eyes to enter text into a computer system, known as gaze-typing, is the most prominent gaze-based assistive technology. The article reviews the principles of eye movements, presents an overview of current eye-tracking systems, and discusses several approaches to gaze-typing. With the recent advent of mobile eye-tracking systems, gaze-based assistive technology is no longer restricted to interactions with desktop computers and is ready to expand into other areas of everyday life. The second part of the article thus discusses the use of gaze-based assistive technology in the household, or "in the wild," outside one's own four walls.


Author(s):  
Amelia McCurley ◽  
Dan Nathan-Roberts

Eye-tracking technologies for computer navigation assist people with mobility and verbal impairments in interacting with the world around them. Studies have found that these technologies have a profound positive impact on the lives of these individuals and of those who care for them. This proceeding explores popular features of eye-tracking systems and the challenges they present. Specifically, it evaluates camera configurations, camera types, and command selection through the lenses of cost, ease of use, and comfort. Current and future design guidelines based on these criteria are recommended to best aid individuals with these impairments.


2019 ◽  
Vol 12 (7) ◽  
Author(s):  
Ignace T.C. Hooge ◽  
Roy S. Hessels ◽  
Diederick C. Niehorster ◽  
Gabriel J. Diaz ◽  
Andrew T. Duchowski ◽  
...  

Video stream: https://vimeo.com/357473408 Wearable mobile eye trackers have great potential as they allow the measurement of eye movements during daily activities such as driving, navigating the world and doing groceries. Although mobile eye trackers have been around for some time, developing and operating these eye trackers was generally a highly technical affair. As such, mobile eye-tracking research was not feasible for most labs. Nowadays, many mobile eye trackers are available from eye-tracking manufacturers (e.g. Tobii, Pupil Labs, SMI, Ergoneers) and various implementations in virtual/augmented reality have recently been released. The wide availability has caused the number of publications using a mobile eye tracker to increase quickly. Mobile eye tracking is now applied in vision science, educational science, developmental psychology, marketing research (using virtual and real supermarkets), clinical psychology, usability, architecture, medicine, and more. Yet, transitioning from lab-based studies, where eye trackers are fixed to the world, to studies where eye trackers are fixed to the head presents researchers with a number of problems. These problems range from how the conceptual frameworks used in world-fixed and head-fixed eye tracking relate to each other, to the lack of data-quality comparisons and field tests of the different mobile eye trackers, and how the gaze signal can be classified or mapped to the visual stimulus. Such problems need to be addressed in order to understand how world-fixed and head-fixed eye-tracking research can be compared and to understand the full potential and limits of what mobile eye tracking can deliver.
In this symposium, we bring together researchers from five institutions (Lund University, Utrecht University, Clemson University, Birkbeck University of London and Rochester Institute of Technology) addressing problems and innovative solutions across the entire breadth of mobile eye-tracking research. Hooge, presenting Hessels et al.'s paper, focuses on the definitions of fixations and saccades held by researchers in the eye-movement field and argues that they need to be clarified in order to allow comparisons between world-fixed and head-fixed eye-tracking research. Diaz et al. introduce machine-learning techniques for classifying the gaze signal in mobile eye-tracking contexts where head and body are unrestrained. Niehorster et al. compare data quality of mobile eye trackers during natural behavior and discuss the application range of these eye trackers. Duchowski et al. introduce a method for automatically mapping gaze to faces using computer vision techniques. Pelz et al. employ state-of-the-art techniques to map fixations to objects of interest in the scene video and align grasp and eye-movement data in the same reference frame to investigate the guidance of eye movements during manual interaction.


2019 ◽  
Author(s):  
Louisa Kulke

Emotional faces draw attention and eye movements towards them. However, the neural mechanisms of attention have mainly been investigated during fixation, which is uncommon in everyday life, where people move their eyes to shift attention to faces. Therefore, the current study combined eye tracking and electroencephalography (EEG) to measure the neural mechanisms of overt attention shifts to faces with happy, neutral and angry expressions, allowing participants to move their eyes freely towards the stimuli. Saccade latencies towards peripheral faces did not differ depending on expression, and early neural response (P1) amplitudes and latencies were unaffected. However, the later-occurring Early Posterior Negativity (EPN) was significantly larger for emotional than for neutral faces. This response occurs after saccades towards the faces; emotion modulations therefore only occurred after an overt shift of gaze towards the stimulus had already been completed. Visual saliency rather than emotional content may drive early saccades, while later top-down processes reflect emotion processing.
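The analytic core of such a combined eye-tracking/EEG design is epoching the EEG relative to eye-movement events rather than stimulus onset. A minimal sketch of saccade-locked averaging, assuming a hypothetical 500 Hz recording and a `saccade_locked_average` helper (this is not the study's actual pipeline):

```python
import numpy as np

FS = 500  # assumed EEG sampling rate in Hz, for illustration

def saccade_locked_average(eeg, saccade_offsets_s, pre_s=0.1, post_s=0.4):
    """Average EEG epochs time-locked to saccade offsets (hypothetical helper).

    eeg: 1-D single-channel signal; saccade_offsets_s: event times (seconds)
    taken from the synchronized eye-tracker signal.
    """
    pre, post = int(pre_s * FS), int(post_s * FS)
    epochs = []
    for t in saccade_offsets_s:
        i = int(round(t * FS))
        if i - pre >= 0 and i + post <= len(eeg):
            epoch = eeg[i - pre:i + post]
            # Baseline-correct using the pre-event interval.
            epochs.append(epoch - epoch[:pre].mean())
    return np.mean(epochs, axis=0)

# Synthetic demo: a deflection 200 ms after each "saccade offset".
t = np.arange(0, 10, 1 / FS)
eeg = np.zeros_like(t)
events = [1.0, 3.0, 5.0]
for ev in events:
    eeg[int((ev + 0.2) * FS)] = 1.0
erp = saccade_locked_average(eeg, events)
peak_ms = (np.argmax(erp) - int(0.1 * FS)) / FS * 1000
# peak_ms == 200.0
```

In the study's terms, a component like the EPN would appear in such a saccade-locked average after gaze has already landed on the face, which is exactly why the emotion effect can be attributed to post-saccadic processing.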


Author(s):  
Ellen Lirani-Silva ◽  
Samuel Stuart ◽  
Lucy Parrington ◽  
Kody Campbell ◽  
Laurie King

Background: Clinical and laboratory assessments of people with mild traumatic brain injury (mTBI) indicate impairments in eye movements. These tests are typically done in a static, seated position. Recently, the use of mobile eye-tracking systems has been proposed to quantify subtle deficits in eye movements and visual sampling during different tasks. However, the impact of mTBI on eye movements during functional tasks such as walking remains unknown. Objective: To evaluate differences in eye-tracking measures collected during gait between healthy controls (HC) and patients in the sub-acute stages of mTBI recovery, and to determine whether there are associations between eye-tracking measures and gait speed. Methods: Thirty-seven HC participants and 67 individuals with mTBI were instructed to walk back and forth over 10 m at a comfortable self-selected speed. A single 1-min trial was performed. Eye-tracking measures were recorded using a mobile eye-tracking system (head-mounted infra-red Tobii Pro Glasses 2, 100 Hz, Tobii Technology Inc., VA, United States). Eye-tracking measures included saccadic measurements (frequency, mean and peak velocity, duration and distance) and fixation measurements (frequency and duration). Gait was assessed using six inertial sensors (both feet, sternum, right wrist, lumbar vertebrae and the forehead), and gait velocity was selected as the primary outcome. A general linear model was used to compare the groups, and associations between gait and eye-tracking outcomes were explored using partial correlations. Results: Individuals with mTBI showed significantly reduced saccade frequency (p = 0.016), duration (p = 0.028) and peak velocity (p = 0.032) compared to the HC group. No significant differences between groups were observed for saccade distance, fixation measures or gait velocity (p > 0.05). A positive correlation was observed between saccade duration and gait velocity only for participants with mTBI (p = 0.025). Conclusion: The findings suggest impaired saccadic eye movements, but not fixations, during walking in individuals with mTBI. These findings have implications for real-world function, including return to sport for athletes and return to duty for military service members. Future research should investigate whether saccade outcomes are influenced by time since the trauma and by rehabilitation.
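Saccade measures such as frequency, duration and peak velocity are typically derived from the gaze signal with a velocity threshold (the I-VT approach). A simplified sketch, assuming a 100 Hz signal like that of the Tobii Pro Glasses 2 and a conventional 30 deg/s threshold; the study's actual detection parameters are not reproduced here:

```python
import numpy as np

FS = 100            # sampling rate in Hz (Tobii Pro Glasses 2 records at 100 Hz)
VEL_THRESHOLD = 30.0  # deg/s; a commonly used I-VT saccade threshold (assumption)

def saccade_measures(gaze_deg):
    """Detect saccades with a velocity threshold and summarize them.

    gaze_deg: 1-D gaze angle in degrees.
    Returns (saccade count, peak velocities in deg/s, durations in seconds).
    """
    vel = np.abs(np.diff(gaze_deg)) * FS          # sample-to-sample velocity
    is_sacc = vel > VEL_THRESHOLD
    # Contiguous runs of supra-threshold samples form individual saccades.
    edges = np.diff(is_sacc.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if is_sacc[0]:
        starts = np.r_[0, starts]
    if is_sacc[-1]:
        ends = np.r_[ends, len(is_sacc)]
    peaks = [vel[s:e].max() for s, e in zip(starts, ends)]
    durations = [(e - s) / FS for s, e in zip(starts, ends)]
    return len(starts), peaks, durations

# Synthetic trace: two fixations joined by one fast 5-degree gaze shift.
gaze = np.concatenate([np.zeros(50), np.linspace(0, 5, 5), np.full(50, 5.0)])
n, peaks, durs = saccade_measures(gaze)
# n == 1 (one saccade detected)
```

Saccade frequency then follows as count divided by trial duration; a group difference in these summary measures is what the general linear model in the study compares.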


2018 ◽  
Vol 5 (8) ◽  
pp. 180502 ◽  
Author(s):  
Roy S. Hessels ◽  
Diederick C. Niehorster ◽  
Marcus Nyström ◽  
Richard Andersson ◽  
Ignace T. C. Hooge

Eye movements have been extensively studied in a wide range of research fields. While new methods such as mobile eye tracking and eye tracking in virtual/augmented realities are emerging quickly, eye-movement terminology has scarcely been revised. We assert that this may cause confusion about two of the main concepts: fixations and saccades. In this study, we assessed the definitions of fixations and saccades held in the eye-movement field by surveying 124 eye-movement researchers. These researchers held a variety of definitions of fixations and saccades, whose breadth seems even wider than what is reported in the literature. Moreover, these definitions did not seem to be related to researcher background or experience. We urge researchers to make their definitions more explicit by specifying all the relevant components of the eye movement under investigation: (i) the oculomotor component, e.g. whether the eye moves slowly or quickly; (ii) the functional component: what purpose the eye movement (or lack thereof) serves; (iii) the coordinate system used: relative to what the eye moves; (iv) the computational definition: how the event is represented in the eye-tracker signal. This should enable eye-movement researchers from different fields to discuss their work without misunderstandings.
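As an example of what an explicit computational definition can look like, the classic dispersion-threshold (I-DT) algorithm defines a fixation as gaze confined within a small spatial window for a minimum duration. The sketch below is one such definition made concrete; the parameter values are illustrative assumptions, not recommendations from the survey:

```python
import numpy as np

def fixations_idt(x, y, fs, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    Computational definition made explicit: a fixation is a run of samples
    whose x- plus y-range stays within max_dispersion degrees for at least
    min_duration seconds. Returns (start, end) sample indices per fixation.
    """
    min_len = int(min_duration * fs)
    fixations, i, n = [], 0, len(x)

    def disp(a, b):
        return (x[a:b].max() - x[a:b].min()) + (y[a:b].max() - y[a:b].min())

    while i + min_len <= n:
        j = i + min_len
        if disp(i, j) <= max_dispersion:
            # Grow the window while dispersion stays below threshold.
            while j < n and disp(i, j + 1) <= max_dispersion:
                j += 1
            fixations.append((i, j))
            i = j
        else:
            i += 1
    return fixations

# Two stable gaze periods separated by a large shift (100 Hz signal):
x = np.r_[np.zeros(30), np.full(5, 10.0), np.full(30, 20.0)]
y = np.zeros_like(x)
fix = fixations_idt(x, y, fs=100)
# fix == [(0, 30), (35, 65)]
```

Note how every choice the abstract asks researchers to state is visible here: the coordinate system (whatever x and y are expressed in), the thresholds, and how the event is represented in the eye-tracker signal.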


2016 ◽  
Vol 9 (6) ◽  
Author(s):  
Flora Ioannidou ◽  
Frouke Hermens ◽  
Timothy L Hodgson

Eye-tracking studies have suggested that, when viewing images centrally presented on a computer screen, observers tend to fixate the middle of the image. This so-called "central bias" was later also observed in mobile eye tracking during outdoor navigation, where observers were found to fixate the middle of the head-centered video image. It is unclear, however, whether the extension of the central bias to mobile eye tracking in outdoor navigation may have been due to the relatively long viewing distances towards objects in this task and the constant turning of the body in the direction of motion, both of which may have reduced the need for large-amplitude eye movements. To examine whether the central bias in day-to-day viewing is related to the viewing distances involved, here we compare eye movements in three tasks (indoor navigation, tea making, and card sorting), each associated with interactions with objects at different viewing distances. Analysis of gaze positions showed a central bias for all three tasks that was independent of the task performed. These results confirm earlier observations of the central bias in mobile eye-tracking data and suggest that differences in typical viewing distance across tasks have little effect on the bias. The results could have interesting technological applications in which the bias is used to estimate the direction of gaze from head-centered video images, such as those obtained from wearable technology.
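The central bias itself is straightforward to quantify from head-centered gaze coordinates, for instance as the mean gaze offset from the frame centre normalized by the half-diagonal. A sketch under assumed frame dimensions and synthetic gaze data (the study's own analysis is not reproduced here):

```python
import numpy as np

def central_bias(gaze_xy, frame_w, frame_h):
    """Quantify central bias as the mean gaze offset from the frame centre,
    normalized by the half-diagonal (0 = all gaze exactly at the centre,
    values near 1 = gaze scattered out to the corners).
    """
    center = np.array([frame_w / 2, frame_h / 2])
    half_diag = np.linalg.norm(center)
    offsets = np.linalg.norm(gaze_xy - center, axis=1)
    return offsets.mean() / half_diag

# Synthetic gaze clustered near the centre of a 1920x1080 scene video:
rng = np.random.default_rng(0)
gaze = rng.normal([960, 540], scale=[60, 40], size=(1000, 2))
bias = central_bias(gaze, 1920, 1080)
# bias is close to 0, consistent with a strong central bias
```

A statistic like this, computed per task, is one simple way to test whether the bias differs between tasks with different typical viewing distances.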


2018 ◽  
Vol 2018 ◽  
pp. 1-8 ◽  
Author(s):  
Metin Yildiz ◽  
Hesna Özbek Ülkütaş

Some disadvantages of optical eye-tracking systems have increased interest in EOG (electrooculography)-based human-computer interaction (HCI). However, text-entry attempts using EOG have been slower than expected because the eyes must move several times to enter a single character. In order to improve the writing speed and accuracy of EOG-based text entry, a new method based on the coding of eye movements is suggested in this study. In addition, a real-time EOG-based HCI system has been developed to implement the method. In our method, all characters are encoded by single saccades in eight directions combined with different dwell times. In order to standardize dwell times and facilitate the coding process, computer-assisted voice guidance was used. A number of experiments were conducted to examine the effectiveness of the proposed method and system. By the end of the fifth trial, an experienced user was able to write on average 13.2 wpm (5 letters = 1 word) with 100% accuracy using the developed system. The results of our experiments show that text entry with the eyes can be done quickly and efficiently with the proposed method and system.
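The coding scheme (one saccade in one of eight directions, disambiguated by dwell time) amounts to a small codebook lookup. The sketch below is purely illustrative: the paper's actual character assignments, dwell levels and timing are not reproduced, and the four-level quantization is an assumption.

```python
# Map a detected (direction, dwell-level) pair to a character.
DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
DWELL_LEVELS = 4  # assumed; dwell bins could be paced by voice prompts

# 8 directions x 4 dwell levels = 32 codes: enough for A-Z plus a few extras.
ALPHABET = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ. ,?!_")
CODEBOOK = {
    (d, level): ALPHABET[di * DWELL_LEVELS + level]
    for di, d in enumerate(DIRECTIONS)
    for level in range(DWELL_LEVELS)
}

def decode(direction, dwell_ms, level_ms=500.0):
    """Translate one saccade into a character from its direction and how long
    the gaze dwelt at the target (quantized into DWELL_LEVELS bins)."""
    level = min(int(dwell_ms // level_ms), DWELL_LEVELS - 1)
    return CODEBOOK[(direction, level)]

# A north saccade with the shortest dwell decodes to the first code:
decode("N", 200)   # -> 'A'
decode("NE", 700)  # -> 'F'
```

The speed advantage claimed in the abstract follows directly from this structure: one saccade plus one dwell yields a full character, instead of the several movements per character that earlier EOG entry schemes required.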

