Development of Training Game Application using Eye-gaze Control Technology to Support Employment of Physically Challenged People

Author(s):  
Ryoya Goto ◽  
Kimiyasu Kiyota ◽  
Manabu Shimakawa ◽  
Koichiro Watanabe ◽  
Chiharu Okuma

BMC Neurology ◽
2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Petra Karlsson ◽  
Tom Griffiths ◽  
Michael T. Clarke ◽  
Elegast Monbaliu ◽  
Kate Himmelmann ◽  
...  

Abstract: Background: Limited research exists to guide clinical decisions about trialling, selecting, implementing and evaluating eye-gaze control technology. This paper reports the outcomes of a Delphi study conducted to build international stakeholder consensus to inform decision making about trialling and implementing eye-gaze control technology with people with cerebral palsy. Methods: A three-round online Delphi survey was conducted. In Round 1, 126 stakeholders responded to questions identified through an international stakeholder Advisory Panel and systematic reviews. In Round 2, 63 respondents rated the importance of 200 statements generated in Round 1. In Round 3, 41 respondents rated the importance of the 105 highest-ranked statements retained from Round 2. Results: Stakeholders achieved consensus on 94 of the original 200 statements. These statements related to person factors, support networks, the environment, and technical aspects to consider during assessment, trial, implementation and follow-up. The findings reinforced the importance of an individualised approach and showed that information gathered from the user, their support network and professionals is central when measuring outcomes. Information required to support an application for funding was also obtained. Conclusion: This Delphi study identified issues unique to eye-gaze control technology and will enhance its implementation with people with cerebral palsy.
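The retention step between Delphi rounds (200 statements rated in Round 2, the 105 highest-ranked carried into Round 3) is essentially a threshold filter over ratings. As a minimal sketch only — the abstract does not state the consensus rule, so the median-based criterion, the 1-5 scale, and the function names below are assumptions for illustration — such a step might look like this:

```python
# Illustrative sketch of one Delphi filtering round. NOT the authors' actual
# criteria: the consensus rule is not given in the abstract, so the median
# threshold and 1-5 Likert scale here are assumed for illustration only.
from statistics import median

def filter_round(ratings: dict[str, list[int]], threshold: float = 4.0) -> list[str]:
    """Retain statements whose median importance rating meets the assumed threshold."""
    return [stmt for stmt, scores in ratings.items() if median(scores) >= threshold]

# Round 2 -> Round 3: rated statements reduced to the highest-ranked subset.
round2 = {"Statement A": [5, 4, 5, 4], "Statement B": [2, 3, 2, 3]}
print(filter_round(round2))  # ['Statement A']
```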


Technologies ◽  
2018 ◽  
Vol 6 (1) ◽  
pp. 12 ◽  
Author(s):  
Helena Hemmingsson ◽  
Gunnar Ahlsten ◽  
Helena Wandin ◽  
Patrik Rytterström ◽  
Maria Borgestig

2018 ◽  
Vol 22 (2) ◽  
pp. 134-140 ◽  
Author(s):  
Petra Karlsson ◽  
Anna Bech ◽  
Helen Stone ◽  
Cecily Vale ◽  
Suzan Griffin ◽  
...  

Author(s):  
James Kim

The purpose of this study was to examine factors that influence how people look at objects they will have to act upon while first watching others interact with them. We investigated whether including different types of task-relevant information in an observational learning task would lead participants to adapt their gaze towards the object carrying more task-relevant information. The participant watched an actor simultaneously lift and replace two objects with two hands, and was then cued to lift one of the two objects. The objects could change weight between trials. In our cue condition, participants were cued to lift the same object on every trial. In our object condition, participants were cued equally often to act on both objects; however, only one of the objects could change weight. The hypothesis in the cue condition was that participants would look significantly more at the cued object. The hypothesis in the object condition was that participants would look significantly more (i.e. adapt their gaze) at the object that changed weight. The rationale is that participants will learn to allocate their gaze significantly more towards that object in order to gain information about its properties (i.e. its weight). Pending results will indicate whether this occurred, and will have implications for understanding eye movement sequences in visually guided tasks. The outcome of this study also has implications for the mechanisms of eye gaze in social learning tasks.
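The dependent measure implied by this design is the share of gaze allocated to each object across trials. As a rough sketch under assumptions — the abstract does not describe the analysis pipeline, so the per-trial data layout (object label, fixation duration) and function name below are hypothetical — one way to compute such a gaze-allocation measure is:

```python
# Hypothetical gaze-allocation measure for one trial: the fraction of total
# fixation time spent on a given object. The data layout (label, duration in
# seconds) is assumed for illustration; the study's pipeline is not specified.
def gaze_proportion(fixations: list[tuple[str, float]], target: str) -> float:
    """Fraction of total fixation time spent on `target` within one trial."""
    total = sum(dur for _, dur in fixations)
    on_target = sum(dur for obj, dur in fixations if obj == target)
    return on_target / total if total else 0.0

# Example trial: the "left" object changes weight; does gaze adapt towards it?
trial = [("left", 0.82), ("right", 0.31), ("left", 0.45)]
print(gaze_proportion(trial, "left"))  # ~0.80
```

Tracking this proportion over successive trials would show whether gaze adapts towards the weight-changing object, as the object-condition hypothesis predicts.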


Symmetry ◽  
2018 ◽  
Vol 10 (12) ◽  
pp. 680 ◽
Author(s):  
Ethan Jones ◽  
Winyu Chinthammit ◽  
Weidong Huang ◽  
Ulrich Engelke ◽  
Christopher Lueg

Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a controller, an eye gaze tracker, and a combination of the two in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve them. More specifically, there were three tasks: the first was to move a chess piece from one square to another pre-specified square; the second was the same as the first but required more moves to complete; and the third was to move multiple pieces into a pre-defined arrangement. While gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations with regard to spatial accuracy and target selection. The multimodal setup aimed to mitigate the weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment showed that the multimodal setup improved performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
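One common way such a multimodal setup mitigates the gaze tracker's spatial inaccuracy is to let gaze propose a coarse target while a controller button confirms it. The sketch below is an assumed design for illustration only — the paper's exact fusion scheme, coordinate system, and any names used here (Square, gaze_to_square, select_target) are not taken from the abstract:

```python
# Hedged sketch of gaze-plus-controller target selection on a chessboard:
# gaze snaps to the nearest square (tolerating tracker inaccuracy), and a
# controller button commits the choice, avoiding gaze-only "Midas touch"
# misselections. Assumed design; not the paper's documented implementation.
from dataclasses import dataclass

@dataclass
class Square:
    file: int  # board column, 0-7
    rank: int  # board row, 0-7

def gaze_to_square(gaze_xy: tuple[float, float], board_px: float = 800.0) -> Square:
    """Snap a raw gaze point (in pixels) to the nearest chessboard square."""
    cell = board_px / 8
    x, y = gaze_xy
    return Square(min(int(x // cell), 7), min(int(y // cell), 7))

def select_target(gaze_xy: tuple[float, float], confirm_pressed: bool) -> Square | None:
    """Gaze supplies the candidate square; the controller button commits it."""
    return gaze_to_square(gaze_xy) if confirm_pressed else None

print(select_target((412.0, 95.0), confirm_pressed=True))  # Square(file=4, rank=0)
```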


Author(s):  
Mick Donegan ◽  
Päivi Majaranta ◽  
John Paulin Hansen ◽  
Aulikki Hyrskykari ◽  
Hirotaka Aoki ◽  
...  

Gaze-controlled computers had already been used successfully for well over two decades before the COGAIN project started. However, those actually benefitting from the technology were few compared with the number who needed it. During the five-year course of the project, systems, software and strategies were developed that made this technology potentially available, given appropriate support, to groups who might never have considered eye control a possibility. As a result, gaze control technology was opened up to a much wider group of people. In this final chapter, we sum up the research presented in this book and close by presenting some future trends and areas with high potential for applied use of eye tracking and gaze interaction.

