Eye Tracking in Framework for the Development of Games for People with Motor Disabilities

Author(s):  
Francisco M. A. de Araújo ◽  
Nuno Miguel Fonseca Ferreira ◽  
Vinicius Tomaz O. C. Mascarenhas ◽  
Jesus Abrahão Adad Filho ◽  
Paulo Roberto F. Viana Filho
2014 ◽  
Vol 644-650 ◽  
pp. 1330-1333
Author(s):  
Xiang Sheng Wang

Thanks to recent technological advances in the field of eye tracking, eye typing provides a means of communication for people with severe disabilities. Typing with gaze using dwell time has been made possible by the development of eye-tracking technologies. Recent research indicates that pupil size can serve as a subtle cue that a person is making a decision; it may therefore help to infer a user's intention to type. The present study describes the design process for improving eye typing by adding a pupil-size index to dwell-time triggering. Experimental evaluations showed that the approach was effective; design considerations for such optimization of gaze-typing interfaces are discussed.
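
As an illustration of the kind of triggering rule described above, the following minimal sketch combines a standard dwell-time threshold with a pupil-dilation cue that shortens the required dwell. It is not the authors' implementation; the class name, thresholds, and gaze-sample format are illustrative assumptions.

```python
# Minimal sketch (not the study's implementation): dwell-time key selection
# augmented with a pupil-size cue. Thresholds and sample format are assumed.
from dataclasses import dataclass
from typing import Optional


@dataclass
class GazeSample:
    key_id: Optional[str]  # key currently under the gaze point, or None
    pupil_mm: float        # pupil diameter in millimetres
    t: float               # timestamp in seconds


class PupilAwareDwellTrigger:
    def __init__(self, dwell_s=0.8, shortened_dwell_s=0.5,
                 pupil_baseline_mm=3.5, dilation_ratio=1.05):
        self.dwell_s = dwell_s                      # standard dwell time
        self.shortened_dwell_s = shortened_dwell_s  # dwell time with pupil cue
        self.pupil_baseline_mm = pupil_baseline_mm
        self.dilation_ratio = dilation_ratio        # dilation treated as intent cue
        self._current_key = None
        self._enter_t = None

    def update(self, s: GazeSample) -> Optional[str]:
        """Return the key to type, or None if no selection is triggered yet."""
        if s.key_id != self._current_key:
            # Gaze moved to a different key: restart the dwell timer.
            self._current_key, self._enter_t = s.key_id, s.t
            return None
        if self._current_key is None:
            return None
        # Pupil dilation above baseline is taken as a hint that the user
        # intends to select, so the required dwell time is shortened.
        dilated = s.pupil_mm >= self.pupil_baseline_mm * self.dilation_ratio
        threshold = self.shortened_dwell_s if dilated else self.dwell_s
        if s.t - self._enter_t >= threshold:
            self._enter_t = s.t  # reset to avoid repeated triggers on the same key
            return self._current_key
        return None
```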


2018 ◽  
Vol 13 (3) ◽  
pp. 7-20
Author(s):  
Giselle Schmidt A. D. Merino ◽  
Carmen Elena Martinez Riascos ◽  
Angelina Dias Leão Costa ◽  
Gleice Virginia Medeiros de Azambuja Elali ◽  
Eugenio Merino

Making environments that can be reached, entered, used, and experienced by anyone, including people with reduced mobility, is an increasingly important concern for professionals. Since eye tracking is an assistive technology that allows visual perception to be identified objectively, an experiment was conducted to analyze the difficulties people face in visual identification inside buildings. The goal of the article is to identify the focus of visual attention of people with motor disabilities using eye-tracking glasses. The experiment used SensoMotoric Instruments (SMI) eye-tracking glasses, and the analyses were performed with BeGaze software version 3.6. The results indicate that the lack of visual information makes it difficult for people to locate and identify the correct route for moving inside a building, and that the method reduces the subjectivity of decisions made when designing accessible environments. The tests show that participants did not fix their gaze on specific points because they kept searching for visual information inside the building, which caused disorientation and difficulty in defining the right route while moving. The experiment made it possible to validate an application of the device that can support the decision-making process of professionals who design accessible environments. In addition, it highlighted the particularities of using this assistive technology, the eye-tracking glasses, and the possibility of applying it to the analysis of various tasks, contributing to Design, Architecture, and Engineering.
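
For context, the sketch below illustrates the kind of area-of-interest (AOI) summary that fixation-analysis software such as BeGaze reports: fixation counts and total dwell time per signage region. The AOI rectangles and fixation records are illustrative assumptions, not data from the experiment.

```python
# Minimal sketch of an AOI fixation summary for wayfinding signage.
# Coordinates, durations, and AOI names are hypothetical.
from collections import defaultdict

# Each fixation: gaze position in scene-camera pixels and duration in ms.
fixations = [
    {"x": 412, "y": 230, "dur_ms": 180},
    {"x": 640, "y": 350, "dur_ms": 240},
    {"x": 418, "y": 228, "dur_ms": 310},
]

# Hypothetical AOIs: (x_min, y_min, x_max, y_max) in the same pixel space.
aois = {
    "exit_sign":  (380, 200, 460, 260),
    "floor_plan": (600, 320, 700, 420),
}


def summarize(fixations, aois):
    """Aggregate fixation count and total dwell time per AOI."""
    stats = defaultdict(lambda: {"count": 0, "dwell_ms": 0})
    for f in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= f["x"] <= x1 and y0 <= f["y"] <= y1:
                stats[name]["count"] += 1
                stats[name]["dwell_ms"] += f["dur_ms"]
    return dict(stats)


print(summarize(fixations, aois))
# e.g. {'exit_sign': {'count': 2, 'dwell_ms': 490}, 'floor_plan': {'count': 1, 'dwell_ms': 240}}
```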


2020 ◽  
Vol 63 (7) ◽  
pp. 2245-2254 ◽  
Author(s):  
Jianrong Wang ◽  
Yumeng Zhu ◽  
Yu Chen ◽  
Abdilbar Mamat ◽  
Mei Yu ◽  
...  

Purpose The primary purpose of this study was to explore the audiovisual speech perception strategies adopted by normal-hearing and deaf people in processing familiar and unfamiliar languages. Our primary hypothesis was that they would adopt different perception strategies owing to different sensory experiences at an early age, limitations of the physical device, the developmental gap of language, and other factors. Method Thirty normal-hearing adults and 33 prelingually deaf adults participated in the study. They were asked to perform judgment and listening tasks while watching videos of a Uygur–Mandarin bilingual speaker in a familiar language (Standard Chinese) or an unfamiliar language (Modern Uygur) while their eye movements were recorded by eye-tracking technology. Results Task had a slight influence on the distribution of selective attention, whereas subject group and language had significant influences. Specifically, the normal-hearing participants mainly gazed at the speaker's eyes and the deaf participants mainly gazed at the speaker's mouth throughout the experiment; moreover, while the normal-hearing participants had to stare longer at the speaker's mouth when confronted with the unfamiliar language Modern Uygur, the deaf participants did not change their attention allocation pattern when perceiving the two languages. Conclusions Normal-hearing and deaf adults adopt different audiovisual speech perception strategies: normal-hearing adults mainly look at the eyes, and deaf adults mainly look at the mouth. Additionally, language and task can also modulate the speech perception strategy.
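
To make the attention-allocation measure concrete, the following minimal sketch computes the share of gaze time falling on "eyes" versus "mouth" AOIs for each group and language condition. The data structures and values are assumptions for illustration, not the study's analysis pipeline.

```python
# Minimal sketch: proportion of gaze time on "eyes" vs "mouth" AOIs
# per (group, language) condition. Records and durations are hypothetical.
from collections import defaultdict

samples = [
    {"group": "normal_hearing", "language": "Mandarin", "aoi": "eyes",  "dur_ms": 16},
    {"group": "normal_hearing", "language": "Uygur",    "aoi": "mouth", "dur_ms": 16},
    {"group": "deaf",           "language": "Mandarin", "aoi": "mouth", "dur_ms": 16},
    {"group": "deaf",           "language": "Uygur",    "aoi": "mouth", "dur_ms": 16},
]


def gaze_proportions(samples):
    """Return, per (group, language, aoi), the share of total gaze time on that AOI."""
    totals = defaultdict(float)   # total gaze time per condition
    per_aoi = defaultdict(float)  # gaze time per condition and AOI
    for s in samples:
        key = (s["group"], s["language"])
        totals[key] += s["dur_ms"]
        per_aoi[key + (s["aoi"],)] += s["dur_ms"]
    return {
        key + (aoi,): per_aoi[key + (aoi,)] / totals[key]
        for key in totals
        for aoi in ("eyes", "mouth")
    }


for condition, share in gaze_proportions(samples).items():
    print(condition, f"{share:.2f}")
```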


Author(s):  
Pirita Pyykkönen ◽  
Juhani Järvikivi

A visual world eye-tracking study investigated the activation and persistence of implicit causality information in spoken language comprehension. We showed that people infer the implicit causality of verbs as soon as they encounter such verbs in discourse, as is predicted by proponents of the immediate focusing account ( Greene & McKoon, 1995 ; Koornneef & Van Berkum, 2006 ; Van Berkum, Koornneef, Otten, & Nieuwland, 2007 ). Interestingly, we observed activation of implicit causality information even before people encountered the causal conjunction. However, while implicit causality information was persistent as the discourse unfolded, it did not have a privileged role as a focusing cue immediately at the ambiguous pronoun when people were resolving its antecedent. Instead, our study indicated that implicit causality does not affect all referents to the same extent; rather, it interacts with other cues in the discourse, especially when one of the referents is already prominently in focus.


Author(s):  
Paul A. Wetzel ◽  
Gretchen Krueger-Anderson ◽  
Christine Poprik ◽  
Peter Bascom
