Interlocutors and Interactions: Examining the Interactions Between Students With Complex Communication Needs, Teachers, and Eye-Gaze Technology

2020 ◽  
Vol 1 ◽  
pp. 113-131
Author(s):  
Rhonda McEwen ◽  
Asiya Atcha ◽  
Michelle Lui ◽  
Roula Shimaly ◽  
Amrita Maharaj ◽  
...  

This study analyzes the role of the machine as a communicative partner for children with complex communication needs as they use eye-tracking technology to communicate. We ask: to what extent do eye-tracking devices serve as functional communication systems for children with complex communication needs? We followed 12 children with profound physical disabilities in a special education classroom over 3 months. An eye-tracking system was used to collect data from software that assisted the children in facial recognition, task identification, and vocabulary building. Results show that eye gaze served as a functional communication system for the majority of the children. We found voice affect to be a strong determinant of communicative success between students and both of their communicative partners: the teachers (humans) and the technologies (machines).

2021 ◽  
pp. 016264342110193
Author(s):  
Michelle Lui ◽  
Amrita Maharaj ◽  
Roula Shimaly ◽  
Asiya Atcha ◽  
Hamza Ali ◽  
...  

This study examines interactions between students with atypical motor and speech abilities, their teachers, and eye-tracking devices under varying conditions typical of educational settings (e.g., interactional style, teacher familiarity). Twelve children (aged 4–12 years) participated in teacher-guided sessions with eye-tracking software that is designed to develop augmentative and alternative communication (AAC) skills. Assessments of expressive communication skills before and after the testing period demonstrated significant improvements. A total of 164 sessions conducted over a 3-month period were analyzed for positive engagement (e.g., gaze direction, session time) and system effectiveness (e.g., lag time, gaze registration) between integrated and non-integrated systems. Findings showed that integrated systems were associated with significantly longer sessions, more time spent looking at the screen, a greater proportion of gaze targets registered by the system, and a higher response rate to prompts from teachers. We discuss the implications for the facilitated use of eye-tracking devices in special education classrooms.
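
The session-level analysis behind these findings is reported in the paper itself; purely as a hedged illustration of how such engagement and effectiveness measures could be derived from gaze logs, the Python sketch below aggregates session duration, the share of samples spent looking at the screen, and the proportion of gaze targets registered. The log format and field names (`on_screen`, `registered`) are assumptions, not the study's actual data structures.

```python
# Illustrative sketch only: hypothetical log format, not the study's analysis pipeline.
from dataclasses import dataclass
from typing import List

@dataclass
class GazeSample:
    t: float          # timestamp in seconds
    on_screen: bool   # gaze landed on the display (assumed flag)
    registered: bool  # gaze target registered by the system (assumed flag)

def session_metrics(samples: List[GazeSample]) -> dict:
    """Aggregate simple engagement/effectiveness measures for one session."""
    n = len(samples)
    duration = samples[-1].t - samples[0].t if n > 1 else 0.0
    on_screen_share = sum(s.on_screen for s in samples) / n if n else 0.0
    registration_rate = sum(s.registered for s in samples) / n if n else 0.0
    return {"duration_s": duration,
            "on_screen_share": on_screen_share,
            "registration_rate": registration_rate}

# Fabricated example; integrated and non-integrated systems would be compared by
# computing these metrics per session and testing the group difference.
demo = [GazeSample(0.0, True, True), GazeSample(1.5, False, True)]
print(session_metrics(demo))
```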


2014 ◽  
Vol 23 (1) ◽  
pp. 42-54 ◽  
Author(s):  
Tanya Rose Curtis

As the field of telepractice grows, perceived barriers to service delivery must be anticipated and addressed in order to provide appropriate service delivery to individuals who will benefit from this model. When applying telepractice to the field of AAC, additional barriers are encountered when clients with complex communication needs are unable to speak, often present with severe quadriplegia and are unable to position themselves or access the computer independently, and/or may have cognitive impairments and limited computer experience. Some access methods, such as eye gaze, can also present technological challenges in the telepractice environment. These barriers can be overcome, and telepractice is not only practical and effective, but often a preferred means of service delivery for persons with complex communication needs.


Author(s):  
Federico Cassioli ◽  
Laura Angioletti ◽  
Michela Balconi

Abstract: Human–computer interaction (HCI) is particularly interesting because full-immersive technology may be approached differently by users, depending on the complexity of the interaction, users' personality traits, and the inclination of their motivational systems. Therefore, this study investigated the relationship between psychological factors and attention towards specific tech-interactions in a smart home system (SHS). The relation between personal psychological traits and eye-tracking metrics was investigated through self-report measures [locus of control (LoC), user experience (UX), behavioral inhibition system (BIS) and behavioral activation system (BAS)] and a wearable, wireless, near-infrared-illumination-based eye-tracking system applied to an Italian sample (n = 19). Participants were asked to activate and interact with five tech-interaction areas of differing complexity (entrance, kitchen, living room, bathroom, and bedroom) in the SHS while their eye-gaze behavior was recorded. Data showed significant differences between a simpler interaction (entrance) and a more complex one (living room) in terms of the number of fixations. Moreover, a slower time to first fixation was found for a multifaceted interaction (bathroom) compared to simpler ones (kitchen and living room). Additionally, in two interaction conditions (living room and bathroom), negative correlations were found between external LoC and fixation count, and between BAS reward responsiveness scores and fixation duration. Findings point to a two-way process in which both the complexity of the tech-interaction and subjects' personality traits shape the user's visual exploration behavior. This research contributes to the understanding of user responsiveness, adding initial insights that may help create more human-centered technology.
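
The trait–gaze correlations summarized above map onto a standard bivariate analysis; the sketch below shows one way this could look in Python, using fabricated per-participant scores rather than the study's data (scipy's `pearsonr` is used here, though the authors' actual tooling is not stated in the abstract).

```python
# Minimal sketch, not the authors' analysis code: example arrays are fabricated.
from scipy.stats import pearsonr

# Hypothetical per-participant values for one interaction area (e.g., living room).
external_loc = [12, 18, 9, 15, 20, 11, 14, 17, 10, 16]     # external LoC scores
fixation_count = [45, 30, 52, 38, 27, 49, 41, 33, 50, 36]   # fixations in that area

r, p = pearsonr(external_loc, fixation_count)
print(f"external LoC vs. fixation count: r = {r:.2f}, p = {p:.3f}")
```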


Author(s):  
Marin Vuković ◽  
Željka Car ◽  
Jasmina Ivšac Pavlisa ◽  
Lidija Mandić

Wearables may have notable potential as assistive technology for persons with various difficulties. Although smartwatches are quite popular, their niches are still emerging, and one of them clearly lies in the domain of assistive technology, owing to their communication and location features. Positioning features enable parents or caregivers to know the whereabouts of a child or a person with disabilities, thus increasing their safety. The paper presents a smartwatch tracking system for people with complex communication needs, with an emphasis on detecting the smartwatch wearer's common movement routes. The application is the result of multidisciplinary research in the area of information and communication technology as assistive technology, aiming to explore the technological possibilities of connecting new generations of mobile devices and their supplements, or wearables, in order to establish new communication and location aids.
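
The paper's actual route-detection algorithm is not detailed in this summary; as one plausible, heavily simplified illustration, recurring routes can be found by snapping location samples to a coarse grid and counting repeated cell sequences. The grid size, coordinate format, and threshold below are assumptions.

```python
# Illustrative sketch of frequent-route detection; not the system's actual algorithm.
from collections import Counter

CELL_DEG = 0.001  # grid cell size in degrees (~100 m at mid-latitudes), assumed

def to_cell(lat, lon):
    """Snap a coordinate to a coarse grid cell."""
    return (round(lat / CELL_DEG), round(lon / CELL_DEG))

def route_signature(track):
    """Reduce a GPS track (list of (lat, lon)) to its sequence of distinct grid cells."""
    cells = [to_cell(lat, lon) for lat, lon in track]
    deduped = [c for i, c in enumerate(cells) if i == 0 or c != cells[i - 1]]
    return tuple(deduped)

def common_routes(tracks, min_count=2):
    """Return route signatures that occur at least min_count times."""
    counts = Counter(route_signature(t) for t in tracks)
    return [sig for sig, n in counts.items() if n >= min_count]

# Fabricated example: two identical walks and one detour -> one common route.
walk_home = [(45.8010, 15.9710), (45.8015, 15.9720), (45.8020, 15.9730)]
detour = [(45.8010, 15.9710), (45.8005, 15.9700)]
print(common_routes([walk_home, walk_home, detour]))
```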


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Alexandros Karargyris ◽  
Satyananda Kashyap ◽  
Ismini Lourentzou ◽  
Joy T. Wu ◽  
Arjun Sharma ◽  
...  

Abstract: We developed a rich dataset of Chest X-Ray (CXR) images to assist investigators in artificial intelligence. The data were collected using an eye-tracking system while a radiologist reviewed and reported on 1,083 CXR images. The dataset contains the following aligned data: CXR image, transcribed radiology report text, the radiologist's dictation audio, and eye-gaze coordinate data. We hope this dataset can contribute to various areas of research, particularly towards explainable and multimodal deep learning/machine learning methods. Furthermore, investigators in disease classification and localization, automated radiology report generation, and human-machine interaction can benefit from these data. We report deep learning experiments that utilize the attention maps produced by the eye gaze dataset to show the potential utility of this dataset.
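
The abstract mentions attention maps derived from the eye-gaze data; a common way to build such a map, sketched below under assumed coordinate conventions (this is not the dataset's official tooling), is to accumulate duration-weighted fixation points on an image-sized grid and smooth them with a Gaussian kernel.

```python
# Sketch only: converts gaze fixations into a smoothed attention map.
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_to_attention_map(fixations, height, width, sigma=25.0):
    """fixations: iterable of (x, y, duration_s) in image pixel coordinates (assumed)."""
    heat = np.zeros((height, width), dtype=np.float32)
    for x, y, dur in fixations:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= xi < width and 0 <= yi < height:
            heat[yi, xi] += dur  # weight each fixation by its duration
    heat = gaussian_filter(heat, sigma=sigma)
    if heat.max() > 0:
        heat /= heat.max()       # normalize to [0, 1] for use as an attention map
    return heat

# Example on a fabricated 1024x1024 CXR-sized canvas.
attention = gaze_to_attention_map([(512, 400, 0.8), (530, 420, 0.4)], 1024, 1024)
print(attention.shape, attention.max())
```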


2021 ◽  
Vol 2120 (1) ◽  
pp. 012030
Author(s):  
J K Tan ◽  
W J Chew ◽  
S K Phang

Abstract: The field of Human-Computer Interaction (HCI) has developed tremendously over the past decade. Smartphones and modern computers are already the norm in society and utilize touch, voice, and typing as means of input. To further increase the variety of interaction, the human eyes are a good candidate for another form of HCI. The information that the human eyes convey is extremely useful; hence, various methods and algorithms for eye-gaze tracking have been implemented in multiple sectors. However, some eye-tracking methods require infrared rays to be projected into the eye of the user, which could potentially cause enzyme denaturation under extreme exposure. Therefore, to avoid potential harm from eye-tracking methods that utilize infrared rays, this paper proposes an image-based eye-tracking system using the Viola-Jones algorithm and the Circular Hough Transform (CHT) algorithm. The proposed method uses visible light instead of infrared rays to control the mouse pointer using the eye gaze of the user. This research aims to implement the proposed algorithm for people with hand disabilities so that they can interact with computers using their eye gaze.
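
A minimal OpenCV sketch of the Viola-Jones plus Circular Hough Transform pipeline described above follows; the webcam source, cascade choice, and Hough parameters are assumptions that would need tuning, and mapping the detected pupil position to an actual mouse pointer is only indicated in a comment.

```python
# Rough sketch of a visible-light, Viola-Jones + Circular Hough Transform eye tracker.
# Parameters are illustrative and are not taken from the paper.
import cv2

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
cap = cv2.VideoCapture(0)  # default webcam (assumed input source)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Viola-Jones (Haar cascade) detection of candidate eye regions.
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in eyes:
        roi = cv2.medianBlur(gray[y:y + h, x:x + w], 5)
        # Circular Hough Transform to find the pupil/iris inside the eye region.
        circles = cv2.HoughCircles(roi, cv2.HOUGH_GRADIENT, dp=1, minDist=w,
                                   param1=100, param2=20,
                                   minRadius=w // 10, maxRadius=w // 3)
        if circles is not None:
            cx, cy, r = circles[0][0]
            # Normalized pupil position within the eye box; a real system would
            # calibrate this to screen coordinates to drive the mouse pointer.
            print("pupil at", cx / w, cy / h)
            cv2.circle(frame, (x + int(cx), y + int(cy)), int(r), (0, 255, 0), 2)
    cv2.imshow("eye tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```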


2020 ◽  
Vol 12 (2) ◽  
pp. 43
Author(s):  
Mateusz Pomianek ◽  
Marek Piszczek ◽  
Marcin Maciejewski ◽  
Piotr Krukowski

This paper describes research on the stability of the MEMS mirror for use in eye-tracking systems. MEMS mirrors are the main element in scanning methods, which are one class of eye-tracking approaches. By changing the mirror pitch, the system can scan the area of the eye with a laser and collect the reflected signal. However, this method rests on the assumption that the inclinations are constant in each period; instability in the pitch causes errors. The aim of this work is to examine the error level caused by pitch instability at different working points.
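
The paper's own error analysis is experimental; as a back-of-the-envelope illustration of why pitch instability matters, the sketch below propagates a small random jitter in mirror tilt into displacement of the laser spot on a target plane, using the fact that the reflected beam rotates by twice the mirror angle. All numbers are assumed for illustration only.

```python
# Illustrative simulation; not the paper's measurement setup or data.
import numpy as np

D = 0.05                          # mirror-to-target distance in meters (assumed)
theta_nominal = np.deg2rad(5.0)   # nominal mirror pitch at one working point (assumed)
jitter_sd = np.deg2rad(0.01)      # standard deviation of pitch instability (assumed)

rng = np.random.default_rng(0)
theta = theta_nominal + rng.normal(0.0, jitter_sd, size=10_000)

# The reflected beam rotates by 2*theta, so the spot position on a plane at distance D is:
x = D * np.tan(2.0 * theta)
x_nominal = D * np.tan(2.0 * theta_nominal)

rms_error = np.sqrt(np.mean((x - x_nominal) ** 2))
print(f"spot position error, RMS: {rms_error * 1e6:.1f} um")
```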


2021 ◽  
pp. 112972982098736
Author(s):  
Kaji Tatsuru ◽  
Yano Keisuke ◽  
Onishi Shun ◽  
Matsui Mayu ◽  
Nagano Ayaka ◽  
...  

Purpose: Real-time ultrasound (RTUS)-guided central venipuncture using the short-axis approach is complicated and likely to result in losing sight of the needle tip. Therefore, we focused on eye gaze in our evaluation of the differences in eye gaze between medical students and experienced participants using an eye-tracking system. Methods: Ten medical students (MS group), five residents (R group) and six pediatric surgeon fellows (F group) performed short-axis RTUS-guided venipuncture simulation using a modified vessel training system. The eye gaze was captured by the tracking system (Tobii Eye Tracker 4C) and recorded. The evaluation endpoints were the task completion time, the total time and number of occurrences of the eye-tracking marker outside the US monitor, and the success rate of venipuncture. Results: There were no significant differences in the task completion time and the total time of the tracking marker outside the US monitor. The number of occurrences of the eye-tracking marker outside the US monitor in the MS group was significantly higher than in the F group (MS group: 9.5 ± 3.4, R group: 6.0 ± 2.9, F group: 5.2 ± 1.6; p = 0.04). The success rate of venipuncture in the R group tended to be better than in the F group. Conclusion: More experienced operators let their eyes fall outside the US monitor fewer times than less experienced ones. The eye gaze was associated with the success rate of RTUS-guided venipuncture. Repeated training while considering the eye gaze seems to be pivotal for mastering RTUS-guided venipuncture.
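
The endpoint "number of occurrences of the eye-tracking marker outside the US monitor" amounts to counting transitions of the gaze point from inside to outside the monitor's screen region; the sketch below assumes a rectangular region and a plain list of gaze coordinates, as an illustration rather than the study's actual processing.

```python
# Sketch only: counts how many times gaze leaves a rectangular US-monitor region.
def count_excursions(gaze_points, monitor_rect):
    """gaze_points: list of (x, y); monitor_rect: (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = monitor_rect
    inside_prev = None
    excursions = 0
    for x, y in gaze_points:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if inside_prev is True and not inside:
            excursions += 1  # gaze just left the monitor
        inside_prev = inside
    return excursions

# Fabricated example: gaze leaves the monitor twice.
points = [(100, 100), (200, 150), (900, 500), (150, 120), (950, 520), (160, 130)]
print(count_excursions(points, (0, 0, 800, 450)))
```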


2018 ◽  
Vol 61 (12) ◽  
pp. 2917-2933 ◽  
Author(s):  
Matthew James Valleau ◽  
Haruka Konishi ◽  
Roberta Michnick Golinkoff ◽  
Kathy Hirsh-Pasek ◽  
Sudha Arunachalam

Purpose: We examined receptive verb knowledge in 22- to 24-month-old toddlers with a dynamic video eye-tracking test. The primary goal of the study was to examine the utility of eye-gaze measures that are commonly used to study noun knowledge for studying verb knowledge. Method: Forty typically developing toddlers participated. They viewed 2 videos side by side (e.g., girl clapping, same girl stretching) and were asked to find one of them (e.g., "Where is she clapping?"). Their eye gaze, recorded by a Tobii T60XL eye-tracking system, was analyzed as a measure of their knowledge of the verb meanings. Noun trials were included as controls. We examined correlations between eye-gaze measures and score on the MacArthur–Bates Communicative Development Inventories (CDI; Fenson et al., 1994), a standard parent-report measure of expressive vocabulary, to see how well various eye-gaze measures predicted CDI score. Results: A common measure of knowledge, a 15% increase in looking time to the target video from a baseline phase to the test phase, did correlate with CDI score but had to be operationalized differently for verbs than for nouns. A 2nd common measure, latency of 1st look to the target, correlated with CDI score for nouns, as in previous work, but did not for verbs. A 3rd measure, fixation density, correlated for both nouns and verbs, although the correlations went in different directions. Conclusions: The dynamic nature of videos depicting verb knowledge results in differences in eye gaze as compared to static images depicting nouns. An eye-tracking assessment of verb knowledge is worthwhile to develop; however, the particular dependent measures used may differ from those used for static images and nouns.
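
The three eye-gaze measures discussed above have straightforward operationalizations; the sketch below computes the increase in the proportion of looking to the target from baseline to test, the latency of the first look to the target, and a simple per-second fixation density. The record format and the per-second definition of density are assumptions, not the authors' exact operationalization.

```python
# Illustrative computation of three common eye-gaze measures; the data format is assumed.
def target_looking_proportion(samples):
    """samples: list of (t_seconds, on_target_bool) gaze samples within one phase."""
    if not samples:
        return 0.0
    return sum(on_target for _, on_target in samples) / len(samples)

def looking_time_increase(baseline, test):
    """Increase in the proportion of looking to the target from baseline to test phase."""
    return target_looking_proportion(test) - target_looking_proportion(baseline)

def latency_of_first_look(test, phase_onset):
    """Seconds from phase onset to the first on-target sample, or None if never."""
    for t, on_target in test:
        if on_target:
            return t - phase_onset
    return None

def fixation_density(fixations, duration_s):
    """Fixations per second over the phase (one simple definition of density)."""
    return len(fixations) / duration_s if duration_s > 0 else 0.0

# Fabricated example with a knowledge criterion of a 15% (0.15) increase.
baseline = [(0.0, False), (0.5, True), (1.0, False), (1.5, False)]
test = [(2.0, False), (2.5, True), (3.0, True), (3.5, True)]
print(looking_time_increase(baseline, test) >= 0.15)          # True
print(latency_of_first_look(test, phase_onset=2.0))           # 0.5 s
print(fixation_density(fixations=[1, 2, 3], duration_s=2.0))  # 1.5 per second
```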


2017 ◽  
Vol 29 (2) ◽  
pp. 262-269 ◽  
Author(s):  
Timothy Stapleton ◽  
Helen Sumin Koo

Purpose: The purpose of this paper is to investigate the effectiveness of biomotion visibility aids for nighttime bicyclists compared to other configurations via 3D eye-tracking technology in a blind between-subjects experiment. Design/methodology/approach: A total of 40 participants were randomly assigned to one of four visibility-aid conditions presented as videos: biomotion (retroreflective knee and ankle bands), non-biomotion (retroreflective vest configuration), pseudo-biomotion (vertical retroreflective stripes on the back of the legs), and control (all-black clothing). Gaze fixations on a screen were measured with a 3D eye-tracking system; coordinate data for each condition were analyzed via one-way ANOVA and Tukey's post-hoc analyses with supplementary heatmaps. Post-experimental questionnaires addressed participants' qualitative assessments. Findings: Significant differences in eye-gaze location were found between the four reflective clothing design conditions in X-coordinate values (p<0.01) and Y-coordinate values (p<0.05). Practical implications: This research has the potential to further inform clothing designers and manufacturers on how to incorporate biomotion to increase bicyclist visibility and safety. Social implications: This research has the potential to benefit both drivers and nighttime bicyclists through a better understanding of how biomotion can increase visibility and safety. Originality/value: There is a lack of literature addressing how the commonly administered experimental task of recognizing bicyclists may bias participants' attention and natural driving state. Eye tracking has the potential to implicitly determine attention and visibility, free of such biases to attention. A new retroreflective visibility-aid design, pseudo-biomotion, was also introduced in this experiment.
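
The statistical comparison described in this abstract, a one-way ANOVA with Tukey's post-hoc test on gaze coordinates across four clothing conditions, maps onto standard library calls; the sketch below uses fabricated X-coordinate values, not the study's measurements.

```python
# Sketch of the analysis pattern only; the coordinate values are fabricated.
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical mean gaze X-coordinates (pixels) per participant, by condition.
biomotion     = [512, 530, 498, 505, 520, 515, 508, 525, 511, 519]
non_biomotion = [480, 472, 490, 465, 478, 483, 470, 488, 475, 469]
pseudo_bio    = [500, 495, 505, 492, 498, 503, 497, 506, 494, 501]
control       = [455, 462, 448, 460, 452, 458, 450, 465, 457, 453]

f_stat, p_value = f_oneway(biomotion, non_biomotion, pseudo_bio, control)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

values = biomotion + non_biomotion + pseudo_bio + control
groups = (["biomotion"] * 10 + ["non-biomotion"] * 10 +
          ["pseudo-biomotion"] * 10 + ["control"] * 10)
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```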

