Collaborative eye tracking based code review through real-time shared gaze visualization

2021 ◽ Vol 16 (3) ◽ Author(s): Shiwei Cheng, Jialing Wang, Xiaoquan Shen, Yijian Chen, Anind Dey

2021 ◽ Author(s): Cian Ryan, Brian O’Sullivan, Amr Elrasad, Aisling Cahill, Joe Lemley, ...

2021 ◽ Author(s): Yasith Jayawardana, Gavindya Jayawardena, Andrew T. Duchowski, Sampath Jayarathna

Author(s): Sarah D’Angelo, Bertrand Schneider

Abstract The past decade has witnessed a growing interest in using dual eye tracking to understand and support remote collaboration, especially with studies that have established the benefits of displaying gaze information for small groups. While this line of work is promising, we lack a consistent framework that researchers can use to organize and categorize studies on the effect of shared gaze on social interactions. There exists a wide variety of terminology and methods for describing attentional alignment, and researchers have used diverse techniques for designing gaze visualizations. The settings studied range from real-time peer collaboration to asynchronous viewing of eye-tracking video of an expert providing explanations. There has not been a conscious effort to synthesize and understand how these different approaches, techniques and applications impact the effectiveness of shared gaze visualizations (SGVs). In this paper, we summarize the related literature and the benefits of SGVs for collaboration, describe important terminology as well as appropriate measures for the dual eye-tracking space and discuss promising directions for future research. As eye-tracking technology becomes more ubiquitous, there is a pressing need to develop a consistent approach to the evaluation and design of SGVs. The present paper makes a first and significant step in this direction.
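The abstract does not name a specific measure of attentional alignment, but one commonly used quantity in dual eye-tracking research is gaze cross-recurrence: the proportion of time-aligned samples in which two collaborators' gaze points fall within some spatial threshold of each other. A minimal illustrative sketch follows (not from the paper; the function name and the 70-pixel radius are assumptions chosen only for the example):

    import numpy as np

    def cross_recurrence(gaze_a, gaze_b, radius_px=70):
        # gaze_a, gaze_b: (n, 2) arrays of (x, y) gaze positions in pixels,
        # already resampled onto a common timeline for the two collaborators.
        gaze_a = np.asarray(gaze_a, dtype=float)
        gaze_b = np.asarray(gaze_b, dtype=float)
        distances = np.linalg.norm(gaze_a - gaze_b, axis=1)
        # Fraction of paired samples in which the two gaze points are
        # within radius_px of each other.
        return float(np.mean(distances <= radius_px))

    # Illustrative usage with synthetic data:
    rng = np.random.default_rng(0)
    a = rng.uniform(0, 1920, size=(1000, 2))
    b = a + rng.normal(0, 40, size=(1000, 2))  # partner mostly looking nearby
    print(f"cross-recurrence: {cross_recurrence(a, b):.2f}")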


Author(s): Mohammad Norouzifard, Joanna Black, Benjamin Thompson, Reinhard Klette, Jason Turuwhenua

2020 ◽ pp. 1-10 ◽ Author(s): Bruno Gepner, Anaïs Godde, Aurore Charrier, Nicolas Carvalho, Carole Tardif

Abstract Facial movements of others during verbal and social interaction are often too rapid for many children and adults with autism spectrum disorder (ASD) to track and/or process in time, which could contribute to their face-to-face interaction peculiarities. Here we measure the effect of reducing the speed of a speaker's facial dynamics on the visual exploration of the face by children with ASD. Twenty-three children with ASD and 29 typically developing control children matched for chronological age passively viewed a video of a speaker telling a story at various velocities, i.e., real-time speed and two slowed-down speeds. The visual scene was divided into four areas of interest (AOI): face, mouth, eyes, and outside the face. With an eye-tracking system, we measured the percentage of total fixation duration per AOI and the number and mean duration of the visual fixations made on each AOI. In children with ASD, the mean duration of visual fixations on the mouth region, which correlated with their verbal level, increased at the slowed-down velocity compared with the real-time one, a finding that parallels a result also observed in the control children. These findings strengthen the therapeutic potential of slowed facial dynamics for enhancing verbal and language abilities in children with ASD.
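The per-AOI measures described above (percentage of total fixation duration, number of fixations, mean fixation duration) can be computed directly from a list of AOI-labeled fixations. A minimal sketch, not the authors' analysis code, assuming fixations are available as (AOI, duration in ms) pairs:

    from collections import defaultdict

    AOIS = ("face", "mouth", "eyes", "outside")

    def aoi_fixation_metrics(fixations):
        # fixations: iterable of (aoi, duration_ms) pairs, i.e. fixations that
        # have already been mapped onto one of the four AOIs.
        durations = defaultdict(list)
        for aoi, duration_ms in fixations:
            durations[aoi].append(duration_ms)

        total = sum(d for ds in durations.values() for d in ds) or 1.0
        metrics = {}
        for aoi in AOIS:
            ds = durations.get(aoi, [])
            metrics[aoi] = {
                "pct_total_duration": 100.0 * sum(ds) / total,  # % of total fixation time
                "n_fixations": len(ds),                         # number of fixations
                "mean_duration_ms": sum(ds) / len(ds) if ds else 0.0,
            }
        return metrics

    # Illustrative usage:
    sample = [("mouth", 320), ("eyes", 180), ("mouth", 450), ("outside", 90)]
    print(aoi_fixation_metrics(sample))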

