Infant Eye Gaze While Viewing Dynamic Faces

2021 · Vol 11 (2) · pp. 231
Author(s):  
Lisa M. Oakes ◽  
Michaela C. DeBolt ◽  
Aaron G. Beckner ◽  
Annika T. Voss ◽  
Lisa M. Cantrell

Research using eye-tracking methods has revealed that, between 6 and 10 months of age, infants viewing faces begin to shift visual attention from the eye region to the mouth region. Moreover, this shift varies with stimulus characteristics and with infants' experience with faces and languages. The current study examined the eye movements of a racially diverse sample of 98 infants between 7.5 and 10.5 months of age as they viewed movies of White and Asian American women reciting a nursery rhyme (the auditory component of the movies was replaced with music to eliminate the influence of speech on infants' looking behavior). Using an analytic approach inspired by multiverse analysis, several measures of infants' eye gaze were examined to identify patterns that were robust across different analyses. Although infants in general preferred the lower region of the faces (i.e., the region containing the mouth), this preference depended on stimulus characteristics and was stronger for infants whose typical experience included faces of more races and for infants who were exposed to multiple languages. These results show how the richness of eye-tracking data from infants can be leveraged to add to our understanding of the factors that influence infants' visual exploration of faces.
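To illustrate the multiverse-style logic the abstract describes, the sketch below recomputes a single lower-face preference score under several analytic choices and keeps only conclusions that survive every specification. This is a minimal Python sketch under assumed data: the file name, column names, AOI definitions, and inclusion thresholds are hypothetical, not the authors' actual pipeline.

```python
# A minimal, hypothetical sketch of a multiverse-style analysis. The same
# lower-face preference score is recomputed under several analytic choices
# (area-of-interest definition, minimum-looking threshold) to see which
# patterns hold across all specifications. File, column names, and choices
# are illustrative assumptions, not the authors' actual pipeline.
from itertools import product

import pandas as pd

def lower_face_preference(df: pd.DataFrame, aoi: str, min_looking_ms: int) -> float:
    """Mean proportion of looking time directed at the lower-face AOI."""
    kept = df[df["total_looking_ms"] >= min_looking_ms]  # trial-inclusion rule
    return (kept[f"{aoi}_lower_ms"] / kept["total_looking_ms"]).mean()

gaze = pd.read_csv("infant_gaze.csv")  # hypothetical per-infant looking times

rows = []
for aoi, threshold in product(["tight_aoi", "wide_aoi"], [500, 1000, 2000]):
    rows.append({
        "aoi": aoi,
        "min_looking_ms": threshold,
        "pref": lower_face_preference(gaze, aoi, threshold),
    })

results = pd.DataFrame(rows)
print(results)
# Call the lower-face preference "robust" only if it exceeds chance (0.5)
# in every specification of the multiverse.
print("robust across specifications:", (results["pref"] > 0.5).all())
```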

Author(s):  
Federico Cassioli ◽  
Laura Angioletti ◽  
Michela Balconi

Human–computer interaction (HCI) is particularly interesting because fully immersive technology may be approached differently by users, depending on the complexity of the interaction, users' personality traits, and the inclination of their motivational systems. This study therefore investigated the relationship between psychological factors and attention toward specific tech-interactions in a smart home system (SHS). The relation between personal psychological traits and eye-tracking metrics was investigated through self-report measures [locus of control (LoC), user experience (UX), behavioral inhibition system (BIS), and behavioral activation system (BAS)] and a wearable, wireless, near-infrared-illumination-based eye-tracking system applied to an Italian sample (n = 19). Participants were asked to activate and interact with five tech-interaction areas of differing complexity (entrance, kitchen, living room, bathroom, and bedroom) in the SHS while their eye-gaze behavior was recorded. Data showed significant differences between a simpler interaction (entrance) and a more complex one (living room) in terms of number of fixations. Moreover, a slower time to first fixation was found in a multifaceted interaction (bathroom) compared with simpler ones (kitchen and living room). Additionally, in two interaction conditions (living room and bathroom), negative correlations were found between external LoC and fixation count, and between BAS reward responsiveness scores and fixation duration. The findings point to a two-way process in which both the complexity of the tech-interaction and the user's personality traits shape visual exploration behavior. This research contributes to the understanding of user responsiveness, adding initial insights that may help create more human-centered technology.
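The correlational analyses this abstract reports can be illustrated with a short sketch: per-participant fixation metrics for one interaction area are merged with self-report scores and tested for association. This is a minimal Python sketch under assumed inputs; the file and column names (e.g., external_loc, fixation_count) are hypothetical and only mirror the measures named in the abstract.

```python
# A minimal, hypothetical sketch of the reported correlational analysis:
# per-participant fixation metrics for one interaction area are merged with
# self-report scores and tested for association. File and column names
# (external_loc, fixation_count, ...) are assumptions mirroring the measures
# named in the abstract, not the study's actual pipeline.
import pandas as pd
from scipy.stats import pearsonr

gaze = pd.read_csv("shs_gaze_metrics.csv")      # one row per participant x area
scores = pd.read_csv("self_report_scores.csv")  # LoC and BIS/BAS per participant

living_room = gaze[gaze["area"] == "living_room"].merge(scores, on="participant")

# A negative r here would mirror the reported external-LoC / fixation-count
# relation in the living-room condition.
r, p = pearsonr(living_room["external_loc"], living_room["fixation_count"])
print(f"external LoC vs fixation count: r = {r:.2f}, p = {p:.3f}")
```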


2017 · Vol 8 (4) · pp. 296-307
Author(s):  
Stephanie N. Wong ◽  
Brian TaeHyuk Keum ◽  
Daniel Caffarel ◽  
Ranjana Srinivasan ◽  
Negar Morshedian ◽  
...  

2018 · Vol 65 (5) · pp. 571-585
Author(s):  
Brian TaeHyuk Keum ◽  
Jennifer L. Brady ◽  
Rajni Sharma ◽  
Yun Lu ◽  
Young Hwa Kim ◽  
...  

2014
Author(s):  
Marissa C. Floro ◽  
Hanna Chang ◽  
Bernasha Andersen ◽  
Nickecia Alder ◽  
Meghan Roche

2014
Author(s):  
Aylin E. Kaya ◽  
Alice W. Cheng ◽  
Margaux M. Grivel ◽  
Lauren Clinton ◽  
Patty Kuo ◽  
...  
