eye contact
Recently Published Documents


TOTAL DOCUMENTS

887
(FIVE YEARS 203)

H-INDEX

50
(FIVE YEARS 5)

Cognition ◽  
2022 ◽  
Vol 220 ◽  
pp. 104981
Author(s):  
Colin J. Palmer ◽  
Sophia G. Bracken ◽  
Yumiko Otsuka ◽  
Colin W.G. Clifford
Keyword(s):  

2022 ◽  
Vol 2 ◽  
Author(s):  
Shane L. Rogers ◽  
Rebecca Broadbent ◽  
Jemma Brown ◽  
Alan Fraser ◽  
Craig P. Speelman

This study evaluated participants' self-reported appraisals of social interactions with another person in virtual reality (VR), where the conversational partner was represented by a realistic motion avatar. We use the term realistic motion avatar because: (1) the avatar was modelled to look like the conversational partner it represented, and (2) full face and body motion capture was used so that the avatar mimicked the facial expressions and body language of the conversational partner in real time. We compared social interaction in VR with face-to-face interaction across two communicative contexts: (1) a getting-acquainted conversation, and (2) a structured interview in which the participant disclosed positive and negative experiences. Participants largely indicated that they preferred face-to-face over VR communication, although some did indicate a preference for VR. An analysis of post-conversation ratings indicated no significant differences between communication modes in rated enjoyment, understanding, self-disclosure, comfort, or awkwardness. The only ratings for which face-to-face interaction proved superior were perceived closeness, across both communicative contexts, and feeling understood, specifically when disclosing negative experiences. Most participants perceived frequent eye contact in both face-to-face and VR interaction, though typically more eye contact when face-to-face. Eye contact was positively associated with rated enjoyment, closeness, and comfort. Overall, our findings suggest that harnessing full face and body motion capture can make social interaction in VR very similar to face-to-face interaction. We anticipate that VR social interaction is poised to become the next major evolution in computer-mediated communication and suggest avenues for further research.
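As a rough illustration of the kind of paired comparison behind the "no significant difference" result, the sketch below runs a paired-samples t-test on invented enjoyment ratings. The sample size and rating values are assumptions for illustration, not the study's data.

```python
# Hypothetical sketch: comparing post-conversation ratings between
# face-to-face and VR conditions with a paired-samples t-test.
# All values below are invented; they are not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 30  # assumed sample size, not the study's actual N

# 1-7 Likert-style enjoyment ratings, one pair per participant
enjoyment_f2f = rng.integers(4, 8, size=n_participants).astype(float)
enjoyment_vr = enjoyment_f2f + rng.normal(0, 0.8, size=n_participants)

t_stat, p_value = stats.ttest_rel(enjoyment_f2f, enjoyment_vr)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
# A non-significant p here would mirror the paper's finding of no
# reliable difference in enjoyment between communication modes.
```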


Author(s):  
V. Onkhar ◽  
P. Bazilinskyy ◽  
D. Dodou ◽  
J.C.F. de Winter
Keyword(s):  

2021 ◽  
Author(s):  
Alice Gomez ◽  
Guillaume Lio ◽  
Manuela Costa ◽  
Angela Sirigu ◽  
Caroline Demily

Background: Williams syndrome (WS) and Autism Spectrum Disorder (ASD) are psychiatric conditions associated with atypical but opposite face-to-face interaction patterns: WS patients stare excessively at others, whereas ASD individuals avoid eye contact. Whether these behaviors result from dissociable visual processes within the occipito-temporal pathways is unknown. Using high-density electroencephalography, multivariate signal-processing algorithms, and a protocol designed to identify and extract evoked activities sensitive to facial cues, we investigated how WS (N=14), ASD (N=14), and neurotypical subjects (N=14) decode the information content of a face stimulus. Results: We found two neural components in neurotypical participants, both strongest when the eye region was projected onto the subject's fovea, simulating a direct eye-contact situation, and weakest over more distant regions, reaching a minimum when the fixated region fell outside the stimulus face. The first component peaks at 170 ms, an early signal known to be implicated in processing low-level face features. The second emerges later, at 260 ms post-stimulus onset, and is implicated in decoding salient social cues from faces. Remarkably, each component was selectively impaired in one group and preserved in the other. In WS, decoding of the 170 ms signal from our facial-feature regressor was weak, probably reflecting these patients' relatively poor processing of face morphology, while the late 260 ms component was highly significant. The reverse pattern was observed in ASD participants, who showed neurotypical-like early 170 ms evoked activity but an impaired late 260 ms signal. Conclusions: Our study reveals a dissociation between WS and ASD patients and points to different neural origins for their social impairments.
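To make the analysis style concrete, here is a minimal, self-contained sketch of time-resolved multivariate decoding of evoked EEG, the general family of methods the abstract describes. Everything below (channel count, epoch window, the eyes-on-fovea labels, the injected effect) is simulated and assumed, not taken from the paper.

```python
# Minimal sketch of time-resolved multivariate decoding of evoked EEG.
# All data are simulated; shapes and labels are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 200, 64, 100  # e.g., -100..300 ms at 250 Hz
X = rng.normal(size=(n_epochs, n_channels, n_times))
y = rng.integers(0, 2, size=n_epochs)  # 1 = eyes on fovea, 0 = off-face

# Inject a weak class difference in a mid-epoch window (illustrative only)
X[y == 1, :8, 50:65] += 0.4

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = np.array([
    cross_val_score(clf, X[:, :, t], y, cv=5).mean()  # one classifier per time point
    for t in range(n_times)
])
peak = scores.argmax()
print(f"peak decoding accuracy {scores[peak]:.2f} at sample {peak}")
```

In a real analysis the time points where accuracy rises above chance would localize condition-sensitive components, analogous to the 170 ms and 260 ms signals reported above.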


2021 ◽  
Author(s):  
Kyveli Kompatsiari ◽  
Francesco Bossi ◽  
Agnieszka Wykowska

Eye contact established by a human partner has been shown to affect various cognitive processes of the receiver. However, little is known about humans' responses to eye contact established by a humanoid robot. Here, we examined humans' oscillatory brain response to eye contact with a humanoid robot. Eye contact (or lack thereof) was embedded in a gaze-cueing task and preceded the phase of gaze-related attentional orienting. In addition to examining the effect of eye contact on the recipient, we also tested its impact on gaze-cueing effects. Results showed that participants rated eye contact as more engaging and responded with greater desynchronization of alpha-band activity in left fronto-central and central electrode clusters when the robot established eye contact with them, compared to the no-eye-contact condition. However, eye contact did not modulate gaze-cueing effects. The results are interpreted in terms of the functional roles of central alpha rhythms (potentially also interpretable as mu rhythms), including joint attention and engagement in social interaction.
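For readers unfamiliar with alpha-band desynchronization, the sketch below computes event-related desynchronization (ERD) on a simulated signal using a standard band-pass-plus-Hilbert-envelope approach. The sampling rate, epoch layout, and signal are assumptions for illustration, not the study's recordings.

```python
# Hedged sketch: alpha-band (8-12 Hz) event-related desynchronization
# (ERD) for one channel, on a simulated signal.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250  # assumed sampling rate in Hz
t = np.arange(-0.5, 1.5, 1 / fs)  # 0.5 s baseline, 1.5 s post-event
rng = np.random.default_rng(0)

# Simulated trial: ongoing 10 Hz alpha that weakens after t = 0
alpha = np.sin(2 * np.pi * 10 * t) * np.where(t < 0, 1.0, 0.6)
signal = alpha + 0.5 * rng.normal(size=t.size)

# Band-pass 8-12 Hz, then take the squared amplitude envelope as power
b, a = butter(4, [8, 12], btype="bandpass", fs=fs)
power = np.abs(hilbert(filtfilt(b, a, signal))) ** 2

baseline = power[t < 0].mean()
erd_percent = 100 * (power[t >= 0].mean() - baseline) / baseline
print(f"alpha ERD: {erd_percent:.1f}% (negative = desynchronization)")
```

A more negative ERD in the eye-contact condition than in the no-eye-contact condition would correspond to the pattern the abstract reports.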


2021 ◽  
Vol 11 (12) ◽  
pp. 1555
Author(s):  
Gianpaolo Alvari ◽  
Luca Coviello ◽  
Cesare Furlanello

The high level of heterogeneity in Autism Spectrum Disorder (ASD) and the lack of systematic measurements complicate predicting outcomes of early intervention and identifying better-tailored treatment programs. Computational phenotyping may assist therapists in monitoring child behavior through quantitative measures and in personalizing the intervention based on individual characteristics; still, real-world behavioral analysis is an ongoing challenge. For this purpose, we designed EYE-C, a system based on OpenPose and Gaze360 for fine-grained analysis of eye-contact episodes in unconstrained therapist-child interactions via a single video camera. The model was validated on video data varying in resolution and setting, achieving promising performance. We further tested EYE-C on a clinical sample of 62 preschoolers with ASD, stratifying the spectrum by eye-contact features and age. Unsupervised clustering identified three distinct sub-groups, differentiated by eye-contact dynamics and a specific clinical phenotype. Overall, this study highlights the potential of Artificial Intelligence in categorizing atypical behavior and providing translational solutions that might assist clinical practice.
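The stratification step can be illustrated with a generic clustering sketch. The feature names and synthetic values below are hypothetical, not EYE-C's actual outputs; only the sample size (62) and the idea of clustering eye-contact features together with age come from the abstract.

```python
# Illustrative sketch of spectrum stratification: cluster children by
# eye-contact features plus age, choosing k via silhouette score.
# Features and values are hypothetical, not those produced by EYE-C.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_children = 62
# columns: episodes per minute, mean episode duration (s), age (months)
features = np.column_stack([
    rng.gamma(2.0, 1.5, n_children),
    rng.gamma(1.5, 0.8, n_children),
    rng.uniform(24, 72, n_children),
])
X = StandardScaler().fit_transform(features)

for k in (2, 3, 4, 5):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: silhouette = {silhouette_score(X, labels):.2f}")
# The study reports three sub-groups; on real features, the best
# silhouette would suggest the number of clusters to retain.
```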


2021 ◽  
pp. 014544552110540
Author(s):  
Hide Okuno ◽  
Taylor Rezeppa ◽  
Tabitha Raskin ◽  
Andres De Los Reyes

Socially anxious adolescents often endure anxiety-provoking situations using safety behaviors: strategies for minimizing in-the-moment distress (e.g., avoiding eye contact, rehearsing statements before entering a conversation). Studies linking safety behaviors to impaired functioning have largely focused on adults. In a sample of 134 adolescents aged 14 to 15 years, we tested whether levels of safety behaviors among socially anxious adolescents relate to multiple domains of impaired functioning. Adolescents, parents, and research personnel completed survey measures of safety behaviors and social anxiety; adolescents and parents reported on adolescents' evaluative fears and psychosocial impairments; and adolescents participated in a set of tasks designed to simulate social interactions with same-age, unfamiliar peers. Relative to other adolescents in the sample, those high on both safety behaviors and social anxiety displayed greater psychosocial impairments, evaluative fears, and observed social-skills deficits within social interactions. These findings have important implications for assessing and treating adolescent social anxiety.
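One conventional way to test a "high on both" pattern is a moderation (interaction) model; the sketch below fits one with statsmodels on simulated scores. Variable names, scales, and effect sizes are assumptions for illustration; only the sample size (134) matches the study, and this is not necessarily the authors' analysis.

```python
# Rough sketch: do safety behaviors and social anxiety jointly relate
# to impairment? Tested via an interaction term in OLS regression.
# Simulated, standardized scores; names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 134  # matches the reported sample size
safety = rng.normal(size=n)
anxiety = rng.normal(size=n)
impairment = (0.3 * safety + 0.3 * anxiety
              + 0.4 * safety * anxiety + rng.normal(size=n))

df = pd.DataFrame(dict(safety=safety, anxiety=anxiety,
                       impairment=impairment))
# 'safety * anxiety' expands to both main effects plus the interaction
model = smf.ols("impairment ~ safety * anxiety", data=df).fit()
print(model.summary().tables[1])
# A positive safety:anxiety coefficient would echo the finding that
# adolescents high on both show the greatest impairment.
```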


2021 ◽  
Author(s):  
◽  
Miona Stamenovic

Multiple disabilities cause difficulties in the area of communication. Individuals with severe and multiple handicaps often have no verbal language as a result of serious physical impairments. This population may show little observable response, so it is difficult both to know whether they are engaged and for the person themselves to maintain engagement when involved in activities. The purpose of this study was to find out what happens in a typical music therapy session during moments of perceived engagement. Four individuals experienced in the field of multiple disabilities were invited to take part in semi-structured interviews in which they observed a half-hour video of a therapist and a student with severe and multiple handicaps participating in music therapy. Data were analyzed in two steps: first, participants observed the video and explained their reactions to it; second, the researcher interviewed the participants and transcribed their commentaries about the video. The key themes that emerged in participants' descriptions of engagement during moments in music therapy suggest it is possible to observe and identify engagement as reflected in the students' non-verbal responses, such as body movement, eye contact, and vocalizations.

