Temporal shifts in eye gaze and facial expressions independently contribute to the perceived attractiveness of unfamiliar faces

2018 ◽  
Vol 26 (10) ◽  
pp. 831-852 ◽  
Author(s):  
Pik Ki Ho ◽  
Andy Woods ◽  
Fiona N. Newell

Perception ◽  
2020 ◽  
Vol 49 (3) ◽  
pp. 330-356 ◽  
Author(s):  
Pik Ki Ho ◽  
Fiona N. Newell

We investigated whether the perceived attractiveness of expressive faces was influenced by head turn and eye gaze towards or away from the observer. In all experiments, happy faces were consistently rated as more attractive than angry faces. A head turn towards the observer, whereby a full-face view was shown, was associated with relatively higher attractiveness ratings when gaze direction was aligned with face view (Experiment 1). However, the preference for full-face views of happy faces was not affected by gaze shifts towards or away from the observer (Experiment 2a). In Experiment 3, the relative duration of each face view (front-facing or averted at 15°) during a head turn towards or away from the observer was manipulated. There was a benefit to attractiveness ratings for happy faces shown for a longer duration from the front view, regardless of the direction of the head turn. Our findings support previous studies indicating that positive expressions enhance attractiveness judgements, an effect that is further strengthened by front views of faces, whether presented during a head turn or shown statically. In sum, our findings imply a complex interaction between cues of social attention, indicated by the view of the face shown, and reward in attractiveness judgements of unfamiliar faces.


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Dina Tell ◽  
Denise Davidson ◽  
Linda A. Camras

Eye gaze direction and expression intensity effects on emotion recognition in children with autism disorder and typically developing children were investigated. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder also rated expressions with direct eye gaze, and expressions at 50% intensity, as more intense than did typically developing children. A trend was also found for sad expressions, as children with autism disorder were less accurate in recognizing sadness at 100% intensity with direct eye gaze than were typically developing children. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


Author(s):  
Kristiina Jokinen ◽  
Päivi Majaranta

In this chapter, the authors explore possibilities for using novel face and gaze tracking technology in educational applications, especially in interactive teaching agents for second language learning. They focus on non-verbal feedback that provides information about how well the speaker has understood the presented information and how well the interaction is progressing. Such feedback is important in interactive applications in general, and in educational systems it is effectively used to construct a shared context in which learning can take place: the teacher can use feedback signals to tailor the presentation to the student. This chapter surveys previous work, relevant technology, and future prospects for such multimodal interactive systems. It also sketches future educational systems that encourage students to learn foreign languages in a natural and inclusive manner, by participating in interaction using natural communication strategies.
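
A minimal sketch of the idea described above, under stated assumptions: it shows how a teaching agent might convert a stream of gaze samples into a coarse non-verbal feedback signal. The names used here (GazeSample, attention_feedback, the 0.6 thresholds) are illustrative choices, not taken from the chapter.

```python
# Hypothetical illustration: turning raw gaze samples into a coarse
# non-verbal feedback signal for an interactive teaching agent.
from dataclasses import dataclass
from typing import List


@dataclass
class GazeSample:
    t: float           # timestamp in seconds
    on_material: bool  # gaze landed on the learning material
    on_agent: bool     # gaze landed on the teaching agent's face


def attention_feedback(samples: List[GazeSample], window_s: float = 5.0) -> str:
    """Classify the learner's recent attention from the last window_s seconds of gaze."""
    if not samples:
        return "no-signal"
    t_end = samples[-1].t
    recent = [s for s in samples if t_end - s.t <= window_s]
    frac_material = sum(s.on_material for s in recent) / len(recent)
    frac_agent = sum(s.on_agent for s in recent) / len(recent)
    if frac_material > 0.6:
        return "engaged-with-material"   # keep presenting at the current pace
    if frac_agent > 0.6:
        return "awaiting-clarification"  # learner looks to the agent: rephrase or check understanding
    return "disengaged"                  # neither: slow down or prompt the learner
```

The agent would poll such a classifier periodically and adapt its presentation, which is one concrete way the shared context described in the chapter could be maintained.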


Autism ◽  
2020 ◽  
pp. 136236132095169 ◽  
Author(s):  
Roser Cañigueral ◽  
Jamie A Ward ◽  
Antonia F de C Hamilton

Communication with others relies on coordinated exchanges of social signals, such as eye gaze and facial displays. However, this can only happen when partners are able to see each other. Although previous studies report that autistic individuals have difficulties in planning eye gaze and making facial displays during conversation, evidence from real-life dyadic tasks is scarce and mixed. Across two studies, we investigate how the eye gaze and facial displays of typical and high-functioning autistic individuals are modulated by the belief in being seen and the potential to show true gaze direction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video, video-call and face-to-face. Typical participants gazed less at the confederate and produced more facial displays when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial motion patterns in autistic participants were overall similar to those of the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial displays as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.

Lay abstract: When we are communicating with other people, we exchange a variety of social signals through eye gaze and facial expressions. However, coordinated exchanges of these social signals can only happen when the people involved in the interaction are able to see each other. Although previous studies report that autistic individuals have difficulties in using eye gaze and facial expressions during social interactions, evidence from tasks that involve real face-to-face conversations is scarce and mixed. Here, we investigate how the eye gaze and facial expressions of typical and high-functioning autistic individuals are modulated by the belief in being seen by another person, and by being in a face-to-face interaction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video (no belief in being seen, no face-to-face interaction), video-call (belief in being seen, no face-to-face interaction) and face-to-face (belief in being seen, face-to-face interaction). Typical participants gazed less at the confederate and made more facial expressions when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial expression patterns in autistic participants were overall similar to those of the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial expressions as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.
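
As a rough illustration of the kind of summary such recordings allow, the sketch below computes the proportion of gaze samples that fell on the confederate, split by social context and by whether the participant was speaking or listening. The field names and record layout are assumptions for illustration, not the authors' analysis pipeline.

```python
# Hypothetical sketch: proportion of gaze samples on the partner, by condition and role.
from collections import defaultdict


def gaze_proportions(samples):
    """samples: iterable of dicts such as
    {"condition": "video-call", "speaking": True, "on_partner": False}.
    Returns {(condition, role): proportion of samples on the partner}."""
    counts = defaultdict(lambda: [0, 0])  # key -> [samples on partner, total samples]
    for s in samples:
        key = (s["condition"], "speaking" if s["speaking"] else "listening")
        counts[key][0] += int(s["on_partner"])
        counts[key][1] += 1
    return {key: on / total for key, (on, total) in counts.items()}


# Toy usage with made-up data:
toy = [
    {"condition": "face-to-face", "speaking": True, "on_partner": False},
    {"condition": "face-to-face", "speaking": True, "on_partner": True},
    {"condition": "pre-recorded video", "speaking": False, "on_partner": True},
]
print(gaze_proportions(toy))
# {('face-to-face', 'speaking'): 0.5, ('pre-recorded video', 'listening'): 1.0}
```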


1993 ◽  
Vol 162 (5) ◽  
pp. 695-698 ◽  
Author(s):  
Andrew W. Young ◽  
Ian Reid ◽  
Simon Wright ◽  
Deborah J. Hellawell

Investigations of two cases of the Capgras delusion found that both patients showed face-processing impairments encompassing identification of familiar faces, recognition of emotional facial expressions, and matching of unfamiliar faces. In neither case was there any impairment of recognition memory for words. These findings are consistent with the idea that the basis of the Capgras delusion lies in damage to neuro-anatomical pathways responsible for appropriate emotional reactions to familiar visual stimuli. The delusion would then represent the patient's attempt to make sense of the fact that these visual stimuli no longer have appropriate affective significance.


Author(s):  
Priya Seshadri ◽  
Youyi Bi ◽  
Jaykishan Bhatia ◽  
Ross Simons ◽  
Jeffrey Hartley ◽  
...  

This study is the first stage of a research program aimed at understanding differences in how people process 2D and 3D automotive stimuli, using psychophysiological tools such as galvanic skin response (GSR), eye tracking, electroencephalography (EEG), and facial expression coding, along with respondent ratings. The current study uses just one measure, eye tracking, and one stimulus format, 2D realistic renderings of vehicles, to reveal where people expect to find information about brand and other industry-relevant topics, such as sportiness. The eye-gaze data showed differences in the percentage of fixation time that people spent on different views of the cars while evaluating the “Brand” and the degree to which the cars looked “Sporty/Conservative”, “Calm/Exciting”, and “Basic/Luxurious”. The results of this work can give designers insight into where to invest their design effort when considering brand and styling cues.
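
The percentage-of-fixation-time measure reported here can be summarised from exported fixation records roughly as follows. This is a sketch with hypothetical field names and tuple layout, not the study's actual analysis code.

```python
# Hypothetical sketch: percentage of total fixation time spent on each car view,
# computed separately for each rating question (e.g. "Brand", "Sporty/Conservative").
from collections import defaultdict


def fixation_time_percentages(fixations):
    """fixations: iterable of (question, view, duration_ms) tuples,
    e.g. ("Sporty/Conservative", "front", 240).
    Returns {question: {view: percent of that question's total fixation time}}."""
    totals = defaultdict(float)
    per_view = defaultdict(lambda: defaultdict(float))
    for question, view, duration_ms in fixations:
        totals[question] += duration_ms
        per_view[question][view] += duration_ms
    return {
        question: {view: 100.0 * dur / totals[question] for view, dur in views.items()}
        for question, views in per_view.items()
    }
```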


2019 ◽  
Vol 29 (10) ◽  
pp. 1441-1451 ◽  
Author(s):  
Melina Nicole Kyranides ◽  
Kostas A. Fanti ◽  
Maria Petridou ◽  
Eva R. Kimonis

Individuals with callous-unemotional (CU) traits show deficits in facial emotion recognition. According to preliminary research, this impairment may be due to attentional neglect of people's eyes when evaluating emotionally expressive faces. However, it is unknown whether this atypical processing pattern is unique to established variants of CU traits or modifiable with intervention. This study examined facial affect recognition and gaze patterns among individuals (N = 80; M age = 19.95 years, SD = 1.01; 50% female) with primary vs. secondary CU variants. These groups were identified based on repeated measurements of conduct problems, CU traits, and anxiety assessed in adolescence and adulthood. Accuracy and number of fixations on areas of interest (forehead, eyes, and mouth) while viewing six dynamic emotions were assessed. A visual probe was used to direct attention to various parts of the face. Individuals with primary and secondary CU traits were less accurate than controls in recognizing facial expressions across all emotions. Those identified in the low-anxious primary-CU group showed reduced overall fixations to fearful and painful facial expressions compared with those in the high-anxious secondary-CU group. This difference was not specific to a region of the face (i.e., eyes or mouth). The findings point to the importance of investigating both accuracy and eye gaze fixations, since individuals in the primary and secondary groups were differentiated only in the way they attended to specific facial expressions. These findings have implications for differentiated interventions focused on improving facial emotion recognition with regard to attending to and correctly identifying emotions.
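
A minimal sketch of how fixation counts on the forehead, eye and mouth areas of interest could be aggregated per group and emotion before group comparisons of the kind reported above. The record layout and field names are assumptions for illustration, not the study's code.

```python
# Hypothetical sketch: mean number of fixations per area of interest, per group and emotion.
from collections import defaultdict


def mean_fixations_per_aoi(trials):
    """trials: iterable of dicts such as
    {"group": "primary-CU", "emotion": "fear", "aoi": "eyes", "fix_count": 4}.
    Returns {(group, emotion, aoi): mean fixation count across trials}."""
    sums = defaultdict(lambda: [0.0, 0])  # key -> [sum of fixation counts, number of trials]
    for trial in trials:
        key = (trial["group"], trial["emotion"], trial["aoi"])
        sums[key][0] += trial["fix_count"]
        sums[key][1] += 1
    return {key: total / n for key, (total, n) in sums.items()}
```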

