Reduced configural information modulates fixation patterns but does not affect emotion recognition

2021
Author(s): Yu-Fang Yang, Michel-Ange Amorim, Eric Brunet-Gouet

To address the role of diagnostic features and configural information in emotion recognition, we compared fixation patterns (on the eyes, nose, and mouth) and recognition performance for sketched faces (drawn without head contours) and their photographed counterparts. Although sketched faces are thought to induce less configural processing than photographed faces, recognition performance is equivalent when they convey the relevant diagnostic features. First-fixation patterns depended on emotion: the happy mouth was the only feature that received more fixations than the eyes and nose. Fixations on diagnostic features varied with stimulus type and emotion during the second fixation only. Sadness, happiness, and anger generated more fixations on the eyes for sketched faces, suggesting a part-based perceptual strategy; conversely, longer central fixations on photographed faces suggested more configural processing. Removing configural information (sketched faces) did not affect emotion recognition performance, presumably because participants compensated for the impoverished configuration with a part-based scanning strategy directed at the eyes.
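To make the fixation-coding step concrete, here is a minimal sketch (not the authors' pipeline) of how fixations can be assigned to rectangular areas of interest (eyes, nose, mouth) and tallied by their ordinal position in a trial; the column names, coordinates, and AOI boundaries are illustrative assumptions.

```python
# Minimal sketch: tally first and second fixations per AOI from hypothetical data.
import pandas as pd

# Each row is one fixation with screen coordinates and its ordinal position in the trial.
fixations = pd.DataFrame({
    "trial":   [1, 1, 1, 2, 2],
    "ordinal": [1, 2, 3, 1, 2],
    "x":       [310, 295, 330, 300, 340],
    "y":       [220, 350, 300, 210, 360],
})

# Hypothetical AOI bounding boxes: (x_min, x_max, y_min, y_max).
aois = {
    "eyes":  (250, 390, 180, 260),
    "nose":  (290, 350, 260, 330),
    "mouth": (280, 360, 330, 400),
}

def classify(x, y):
    """Return the AOI containing the point, or 'other'."""
    for name, (x0, x1, y0, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "other"

fixations["aoi"] = [classify(x, y) for x, y in zip(fixations.x, fixations.y)]

# Distribution of first vs. second fixations across AOIs.
print(fixations[fixations.ordinal <= 2]
      .groupby(["ordinal", "aoi"]).size().unstack(fill_value=0))
```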




2021
Author(s): Sarah McCrackin, Francesca Capozzi, Florence Mayrand, Jelena Ristic

With the widespread adoption of mask wearing, the 2020 COVID-19 pandemic highlighted the need for a deeper understanding of how facial feature obstruction affects emotion recognition. Here we asked participants (n = 120) to identify disgusted, angry, sad, neutral, surprised, happy, and fearful expressions from faces with and without masks, and examined whether recognition performance was related to their level of social competence and personality traits. Performance was reduced for all masked relative to unmasked emotions. Masks impaired recognition of expressions with diagnostic lower-face features the most (disgust, anger) and those with diagnostic upper-face features the least (fear, surprise). Recognition performance also varied at the individual level. Persons with higher overall social competence were better at identifying unmasked expressions, while persons with lower trait extraversion and higher trait agreeableness were better at recognizing masked expressions. These results reveal novel insights into the role of face features in emotion recognition and show that obscuring facial features affects social communication differently as a function of individual social competence and personality traits.
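As an illustration of how a per-emotion "mask cost" can be quantified, the following sketch computes recognition accuracy for masked versus unmasked trials from hypothetical trial-level responses; the column names and values are assumptions, not the study's materials.

```python
# Minimal sketch: per-emotion accuracy and mask cost from invented trial data.
import pandas as pd

trials = pd.DataFrame({
    "emotion": ["disgust", "disgust", "fear", "fear"],
    "masked":  [True, False, True, False],
    "correct": [0, 1, 1, 1],
})

# Mean accuracy per emotion and mask condition.
accuracy = trials.groupby(["emotion", "masked"])["correct"].mean().unstack()

# Mask cost: drop in accuracy when the face is masked.
accuracy["mask_cost"] = accuracy[False] - accuracy[True]
print(accuracy)
```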



2006, Vol 44 (12), pp. 2437-2444
Author(s): Valérian Chambon, Jean-Yves Baudouin, Nicolas Franck


2007, Vol 97 (1), pp. 14-27
Author(s): Karine Durand, Mathieu Gallay, Alix Seigneuric, Fabrice Robichon, Jean-Yves Baudouin


2018, Vol 32 (3), pp. 356-365
Author(s): Catarina C. Kordsachia, Izelle Labuschagne, Julie C. Stout


Perception, 2002, Vol 31 (10), pp. 1221-1240, doi:10.1068/p3252
Author(s): Graham J Hole, Patricia A George, Karen Eaves, Ayman Rasek

The importance of ‘configural’ processing for face recognition is now well established, but it remains unclear precisely what it entails. Across four experiments we attempted to clarify the nature of configural processing by investigating the effects of various affine transformations on the recognition of familiar faces. Experiment 1 showed that recognition was markedly impaired by inversion of faces, somewhat impaired by shearing or horizontally stretching them, but unaffected by vertical stretching of faces to twice their normal height. In experiment 2 we investigated vertical and horizontal stretching in more detail and found no effect of either transformation. Two further experiments were performed to determine whether participants were recognising stretched faces by using configural information. Experiment 3 showed that nonglobal vertical stretching of faces (stretching either the top or the bottom half while leaving the remainder undistorted) impaired recognition, implying that configural information from the stretched part of the face influenced recognition; that is, configural processing involves global facial properties. In experiment 4 we examined the effects of Gaussian blurring on recognition of undistorted and vertically stretched faces. Faces remained recognisable even when they were both stretched and blurred, implying that participants were basing their judgments on configural information from these stimuli rather than resorting to a strategy based on local featural details. The tolerance of spatial distortions in human face recognition suggests that the configural information used as a basis for face recognition is unlikely to involve information about the absolute positions of facial features relative to each other, at least not in any simple way.
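For readers who want to reproduce transformations of this kind, here is a minimal sketch using the Pillow imaging library to vertically stretch a face image to twice its height and then apply a Gaussian blur; the file name and parameter values are illustrative assumptions, not those used in the experiments.

```python
# Minimal sketch: vertical stretching and Gaussian blurring of a face image.
from PIL import Image, ImageFilter

face = Image.open("face.jpg")  # hypothetical input image
w, h = face.size

# Vertical stretch to twice the normal height (an affine transform that
# preserves the horizontal configuration of features).
stretched = face.resize((w, 2 * h))

# Gaussian blur, removing fine local featural detail while sparing the
# coarse configuration.
blurred_stretched = stretched.filter(ImageFilter.GaussianBlur(radius=8))

blurred_stretched.save("face_stretched_blurred.jpg")
```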



2010, Vol 69 (3), pp. 161-167
Author(s): Jisien Yang, Adrian Schwaninger

Configural processing has been considered the major contributor to the face inversion effect (FIE) in face recognition. However, most researchers have obtained the FIE with only one specific ratio of configural alteration, so it remains unclear whether the ratio of configural alteration itself can mediate the occurrence of the FIE. We aimed to clarify this issue by manipulating the configural information parametrically using six different ratios, ranging from 4% to 24%. Participants were asked to judge whether a pair of faces was entirely identical or different. The paired faces to be compared were presented either simultaneously (Experiment 1) or sequentially (Experiment 2). Both experiments revealed that the FIE was observed only when the ratio of configural alteration was in the intermediate range. These results indicate that even though the FIE has frequently been adopted as an index of the underlying mechanisms of face processing, it does not emerge robustly with any configural alteration but depends on the ratio of configural alteration.
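A minimal sketch of how the inversion effect could be computed separately for each alteration ratio from condition-level accuracy; the values below are invented for illustration and are not the study's data.

```python
# Minimal sketch: FIE = upright accuracy minus inverted accuracy at each ratio.
import pandas as pd

data = pd.DataFrame({
    "ratio":    [4, 8, 12, 16, 20, 24] * 2,
    "upright":  [True] * 6 + [False] * 6,
    "accuracy": [.62, .71, .83, .86, .90, .93,   # upright (invented values)
                 .60, .63, .69, .74, .86, .91],  # inverted (invented values)
})

fie = (data.pivot(index="ratio", columns="upright", values="accuracy")
           .rename(columns={True: "upright", False: "inverted"}))
fie["FIE"] = fie["upright"] - fie["inverted"]
print(fie)
```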



2019
Author(s): Alex Bertrams, Katja Schlegel

People high in autistic-like traits have been found to have difficulties recognizing emotions from nonverbal expressions. However, findings on the relationship between autism and emotion recognition are inconsistent. In the present study, we investigated whether speeded reasoning ability (reasoning performance under time pressure) moderates the inverse relationship between autistic-like traits and emotion recognition performance. We expected the negative correlation between autistic-like traits and emotion recognition to be weaker when speeded reasoning ability was high. MTurk workers (N = 217) completed the ten-item version of the Autism Spectrum Quotient (AQ-10), two emotion recognition tests using videos with sound (Geneva Emotion Recognition Test, GERT-S) and pictures (Reading the Mind in the Eyes Test, RMET), and Baddeley's Grammatical Reasoning test to measure speeded reasoning. As expected, the higher the speeded reasoning ability, the weaker the association between higher autistic-like traits and lower emotion recognition performance. These results suggest that a high ability to make quick mental inferences may (partly) compensate for difficulties with intuitive emotion recognition related to autistic-like traits.
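Moderation of this kind is typically tested with an interaction term in a regression model. Here is a minimal sketch, assuming the statsmodels library and simulated data; the variable names and effect sizes are illustrative assumptions, not the study's results.

```python
# Minimal sketch: moderated regression with an AQ x speeded-reasoning interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 217
aq = rng.normal(size=n)            # standardized autistic-like trait scores (AQ-10)
reasoning = rng.normal(size=n)     # standardized speeded-reasoning scores
# Simulated outcome with a positive AQ x reasoning interaction.
emotion_rec = -0.3 * aq + 0.2 * reasoning + 0.15 * aq * reasoning + rng.normal(size=n)

df = pd.DataFrame({"aq": aq, "reasoning": reasoning, "emotion_rec": emotion_rec})

# The interaction term tests moderation: a positive coefficient means the negative
# AQ-emotion recognition slope becomes weaker as speeded reasoning increases.
model = smf.ols("emotion_rec ~ aq * reasoning", data=df).fit()
print(model.summary())
```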


