Judgment of Facial Expressions of Emotion as a Function of Exposure Time

1984 ◽  
Vol 59 (1) ◽  
pp. 147-150 ◽  
Author(s):  
Gilles Kirouac ◽  
François Y. Doré

The purpose of this experiment was to study the accuracy of judgment of facial expressions of emotion that were displayed for very brief exposure times. Twenty university students were shown facial stimuli presented for durations ranging from 10 to 50 msec. The data showed that accuracy of judgment reached a fairly high level even at very brief exposure times and that human observers are especially competent at processing very rapid changes in facial appearance.

2020 ◽  
Author(s):  
Fernando Ferreira-Santos ◽  
Mariana R. Pereira ◽  
Tiago O. Paiva ◽  
Pedro R. Almeida ◽  
Eva C. Martins ◽  
...  

The behavioral and electrophysiological study of the emotional intensity of facial expressions of emotion has relied on an image-processing technique termed ‘morphing’ to generate realistic facial stimuli in which emotional intensity can be manipulated. This is achieved by blending neutral and emotional facial displays and treating the percentage of morphing between the two stimuli as an objective measure of emotional intensity. Here we argue that the percentage of morphing between stimuli does not provide an objective measure of emotional intensity, and we present supporting evidence from affective ratings and neural (event-related potential) responses. We show that 50% morphs created from high- or moderate-arousal stimuli differ in subjective and neural responses in a predictable way: 50% morphs are perceived as having approximately half of the emotional intensity of the original stimuli, but if the original stimuli differed in emotional intensity to begin with, then so will the morphs. We suggest a re-examination of previous studies that used percentage of morphing as a measure of emotional intensity and highlight the value of more careful experimental control of emotional stimuli and the inclusion of proper manipulation checks.
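The abstract's core point can be illustrated with a toy sketch (not the authors' code): if a morph is a pixelwise linear blend between a neutral and an emotional image, then two "50% morphs" inherit whatever intensity difference their emotional originals had. The arrays below are hypothetical stand-ins for face images.

```python
import numpy as np

def morph(neutral, emotional, percent):
    """Pixelwise linear blend; percent = 100 reproduces the emotional face."""
    alpha = percent / 100.0
    return (1.0 - alpha) * neutral + alpha * emotional

neutral = np.zeros((4, 4))           # stand-in for a neutral face image
high_arousal = np.full((4, 4), 1.0)  # original with high emotional intensity
moderate = np.full((4, 4), 0.5)      # original with only moderate intensity

# Both are "50% morphs", yet they inherit different intensities:
m_high = morph(neutral, high_arousal, 50)
m_mod = morph(neutral, moderate, 50)
print(m_high.mean(), m_mod.mean())  # 0.5 vs 0.25
```

Equal morph percentages thus do not imply equal emotional intensity across stimulus sets, which is the confound the authors flag.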


1986 ◽  
Vol 62 (2) ◽  
pp. 419-423 ◽  
Author(s):  
Gilles Kirouac ◽  
Martin Bouchard ◽  
Andrée St-Pierre

The purpose of this study was to measure the capacity of human subjects to match facial expressions of emotion with behavioral categories representing the motivational states they are supposed to illustrate. One hundred university students were shown facial stimuli that they had to classify using ethological behavioral categories. The results showed that accuracy of judgment was overall lower than what is usually found when fundamental emotional categories are used. The data also indicated that the relation between emotional expressions and behavioral tendencies is more complex than expected.


2020 ◽  
Author(s):  
Thomas Murray ◽  
Justin O'Brien ◽  
Noam Sagiv ◽  
Lucia Garrido

Face shape and surface texture are two important cues that aid the perception of facial expressions of emotion. This perception is also influenced by high-level emotion concepts. Across two studies, we used representational similarity analysis to investigate the relative roles of shape, surface, and conceptual information in the perception, categorisation, and neural representation of facial expressions. In Study 1, 50 participants completed a perceptual task designed to measure the perceptual similarity of expression pairs, and a categorical task designed to measure the confusability between expression pairs when assigning emotion labels to a face. Using representational similarity analysis, we constructed three models of the similarities between emotions, each based on distinct information: two based on stimulus cues (face shapes and surface textures) and one based on emotion concepts. Using multiple linear regression, we found that behaviour during both tasks was related to the similarity of emotion concepts. The model based on face shapes was more strongly related to behaviour in the perceptual task than in the categorical task, whereas the model based on surface textures was more strongly related to behaviour in the categorical task than in the perceptual task. In Study 2, 30 participants viewed facial expressions while undergoing fMRI, allowing for the measurement of brain representational geometries of facial expressions of emotion in three core face-responsive regions (the Fusiform Face Area, Occipital Face Area, and Superior Temporal Sulcus) and in a region involved in theory of mind (the Medial Prefrontal Cortex). Across all four regions, the representational distances between facial expression pairs were related to the similarities of emotion concepts, but not to either of the stimulus-based cues. Together, these results highlight the important top-down influence of high-level emotion concepts on both behavioural tasks and the neural representation of facial expressions.
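The analysis the abstract describes can be sketched in miniature (all data below are random placeholders, not the study's stimuli): observed pairwise dissimilarities between expressions are regressed onto candidate model dissimilarities, and the regression weights indicate each model's contribution.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 15  # e.g. 6 expressions -> 6*5/2 pairwise distances

# Three hypothetical model dissimilarity vectors (one entry per pair)
shape_model = rng.random(n_pairs)
surface_model = rng.random(n_pairs)
concept_model = rng.random(n_pairs)

# Toy "observed" behaviour, built to be mostly concept-driven
observed = 0.8 * concept_model + 0.1 * rng.random(n_pairs)

# Multiple linear regression: one beta per model, plus an intercept
X = np.column_stack([shape_model, surface_model, concept_model,
                     np.ones(n_pairs)])
betas, *_ = np.linalg.lstsq(X, observed, rcond=None)
print(betas)  # by construction, the concept beta (index 2) should dominate
```

In the actual studies, separate regressions of this form were fit to the perceptual-task, categorical-task, and fMRI representational geometries.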


2016 ◽  
Vol 37 (1) ◽  
pp. 16-23 ◽  
Author(s):  
Chit Yuen Yi ◽  
Matthew W. E. Murry ◽  
Amy L. Gentzler

Abstract. Past research suggests that transient mood influences the perception of facial expressions of emotion, but relatively little is known about how trait-level emotionality (i.e., temperament) may influence emotion perception or interact with mood in this process. Consequently, we extended earlier work by examining how temperamental dimensions of negative emotionality and extraversion were associated with the perception accuracy and perceived intensity of three basic emotions and how the trait-level temperamental effect interacted with state-level self-reported mood in a sample of 88 adults (27 men, 18–51 years of age). The results indicated that higher levels of negative mood were associated with higher perception accuracy of angry and sad facial expressions, and higher levels of perceived intensity of anger. For perceived intensity of sadness, negative mood was associated with lower levels of perceived intensity, whereas negative emotionality was associated with higher levels of perceived intensity of sadness. Overall, our findings added to the limited literature on adult temperament and emotion perception.


2006 ◽  
Author(s):  
Mark E. Hastings ◽  
June P. Tangney ◽  
Jeffrey Stuewig

2020 ◽  
Author(s):  
Joshua W Maxwell ◽  
Eric Ruthruff ◽  
Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic – the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (as used by Tomasik et al.) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free – identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when they are sufficiently unambiguous.


2018 ◽  
Author(s):  
Karel Kleisner ◽  
Šimon Pokorný ◽  
Selahattin Adil Saribay

In the present research, we took advantage of geometric morphometrics to propose a data-driven method for estimating an individual's degree of facial typicality/distinctiveness for cross-cultural (and other cross-group) comparisons. Looking like a stranger in one's home culture may be somewhat stressful, yet the same facial appearance might become advantageous within an outgroup population. To address this fit between facial appearance and cultural setting, we propose a simple measure of distinctiveness/typicality based on the position of an individual along the axis connecting the facial averages of the two populations under comparison. The more distant a face is from its ingroup population mean towards the outgroup mean, the more distinct it is (vis-à-vis the ingroup) and the more it resembles outgroup standards. We compared this new measure with an alternative measure based on distance from the outgroup mean; the new measure showed a stronger association with rated facial distinctiveness. Subsequently, we manipulated facial stimuli to reflect different levels of ingroup-outgroup distinctiveness and tested them in one of the target cultures. Perceivers were able to successfully distinguish outgroup from ingroup faces in a two-alternative forced-choice task. There was also some evidence that this task was harder when the two faces were closer along the axis connecting the facial averages of the two cultures. Future directions and potential applications of our proposed approach are discussed.
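The axis-based measure described above can be sketched as a projection (an assumed formulation, using toy 2-D coordinates rather than real landmark data): a face's position is expressed along the line joining the ingroup and outgroup average faces, with 0 at the ingroup mean and 1 at the outgroup mean.

```python
import numpy as np

def axis_position(face, ingroup_mean, outgroup_mean):
    """Scalar projection of (face - ingroup_mean) onto the inter-mean axis,
    normalised so the ingroup mean maps to 0 and the outgroup mean to 1."""
    axis = outgroup_mean - ingroup_mean
    return float(np.dot(face - ingroup_mean, axis) / np.dot(axis, axis))

ingroup = np.array([0.0, 0.0])   # toy "face space" averages
outgroup = np.array([1.0, 1.0])

print(axis_position(ingroup, ingroup, outgroup))               # 0.0
print(axis_position(outgroup, ingroup, outgroup))              # 1.0
print(axis_position(np.array([0.5, 0.5]), ingroup, outgroup))  # 0.5
```

Higher values thus index greater distinctiveness vis-à-vis the ingroup and closer resemblance to the outgroup, which is the relationship the abstract describes.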

