Categorical perception of facial expressions of emotion: Evidence from multidimensional scaling

2001 ◽  
Vol 15 (5) ◽  
pp. 633-658 ◽  
Author(s):  
David Bimler ◽  
John Kirkland


Perception ◽  
10.1068/p3155 ◽  
2001 ◽  
Vol 30 (9) ◽  
pp. 1115-1125 ◽  
Author(s):  
Eleni Kotsoni ◽  
Michelle de Haan ◽  
Mark H Johnson

Recent research indicates that adults show categorical perception of facial expressions of emotion. It is not known whether this is a basic characteristic of perception that is present from the earliest weeks of life, or whether it is one that emerges more gradually with experience in perceiving and interpreting expressions. We report two experiments designed to investigate whether young infants, like adults, show categorical perception of facial expressions. Seven-month-old infants were shown photographic-quality continua of interpolated (morphed) facial expressions derived from two prototypes of fear and happiness. In the first experiment, we used a visual-preference technique to identify the infants' category boundary between happiness and fear. In the second experiment, we used a combined familiarisation – visual-preference technique to compare infants' discrimination of pairs of expressions that were equally physically different but that did or did not cross the emotion-category boundary. The results suggest that 7-month-old infants (i) show evidence of categorical perception of facial expressions of emotion, and (ii) show persistent interest in looking at fearful expressions.
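The morphed continua described in this abstract can be illustrated with a simplified sketch. Real expression-morphing pipelines warp facial landmarks as well as blending pixels; the minimal version below, using hypothetical prototype arrays in place of photographs, just linearly interpolates pixel intensities between two prototypes:

```python
import numpy as np

# Hypothetical grayscale prototype "images" (fear and happiness),
# stand-ins for the photographic prototypes used in such studies.
fear = np.zeros((64, 64))
happy = np.ones((64, 64))

def morph_continuum(a, b, steps=9):
    """Return a list of images linearly interpolated from a to b."""
    weights = np.linspace(0.0, 1.0, steps)
    return [(1 - w) * a + w * b for w in weights]

continuum = morph_continuum(fear, happy)
print(len(continuum))       # 9 images, from 100% fear to 100% happy
print(continuum[4].mean())  # midpoint blend: 0.5
```

Because the interpolation weights are evenly spaced, each step changes the physical stimulus by an equal amount, which is what lets such experiments equate physical difference across pairs that do or do not cross the category boundary.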


2006 ◽  
Vol 9 (1) ◽  
pp. 19-31 ◽  
Author(s):  
David L. Bimler ◽  
Galina V. Paramei

The present study investigates the perception of facial expressions of emotion, and explores the relation between the configural properties of expressions and their subjective attribution. Stimuli were a male and a female series of morphed facial expressions, interpolated between prototypes of seven emotions (happiness, sadness, fear, anger, surprise, disgust, and neutral) from Ekman and Friesen (1976). Topographical properties of the stimuli were quantified using the Facial Expression Measurement (FACEM) scheme. Perceived dissimilarities between the emotional expressions were elicited using a sorting procedure and processed with multidimensional scaling. Four dimensions were retained in the reconstructed facial-expression space, with positive and negative expressions opposed along D1, while the other three dimensions were interpreted as affective attributes distinguishing clusters of expressions categorized as “Surprise-Fear,” “Anger,” and “Disgust.” Significant relationships were found between these affective attributes and objective facial measures of the stimuli. The findings support a componential explanatory scheme for expression processing, wherein each component of a facial stimulus conveys an affective value separable from its context, rather than a categorical-gestalt scheme. The findings further suggest that configural information is closely involved in the decoding of affective attributes of facial expressions. Configural measures are also suggested as a common ground for dimensional as well as categorical perception of emotional faces.
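A minimal sketch of the multidimensional-scaling step may help: given a matrix of perceived dissimilarities, classical (Torgerson) MDS recovers coordinates whose inter-point distances approximate those dissimilarities. The labels and matrix values below are invented for illustration, not data from the study, and this sketch retains two dimensions where the study retained four:

```python
import numpy as np

# Toy dissimilarity matrix for four hypothetical expressions (invented
# values); entry [i, j] is the perceived dissimilarity between
# expressions i and j, e.g. from a sorting task.
labels = ["happy", "sad", "fear", "anger"]
D = np.array([
    [0.0, 0.8, 0.9, 0.7],
    [0.8, 0.0, 0.4, 0.5],
    [0.9, 0.4, 0.0, 0.3],
    [0.7, 0.5, 0.3, 0.0],
])

def classical_mds(D, k=2):
    """Embed a symmetric dissimilarity matrix in k dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]    # keep the k largest
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

coords = classical_mds(D, k=2)
print(coords.shape)  # (4, 2): one 2-D point per expression
```

The dimensions of the resulting configuration are then interpreted post hoc, as the study does with its valence (D1) and cluster-separating attributes.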


2000 ◽  
Vol 90 (1) ◽  
pp. 291-298 ◽  
Author(s):  
Tokihiro Ogawa ◽  
Naoto Suzuki

In our 1999 report, we examined robustness of a two-dimensional structure of facial expressions of emotion under the condition of some perceptual ambiguity, using a stereoscope. The current study aimed to replicate and extend the previous work by adding facial photographs of different persons and by measuring participants' perception of stereoscopically presented faces. Multidimensional scaling provided a two-dimensional configuration of facial expressions comparable with the previous studies. Although binocular rivalry was a less frequent phenomenon, it was suggested that the distances between facial expressions in the derived space were a contributing factor in eliciting binocular rivalry.


Perception ◽  
1997 ◽  
Vol 26 (5) ◽  
pp. 613-626 ◽  
Author(s):  
Mary Katsikitis

Photographs (Study 1) or line-drawing representations (Study 2) of posed facial expressions and a list of emotion words (happiness, surprise, fear, disgust, anger, sadness, neutral) were presented to two groups of observers who were asked to match the photographs or line drawings, respectively, with the emotion categories provided. A multidimensional-scaling procedure was applied to the judgment data. Two dimensions were revealed: pleasantness – unpleasantness and upper-face – lower-face dominance. Furthermore, the similarity shown by the two-dimensional structures derived first from the judgments of photographs and second from the line drawings suggests that line drawings are a viable alternative to photographs in facial-expression research.


2016 ◽  
Vol 37 (1) ◽  
pp. 16-23 ◽  
Author(s):  
Chit Yuen Yi ◽  
Matthew W. E. Murry ◽  
Amy L. Gentzler

Past research suggests that transient mood influences the perception of facial expressions of emotion, but relatively little is known about how trait-level emotionality (i.e., temperament) may influence emotion perception or interact with mood in this process. Consequently, we extended earlier work by examining how temperamental dimensions of negative emotionality and extraversion were associated with the perception accuracy and perceived intensity of three basic emotions and how the trait-level temperamental effect interacted with state-level self-reported mood in a sample of 88 adults (27 men, 18–51 years of age). The results indicated that higher levels of negative mood were associated with higher perception accuracy of angry and sad facial expressions, and higher levels of perceived intensity of anger. For perceived intensity of sadness, negative mood was associated with lower levels of perceived intensity, whereas negative emotionality was associated with higher levels of perceived intensity of sadness. Overall, our findings added to the limited literature on adult temperament and emotion perception.


2006 ◽  
Author(s):  
Mark E. Hastings ◽  
June P. Tangney ◽  
Jeffrey Stuewig

2020 ◽  
Author(s):  
Joshua W Maxwell ◽  
Eric Ruthruff ◽  
Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic – the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (which were used by Tomasik et al., 2009) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free – identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when sufficiently unambiguous.

