Toddlers Categorically Perceive Emotional Facial Expressions Along a Happy–Sad Continuum

Perception ◽  
2020 ◽  
Vol 49 (8) ◽  
pp. 822-834
Author(s):  
Vivian Lee ◽  
Natasha Da Silva ◽  
M. D. Rutherford

Adults perceive a continuum of emotional facial expressions categorically rather than continuously. Categorical perception is thought to be adaptive and functional, allowing for inferences that inform behavior. To date, studies have demonstrated categorical perception of some emotional facial expressions in infants. However, a recent study reported that 12-month-old infants do not perceive facial emotional expressions categorically across a happy–sad continuum. In contrast, toddlers at 3.5 years of age appear to use categorical perception along the happy–sad continuum. Using a novel paradigm that combined a looking-time discrimination task with an explicit identification task, this study measured 26-month-olds' identification of faces and their ability to discriminate between faces along a happy–sad continuum. Results suggest that 26-month-olds perceive facial expressions categorically along the happy–sad continuum.

2021 ◽  
Vol 15 ◽  
Author(s):  
E. Darcy Burgund

Major theories of hemisphere asymmetries in facial expression processing predict right hemisphere dominance for negative facial expressions of disgust, fear, and sadness; however, some studies observe left hemisphere dominance for one or more of these expressions. Research suggests that tasks requiring the identification of six basic emotional facial expressions (angry, disgusted, fearful, happy, sad, and surprised) are more likely to produce left hemisphere involvement than tasks that do not require expression identification. The present research investigated this possibility in two experiments that presented six basic emotional facial expressions to the right or left hemisphere using a divided-visual-field paradigm. In Experiment 1, participants identified emotional expressions by pushing a key corresponding to one of six labels. In Experiment 2, participants detected emotional expressions by pushing a key corresponding to whether an expression was emotional or not. In line with predictions, fearful facial expressions exhibited a left hemisphere advantage during the identification task but not during the detection task. In contrast to predictions, sad expressions exhibited a left hemisphere advantage during both identification and detection tasks. In addition, happy facial expressions exhibited a left hemisphere advantage during the detection task but not during the identification task. Only angry facial expressions exhibited a right hemisphere advantage, and this was observed only when data from both experiments were combined. Together, results highlight the influence of task demands on hemisphere asymmetries in facial expression processing and suggest a greater role for the left hemisphere in negative expressions than predicted by previous theories.


2007 ◽  
Vol 38 (10) ◽  
pp. 1475-1483 ◽  
Author(s):  
K. S. Kendler ◽  
L. J. Halberstadt ◽  
F. Butera ◽  
J. Myers ◽  
T. Bouchard ◽  
...  

Background: While the role of genetic factors in self-report measures of emotion has been frequently studied, we know little about the degree to which genetic factors influence emotional facial expressions. Method: Twenty-eight pairs of monozygotic (MZ) and dizygotic (DZ) twins from the Minnesota Study of Twins Reared Apart were shown three emotion-inducing films and their facial responses were recorded. These recordings were blindly scored by trained raters. Ranked correlations between twins were calculated controlling for age and sex. Results: Twin pairs were significantly correlated for facial expressions of general positive emotions, happiness, surprise, and anger, but not for general negative emotions, sadness, disgust, or average emotional intensity. MZ pairs (n=18) were more correlated than DZ pairs (n=10) for most but not all emotional expressions. Conclusions: Since these twin pairs had minimal contact with each other prior to testing, these results support significant genetic effects on the facial display of at least some human emotions in response to standardized stimuli. The small sample size resulted in estimated twin correlations with very wide confidence intervals.
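A rank correlation between co-twins that controls for age and sex can be approximated as a rank-based partial correlation. The sketch below is a minimal illustration in Python, not the authors' analysis code; the variable names and the residualization approach are assumptions. It rank-transforms the scored expression intensities, regresses out the covariates, and correlates the residuals across co-twins.

```python
# Hypothetical sketch of a rank-based twin correlation controlling for age and sex.
import numpy as np
from scipy.stats import rankdata, pearsonr

def partial_rank_twin_correlation(twin1_scores, twin2_scores, age, sex):
    """twin1_scores, twin2_scores: one scored expression measure per twin pair;
    age, sex: covariates per pair (sex coded numerically)."""
    covariates = np.column_stack([np.ones(len(age)), age, sex])

    def residualize(y):
        # Rank-transform, then remove the part explained by age and sex.
        ranks = rankdata(y)
        beta, *_ = np.linalg.lstsq(covariates, ranks, rcond=None)
        return ranks - covariates @ beta

    # Correlation of the covariate-adjusted ranks across co-twins.
    r, p = pearsonr(residualize(twin1_scores), residualize(twin2_scores))
    return r, p
```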


2002 ◽  
Vol 14 (2) ◽  
pp. 210-227 ◽  
Author(s):  
S. Campanella ◽  
P. Quinet ◽  
R. Bruyer ◽  
M. Crommelinck ◽  
J.-M. Guerit

Behavioral studies have shown that two different morphed faces perceived as reflecting the same emotional expression are harder to discriminate than two faces considered as reflecting two different ones. This advantage of between-categorical differences over within-categorical ones is classically referred to as the categorical perception effect. The temporal course of this effect on fear and happiness facial expressions has been explored through event-related potentials (ERPs). Three kinds of pairs were presented in a delayed same–different matching task: (1) two different morphed faces perceived as the same emotional expression (within-categorical differences), (2) two others reflecting two different emotions (between-categorical differences), and (3) two identical morphed faces (same faces, for methodological purposes). Following the onset of the second face in the pair, the amplitude of the bilateral occipito-temporal negativities (N170) and of the vertex positive potential (P150 or VPP) was reduced for within and same pairs relative to between pairs. This suggests a repetition priming effect. We also observed a modulation of the P3b wave, as the amplitude of the responses for the between pairs was higher than for the within and same pairs. These results indicate that the categorical perception of human facial emotional expressions has a perceptual origin in the bilateral occipito-temporal regions, whereas prior studies typically found emotion-modulated ERP components considerably later.


2016 ◽  
Vol 29 (8) ◽  
pp. 749-771 ◽  
Author(s):  
Min Hooi Yong ◽  
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces, or alternatively, heightened interest in angry and happy faces.


2020 ◽  
Author(s):  
Sjoerd Stuit ◽  
Timo Kootstra ◽  
David Terburg ◽  
Carlijn van den Boomen ◽  
Maarten van der Smagt ◽  
...  

Abstract Emotional facial expressions are important visual communication signals that indicate a sender’s intent and emotional state to an observer. As such, it is not surprising that reactions to different expressions are thought to be automatic and independent of awareness. What is surprising is that studies show inconsistent results concerning such automatic reactions, particularly when using different face stimuli. We argue that automatic reactions to facial expressions can be better explained, and better understood, in terms of quantitative descriptions of their visual features rather than in terms of the semantic labels (e.g., angry) of the expressions. Here, we focused on overall spatial frequency (SF) and localized Histograms of Oriented Gradients (HOG) features. We used machine learning classification to reveal the SF and HOG features that are sufficient for classification of the first selected face out of two simultaneously presented faces. In other words, we show which visual features predict selection between two faces. Interestingly, the identified features serve as better predictors than the semantic label of the expressions. We therefore propose that our modelling approach can further specify which visual features drive the behavioural effects related to emotional expressions, which can help solve the inconsistencies found in this line of research.
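As an illustration of this kind of feature-based modelling, the following Python sketch is a rough approximation rather than the authors' pipeline: the face arrays, trial structure, parameter values, and the choice of a linear support-vector classifier are all assumptions. It extracts HOG descriptors and a coarse spatial-frequency profile from each face of a pair and trains a classifier to predict which face is selected first.

```python
# Hypothetical sketch: predict first-face selection from HOG and spatial-frequency features.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def hog_features(image):
    """Localized Histogram of Oriented Gradients descriptor for one grayscale face."""
    return hog(image, orientations=8, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2), feature_vector=True)

def sf_features(image, n_bands=6):
    """Coarse overall spatial-frequency profile: power summed in log-spaced radial bands."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = image.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    radius = np.hypot(yy, xx)
    edges = np.logspace(0, np.log10(radius.max() + 1), n_bands + 1)
    return np.array([power[(radius >= lo) & (radius < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])

def fit_selection_classifier(face_pairs, chosen):
    """face_pairs: array of shape (n_trials, 2, H, W), grayscale face pairs;
    chosen: index (0 or 1) of the face selected first on each trial (assumed data)."""
    X = np.array([np.concatenate([hog_features(a) - hog_features(b),
                                  sf_features(a) - sf_features(b)])
                  for a, b in face_pairs])
    clf = LinearSVC(C=0.1, max_iter=5000)
    scores = cross_val_score(clf, X, chosen, cv=5)  # out-of-sample accuracy
    return clf.fit(X, chosen), scores.mean()
```

The feature differences between the two faces of a pair serve as the predictors, so the cross-validated accuracy indicates how well the purely visual descriptors, rather than the emotion labels, account for which face is selected.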


2002 ◽  
Vol 8 (1) ◽  
pp. 130-135 ◽  
Author(s):  
Jocelyn M. Keillor ◽  
Anna M. Barrett ◽  
Gregory P. Crucian ◽  
Sarah Kortenkamp ◽  
Kenneth M. Heilman

The facial feedback hypothesis suggests that facial expressions are either necessary or sufficient to produce emotional experience. Researchers have noted that the ideal test of the necessity aspect of this hypothesis would be an evaluation of emotional experience in a patient suffering from a bilateral facial paralysis; however, this condition is rare and no such report has been documented. We examined the role of facial expressions in the determination of emotion by studying a patient (F.P.) suffering from a bilateral facial paralysis. Despite her inability to convey emotions through facial expressions, F.P. reported normal emotional experience. When F.P. viewed emotionally evocative slides her reactions were not dampened relative to the normative sample. F.P. retained her ability to detect, discriminate, and image emotional expressions. These findings are not consistent with theories stating that feedback from an active face is necessary to experience emotion, or to process emotional facial expressions. (JINS, 2002, 8, 130–135.)


2018 ◽  
Vol 24 (7) ◽  
pp. 673-683 ◽  
Author(s):  
Frédérike Carrier-Toutant ◽  
Samuel Guay ◽  
Christelle Beaulieu ◽  
Édith Léveillé ◽  
Alexandre Turcotte-Giroux ◽  
...  

Objectives: Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussed athletes during an EFE identification task. Methods: Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were presented in random order, and participants identified the depicted emotion as quickly as possible during EEG acquisition. Results: Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional facial expression identification task. The study also highlighted a sex-specific suppression of N1 amplitude after concussion, which affected male athletes. Conclusions: These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing, and that the neurophysiological mechanisms underlying the processing of emotional stimuli are affected differently across sexes. (JINS, 2018, 24, 1–11)


2015 ◽  
Vol 156 ◽  
pp. 267-274 ◽  
Author(s):  
Fabien D’Hondt ◽  
Philippe de Timary ◽  
Yaelle Bruneau ◽  
Pierre Maurage

2010 ◽  
Vol 1 (3) ◽  
Author(s):  
Roy Kessels ◽  
Pieter Spee ◽  
Angelique Hendriks

Previous studies have shown deficits in the perception of static emotional facial expressions in individuals with autism spectrum disorders (ASD), but results are inconclusive. Possibly, using dynamic facial stimuli expressing emotions at different levels of intensity may produce more robust results, since these more closely resemble the expression of emotions in daily life. Thirty young adolescents with high-functioning ASD (IQ>85) and 30 age- and intelligence-matched controls (ages 12 to 15) performed the Emotion Recognition Task, in which morphs were presented on a computer screen depicting facial expressions of the six basic emotions (happiness, disgust, fear, anger, surprise, and sadness) at nine levels of emotional intensity (20–100%). The results showed no overall group difference on the ERT, apart from slightly worse performance in the ASD group on the perception of fear (p<0.03) and disgust (p<0.05). No interaction was found between emotional intensity level and group. High-functioning individuals with ASD perform similarly to matched controls on the perception of dynamic emotional facial expressions, even at low intensities of emotional expression. These findings are in agreement with other recent studies showing that emotion-perception deficits in high-functioning ASD may be less pronounced than previously thought.

