Body emotion recognition disproportionately depends on vertical orientations during childhood

2017 ◽  
Vol 42 (2) ◽  
pp. 278-283
Author(s):  
Benjamin Balas ◽  
Amanda Auen ◽  
Alyson Saville ◽  
Jamie Schmidt

Children’s ability to recognize emotional expressions from faces and bodies develops during childhood. However, the low-level features that support accurate body emotion recognition during development have not been well characterized. This is in marked contrast to facial emotion recognition, which is known to depend upon specific spatial frequency and orientation sub-bands during adulthood, biases that develop during childhood. Here, we examined whether children’s reliance on vertical vs. horizontal orientation energy for recognizing emotional expressions in static images of bodies changed during middle childhood (5 to 10 years old). We found that while children of all ages had an adult-like bias favoring vertical orientation energy, this effect was larger at younger ages. We conclude that in terms of information use, a key feature of the development of emotion recognition is improved performance with sub-optimal features for recognition – that is, learning to use less diagnostic features of the image is a slower process than learning to use more useful features.
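
Studies of this kind typically construct their stimuli by masking the image's Fourier spectrum so that only energy near one orientation survives. The sketch below is a minimal illustration of that general approach using numpy; the function name, the 45° bandwidth default, and the orientation convention are our own assumptions, not details taken from the paper.

```python
import numpy as np

def orientation_filtered(image, center_deg, bandwidth_deg=45.0):
    """Keep Fourier components within +/- bandwidth_deg/2 of center_deg.
    center_deg=0 keeps vertical image structure (whose spectral energy
    lies on the horizontal frequency axis); center_deg=90 keeps
    horizontal structure."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]   # vertical frequency axis
    fx = np.fft.fftfreq(w)[None, :]   # horizontal frequency axis
    orient = np.degrees(np.arctan2(fy, fx)) % 180.0   # orientation of each component
    # distance to the requested orientation, wrapped on the 0..180 circle
    dist = np.abs((orient - center_deg + 90.0) % 180.0 - 90.0)
    mask = dist <= bandwidth_deg / 2.0
    mask[0, 0] = True                 # preserve mean luminance (DC component)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * mask))

# Example: vertical- and horizontal-energy versions of a body image.
# img = ...  # 2-D float array (grayscale)
# vertical_only   = orientation_filtered(img, center_deg=0)
# horizontal_only = orientation_filtered(img, center_deg=90)
```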

2019 ◽  
Vol 25 (08) ◽  
pp. 884-889 ◽  
Author(s):  
Sally A. Grace ◽  
Wei Lin Toh ◽  
Ben Buchanan ◽  
David J. Castle ◽  
Susan L. Rossell

Objectives: Patients with body dysmorphic disorder (BDD) have difficulty in recognising facial emotions, and evidence suggests a specific deficit in identifying negative facial emotions, such as sadness and anger. Methods: Nineteen individuals with BDD and 21 healthy control participants completed a facial emotion recognition task in which they were asked to identify emotional expressions portrayed in neutral, happy, sad, fearful, or angry faces. Results: Compared to the healthy control participants, the BDD patients were generally less accurate in identifying all facial emotions but showed specific deficits for negative emotions. The BDD group made significantly more errors than healthy controls when identifying neutral, angry, and sad faces, and were significantly slower at identifying neutral, angry, and happy faces. Conclusions: These findings add to the previous face-processing literature in BDD, suggesting deficits in identifying negative facial emotions. There are treatment implications, as future interventions would do well to target such deficits.
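
Purely as an illustration of how per-emotion group comparisons of this kind are typically run: the sketch below computes per-participant accuracy and response time by emotion and compares groups. All column names are hypothetical, and Welch's t-test is used for illustration, not because the paper reports that test.

```python
import pandas as pd
from scipy import stats

# Hypothetical trial-level data, one row per trial. Assumed columns:
# participant, group ('BDD' or 'HC'),
# emotion ('neutral', 'happy', 'sad', 'fearful', 'angry'),
# correct (0/1), rt_ms (response time in milliseconds).
trials = pd.read_csv("fer_trials.csv")

# Per-participant, per-emotion accuracy, and mean RT on correct trials.
acc = trials.groupby(["group", "participant", "emotion"])["correct"].mean().reset_index()
rt = (trials[trials["correct"] == 1]
      .groupby(["group", "participant", "emotion"])["rt_ms"].mean().reset_index())

# Group comparison for each emotion (Welch's t-test, illustrative only).
for emotion, sub in acc.groupby("emotion"):
    bdd = sub.loc[sub["group"] == "BDD", "correct"]
    hc = sub.loc[sub["group"] == "HC", "correct"]
    t, p = stats.ttest_ind(bdd, hc, equal_var=False)
    print(f"{emotion:>8}: accuracy t = {t:.2f}, p = {p:.3f}")
```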


2019 ◽  
Vol 8 (4) ◽  
pp. 4351-4354

This paper presents an approach to automated, live facial emotion recognition using image processing and artificial intelligence (AI) techniques. Recognizing emotions as accurately as humans do is a challenging task for computer vision. Face detection plays a vital role in emotion recognition. Emotions are classified as happy, sad, disgust, angry, neutral, fear, and surprise. Other cues, such as speech, eye contact, voice frequency, and heart rate, can also be considered. Face recognition is now efficient enough for many real-time, security-oriented applications. Emotion can be detected from static images or from dynamic recordings. Features such as the eyes, nose, and mouth are extracted for face detection. The convolutional neural network (CNN) then applies steps such as max-pooling (retaining the strongest extracted features) and flattening, as sketched below.
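
The pipeline described (face crops fed to a CNN with convolution, max-pooling, and flattening stages, classifying seven emotions) maps directly onto a standard image-classification network. A minimal sketch in Keras, assuming 48x48 grayscale face crops; the layer sizes and input shape are illustrative choices, not the authors' architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

EMOTIONS = ["happy", "sad", "disgust", "angry", "neutral", "fear", "surprise"]

# Illustrative architecture: stacked convolution + max-pooling blocks,
# then flattening and a softmax head over the seven emotion classes.
model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),          # 48x48 grayscale face crop (assumed)
    layers.Conv2D(32, 3, activation="relu"),  # local feature extraction
    layers.MaxPooling2D(2),                   # max-pooling: keep strongest responses
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Flatten(),                         # flattening before the dense layers
    layers.Dense(128, activation="relu"),
    layers.Dense(len(EMOTIONS), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```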


2016 ◽  
Vol 16 (12) ◽  
pp. 1409
Author(s):  
Jamie Schmidt ◽  
Amanda Auen ◽  
Benjamin Balas

Autism ◽  
2019 ◽  
Vol 24 (1) ◽  
pp. 258-262 ◽  
Author(s):  
Melissa H Black ◽  
Nigel TM Chen ◽  
Ottmar V Lipp ◽  
Sven Bölte ◽  
Sonya Girdler

While altered gaze behaviour during facial emotion recognition has been observed in autistic individuals, there remains marked inconsistency in findings, with the majority of previous research focused on the processing of basic emotional expressions. There is a need to examine whether atypical gaze during facial emotion recognition extends to more complex emotional expressions, which are experienced as part of everyday social functioning. The eye gaze of 20 autistic and 20 IQ-matched neurotypical adults was examined during a facial emotion recognition task using complex, dynamic emotion displays. Autistic adults fixated longer on the mouth region when viewing complex emotions than neurotypical adults, indicating that altered prioritization of visual information may contribute to facial emotion recognition impairment. Results confirm the need for more ecologically valid stimuli to elucidate the mechanisms underlying facial emotion recognition difficulty in autistic individuals.
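
Mouth-region dwell time in eye-tracking studies of this kind is usually computed by intersecting fixations with an area of interest (AOI). A minimal sketch, assuming fixation records with screen coordinates and durations and a hypothetical rectangular mouth AOI; real studies typically fit AOIs to facial landmarks, especially for dynamic displays.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float           # fixation position in screen coordinates
    y: float
    duration_ms: float

# Hypothetical rectangular mouth AOI (x_min, y_min, x_max, y_max);
# with dynamic stimuli the AOI would track facial landmarks per frame.
MOUTH_AOI = (420.0, 530.0, 600.0, 610.0)

def mouth_dwell_proportion(fixations):
    """Proportion of total fixation time spent inside the mouth AOI."""
    x0, y0, x1, y1 = MOUTH_AOI
    total = sum(f.duration_ms for f in fixations)
    in_aoi = sum(f.duration_ms for f in fixations
                 if x0 <= f.x <= x1 and y0 <= f.y <= y1)
    return in_aoi / total if total > 0 else 0.0
```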


2013 ◽  
Vol 61 (1) ◽  
pp. 7-15 ◽  
Author(s):  
Daniel Dittrich ◽  
Gregor Domes ◽  
Susi Loebel ◽  
Christoph Berger ◽  
Carsten Spitzer ◽  
...  

This study examines the hypothesis of an alexithymia-associated deficit in the recognition of emotional facial expressions in a clinical population. In addition, hypotheses concerning the role of specific emotion qualities and of gender differences are tested. 68 psychiatric in- and outpatients (44 women and 24 men) were assessed with the Toronto Alexithymia Scale (TAS-20), the Montgomery-Åsberg Depression Rating Scale (MADRS), the Symptom Checklist (SCL-90-R), and the Emotional Expression Multimorph Task (EEMT). The stimuli of the face recognition paradigm were facial expressions of the basic emotions according to Ekman and Friesen, arranged into sequences of gradually increasing expression intensity. Using multiple regression analysis, we examined the association between TAS-20 score and facial emotion recognition (FER). While no significant relationship between TAS-20 score and FER emerged for the total sample or for the male subsample, in the female subsample the TAS-20 score significantly predicted the total number of errors (β = .38, t = 2.055, p < 0.05) as well as errors in recognizing anger and disgust (anger: β = .40, t = 2.240, p < 0.05; disgust: β = .41, t = 2.214, p < 0.05). The TAS-20 score explained 13.3% of the variance for angry faces and 19.7% for disgusted faces. There was no association between alexithymia and the time at which participants stopped the emotional sequences to give their rating (response latency). These results support an alexithymia-associated deficit in recognizing emotional facial expressions in female participants in a heterogeneous clinical sample. This deficit could at least partly account for the difficulties that highly alexithymic individuals have in social interactions, and may thus explain a predisposition to psychiatric and psychosomatic disorders.
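
The reported analysis is a multiple regression predicting FER error counts from TAS-20 scores within subsamples. A minimal sketch with statsmodels, assuming hypothetical column names; the covariate set in the formula is illustrative only, not the exact model from the paper.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-participant data; all column names are assumptions:
# tas20 (TAS-20 total score), madrs (MADRS depression score),
# errors_anger (FER errors for angry faces), sex ('f' or 'm').
df = pd.read_csv("alexithymia_fer.csv")

# Regression within the female subsample, mirroring the reported analysis.
# To obtain standardized coefficients (the betas reported above), z-score
# the predictors and outcome before fitting.
females = df[df["sex"] == "f"]
model = smf.ols("errors_anger ~ tas20 + madrs", data=females).fit()
print(model.summary())  # coefficients, t-values, p-values, R-squared
```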

