On evidence for a dozen new basic emotions: A methodological critique.

Emotion ◽ 2020
Author(s):  
Dolichan Kollareth ◽  
John Esposito ◽  
Yiran Ma ◽  
Hiram Brownell ◽  
James A. Russell


2016 ◽ Vol 37 (1) ◽ pp. 16-23
Author(s):  
Chit Yuen Yi ◽  
Matthew W. E. Murry ◽  
Amy L. Gentzler

Abstract. Past research suggests that transient mood influences the perception of facial expressions of emotion, but relatively little is known about how trait-level emotionality (i.e., temperament) may influence emotion perception or interact with mood in this process. Consequently, we extended earlier work by examining how temperamental dimensions of negative emotionality and extraversion were associated with the perception accuracy and perceived intensity of three basic emotions and how the trait-level temperamental effect interacted with state-level self-reported mood in a sample of 88 adults (27 men, 18–51 years of age). The results indicated that higher levels of negative mood were associated with higher perception accuracy of angry and sad facial expressions, and higher levels of perceived intensity of anger. For perceived intensity of sadness, negative mood was associated with lower levels of perceived intensity, whereas negative emotionality was associated with higher levels of perceived intensity of sadness. Overall, our findings added to the limited literature on adult temperament and emotion perception.
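As an illustration of the trait-by-state moderation analysis this abstract describes, the sketch below fits a regression with a mood × temperament interaction term. It is not the authors’ actual model: the variable names, the simulated data, and the use of statsmodels are assumptions made purely for illustration.

```python
# Hypothetical sketch: does trait negative emotionality interact with
# state negative mood in predicting perception accuracy? The paper's
# actual model and variable names are not given; everything below is
# invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 88  # sample size reported in the abstract
df = pd.DataFrame({
    "negative_mood": rng.normal(size=n),
    "negative_emotionality": rng.normal(size=n),
})
# Simulated outcome: accuracy for angry faces, driven here by mood alone.
df["accuracy_anger"] = 0.3 * df["negative_mood"] + rng.normal(scale=0.5, size=n)

# The '*' expands to both main effects plus their interaction term.
model = smf.ols("accuracy_anger ~ negative_mood * negative_emotionality",
                data=df).fit()
print(model.summary())
```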


Author(s):  
Leland van den Daele ◽  
Ashley Yates ◽  
Sharon Rae Jenkins

Abstract. This project compared the relative performance of professional dancers and nondancers on the Music Apperception Test (MAT; van den Daele, 2014), then compared dancers’ performance on the MAT with that on the Thematic Apperception Test (TAT; Murray, 1943). The MAT asks respondents to “tell a story to the music” in compositions written to represent basic emotions. Dancers had significantly shorter response latency and were more fluent in storytelling than a comparison group matched for gender and age. Criterion-based evaluation of dancers’ narratives found narrative emotion consistent with music written to portray the emotion, with the majority integrating movement, sensation, and imagery. Approximately half the dancers were significantly more fluent on the MAT than the TAT, while the other half were significantly more fluent on the TAT than the MAT. Dancers who were more fluent on the MAT had a higher proportion of narratives that integrated movement and imagery compared with those more fluent on the TAT. The results were interpreted as consistent with differences observed in neurological studies of auditory and visual processing, educational studies of modality preference, and the cognitive style literature. The MAT provides an assessment tool to complement visually based performance tests in personality appraisal.


2021 ◽ Vol 5 (3) ◽ pp. 13
Author(s):  
Heting Wang ◽  
Vidya Gaddy ◽  
James Ross Beveridge ◽  
Francisco R. Ortega

The role of affect has long been studied in human–computer interaction. Unlike previous studies that focused on the seven basic emotions, this work introduced an avatar named Diana who expresses a higher level of emotional intelligence. To adapt to users’ varying affects during interaction, Diana simulates emotions with dynamic facial expressions. When two people collaborated to build blocks, their affects were recognized and labeled using the Affdex SDK, and a descriptive analysis was provided. When participants turned to collaborate with Diana, their subjective responses were collected and their task completion time was recorded. Three modes of Diana were tested: a flat-faced Diana, a Diana that used mimicry facial expressions, and a Diana that used emotionally responsive facial expressions. Twenty-one responses were collected through a five-point Likert scale questionnaire and the NASA TLX. Questionnaire results did not differ significantly across the three modes; however, the emotionally responsive Diana obtained more positive responses, and people spent the longest time with the mimicry Diana. In post-study comments, most participants perceived the facial expressions on Diana’s face as natural; four mentioned discomfort caused by the uncanny valley effect.
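The abstract does not say which statistical test was applied to the Likert-scale and NASA TLX responses. The sketch below assumes a Kruskal-Wallis comparison of ordinal ratings across the three Diana modes; the ratings themselves are invented placeholders, not the study’s data.

```python
# Hypothetical sketch: comparing five-point Likert ratings across the three
# Diana modes (flat, mimicry, emotionally responsive). The paper does not
# specify its test; Kruskal-Wallis is one common choice for ordinal ratings
# from small independent groups.
from scipy import stats

# Placeholder ratings, one list per interaction mode (values 1-5).
flat = [3, 4, 3, 2, 4, 3, 3]
mimicry = [4, 3, 4, 4, 3, 5, 3]
responsive = [4, 5, 4, 4, 5, 3, 4]

h_stat, p_value = stats.kruskal(flat, mimicry, responsive)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")
# A p-value above .05 would be consistent with the reported finding that
# questionnaire results did not differ significantly across modes.
```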


Author(s):  
Md. Sham-E-Ansari ◽  
Shaminaj Towfika Disha ◽  
Atiqul Islam Chowdhury ◽  
Md. Khairul Hasan

Author(s):  
Eleonora Cannoni ◽  
Giuliana Pinto ◽  
Anna Silvia Bombi

Abstract. This study aimed to verify whether children introduce emotional expressions into their drawings of human faces and whether a preferential expression exists; we also wanted to verify whether children’s pictorial choices change with increasing age. To this end we examined the human figure drawings made by 160 boys and 160 girls, equally divided into four age groups: 6–7, 8–9, 10–11, and 12–13 years; mean ages in months (SD in parentheses) were 83.30 (6.54), 106.14 (7.16), 130.49 (8.26), and 155.40 (6.66). Drawings were collected with the Draw-a-Man test instructions, i.e., without mentioning an emotional characterization. In light of data from previous studies of emotion drawing on request, and of the literature on preferred emotional expressions, we expected that an emotion would be portrayed even by the younger participants and that the preferred emotion would be happiness. We also expected that, with the improving ability to take into account the appearance of both mouth and eyes, other expressions would be found besides the smiling face. Data were submitted to non-parametric tests to compare the frequencies of expressions (overall and by age) and the frequencies of visual cues (overall, and by age and expression). The results confirmed that only a small number of faces were expressionless and that the most frequent emotion was happiness. However, with increasing age this representation gave way to a variety of basic emotions (sadness, fear, anger, surprise), whose representation may depend on the ability to modify the shapes of both eyes and mouth and on the changing communicative aims of the child.
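The non-parametric tests are not named in the abstract. One plausible reading is a chi-square test of independence on an expression-by-age contingency table; the sketch below uses invented counts purely to show the shape of such an analysis.

```python
# Hypothetical sketch: chi-square test of independence on a contingency
# table of drawn expressions by age group. All counts are invented for
# illustration (80 children per age group in the study).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: age groups (6-7, 8-9, 10-11, 12-13).
# Columns: happiness, other basic emotions (sadness, fear, anger,
# surprise), expressionless.
counts = np.array([
    [62, 10, 8],
    [55, 18, 7],
    [48, 26, 6],
    [40, 35, 5],
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
# A small p-value would indicate that the distribution of expressions
# changes with age, as the study reports.
```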

