complex emotion: Recently Published Documents

TOTAL DOCUMENTS: 69 (FIVE YEARS: 27)
H-INDEX: 12 (FIVE YEARS: 2)

2021 ◽  
Vol 12 ◽  
Author(s):  
Sylwia Hyniewska ◽  
Joanna Dąbrowska ◽  
Iwona Makowska ◽  
Kamila Jankowiak-Siuda ◽  
Krystyna Rymarczyk

Atypical emotion interpretation has been widely reported in individuals with borderline personality disorder (iBPD); however, empirical studies have so far reported mixed results. We suggest that discrepancies in observations of emotion interpretation by iBPD can be explained by biases related to their fear of rejection and abandonment, which should heighten sensitivity to the three moral emotions that signal social rejection: anger, disgust, and contempt. In this study, we hypothesized that iBPD would show a higher tendency to correctly interpret these three displays of social rejection and would attribute more negative valence to them. Twenty-eight inpatient iBPD and 28 healthy controls were asked to judge static and dynamic facial expressions in terms of emotions, valence, and the self-reported arousal evoked by the observed faces. Our results partially confirmed our expectations. The iBPD correctly interpreted the three unambiguous moral emotions. Contempt, a complex emotion whose facial expression is difficult to recognize, was recognized better by iBPD than by healthy controls. All negative emotions were judged more negatively by iBPD than by controls, but no group difference was observed for the neutral or positive expressions. Alexithymia and trait and state anxiety levels were controlled for in all analyses.
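The covariate control described above can be illustrated with a minimal ANCOVA-style sketch in Python; the column names and model formula are assumptions for illustration, not the authors' actual analysis code.

```python
# Hypothetical sketch of a covariate-adjusted group comparison of valence
# ratings, in the spirit of the analysis described above. Column names
# (group, emotion, valence, alexithymia, trait_anxiety, state_anxiety)
# are assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("valence_ratings.csv")  # assumed: one row per participant x display

# Group effect on valence ratings, controlling for alexithymia and
# trait/state anxiety, as the abstract reports.
model = smf.ols(
    "valence ~ C(group) * C(emotion) + alexithymia + trait_anxiety + state_anxiety",
    data=df,
).fit()
print(sm.stats.anova_lm(model, typ=2))
```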


2021 ◽  
Author(s):  
Wee Kiat Tay

Emotion analytics is the study of human behavior by analyzing people's responses when they experience different emotions. In this thesis, we investigate emotion analytics solutions that use computer vision to detect emotions automatically from facial expressions in live video. Because anxiety is an emotion that can lead to more serious conditions such as anxiety disorders and depression, we propose two hypotheses for detecting anxiety from facial expressions. The first hypothesis is that the complex emotion “anxiety” is a subset of the basic emotion “fear”. The second is that anxiety can be distinguished from fear by differences in head and eye motion. We test the first hypothesis by implementing a basic emotions detector based on the Facial Action Coding System (FACS) to detect fear in videos of anxious faces. When this proves less accurate than we would like, we implement an alternative solution based on Gabor filters; a comparison of the two shows the Gabor-based solution to be inferior. The second hypothesis is tested using scatter plots and statistical analysis of the head and eye motions in videos of fear and anxiety expressions; we find that head pitch differs significantly between fear and anxiety. To conclude the thesis, we implement a software system around the FACS-based basic emotions detector and evaluate it by comparing commercials using the emotions detected from viewers' facial expressions.
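As an illustration of the Gabor-based approach mentioned in the abstract, the sketch below extracts responses from a small Gabor filter bank with OpenCV; it is not the thesis code, and the filter parameters are assumptions.

```python
# Illustrative only: Gabor filter-bank features from a grayscale face crop,
# the kind of representation a Gabor-based emotion detector could use.
# Kernel size, orientations and wavelengths are assumed values.
import cv2
import numpy as np

def gabor_features(face_gray: np.ndarray) -> np.ndarray:
    """Mean and standard deviation of Gabor responses over a small filter bank."""
    features = []
    for theta in np.arange(0, np.pi, np.pi / 4):   # 4 orientations
        for lambd in (4.0, 8.0):                   # 2 wavelengths
            kernel = cv2.getGaborKernel((21, 21), 4.0, theta, lambd, 0.5, 0)
            response = cv2.filter2D(face_gray, cv2.CV_32F, kernel)
            features.extend([response.mean(), response.std()])
    return np.asarray(features, dtype=np.float32)

# A classifier (for example an SVM) could then be trained on these features
# to separate fear from the other basic emotions.
```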


2021 ◽  
Author(s):  
Sudhakar Mishra ◽  
Narayanan Srinivasan ◽  
Uma Shanker Tiwary

We describe the creation of an affective film dataset for researchers interested in studying a broad spectrum of emotional experiences. Two hundred and twenty-two 60-second video clips were selected based on multimedia content analysis and screened in the lab with 407 participants. In this first stage, the participants' ratings mapped onto 31 emotion categories. Based on the selection criteria, 69 audio-visual clips were retained. These affective clips were then presented to 271 participants, who rated them on rating scales and assigned them to emotion categories. The clips reliably induced 19 basic and complex emotion categories. Since the dataset comprises film clips with both Indian and Western content, it can be used effectively for cross-cultural emotion research. From the dataset, researchers can select emotional movie clips based on the ratings and quantitative measures, including the reliability measures presented in this work. We also show the continuity of emotional experiences using an advanced visualisation technique, complementing existing knowledge based on valence-arousal (V-A) space with information on how transitions among emotion categories take place.
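As a usage illustration, a researcher might filter such a dataset by ratings and reliability in a few lines of pandas; the file and column names below are hypothetical, not the dataset's actual schema.

```python
# Hypothetical selection of clips by rating and reliability thresholds.
import pandas as pd

ratings = pd.read_csv("affective_clips.csv")  # assumed per-clip summary table

selected = ratings[
    (ratings["category_agreement"] >= 0.60)   # raters agree on the intended emotion
    & (ratings["reliability"] >= 0.70)        # reliability measure above cut-off
    & (ratings["mean_arousal"] >= 4.0)        # sufficiently arousing clips
]
print(selected[["clip_id", "emotion", "mean_valence", "mean_arousal"]])
```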


Sociology ◽  
2021 ◽  
pp. 003803852110436
Author(s):  
Hanna Kara ◽  
Sirpa Wrede

This article develops sociological knowledge on daughterhood through an analysis of how separation shapes the emotional and moral dynamics of transnational daughterhood. Building on Finch, we look at daughtering as a set of concrete social practices that constitute kinship and carry the symbolic dimension of displaying the family-like character of relationships. Within this framework, we analyse how Latin American women living in Barcelona discuss their transnational family lives and filial responsibilities. We see family as finite, evolving in the past, present and future, and develop a threefold understanding of filial love as an institution imbued with formal expectations, a strong and complex emotion, and reciprocal embodied caring. We consider persisting physical separation in migration as a circumstance that demands not only practical solutions but also ongoing moral labour that sustains transnational bonds and notions of being a ‘good enough’ daughter.


2021 ◽  
pp. 1-8
Author(s):  
Brittany Cassidy ◽  
Robert Wiley ◽  
Mattea Sim ◽  
Kurt Hugenberg

PLoS ONE ◽  
2021 ◽  
Vol 16 (9) ◽  
pp. e0256503
Author(s):  
Alfonso Semeraro ◽  
Salvatore Vilella ◽  
Giancarlo Ruffo

The increasing availability of textual corpora and data fetched from social networks is fuelling a large body of work based on the model proposed by psychologist Robert Plutchik, often referred to simply as the “Plutchik Wheel”. Related research ranges from the description of annotation tasks to emotion detection tools. Visualisation of such emotions is traditionally carried out with the most popular layouts, such as bar plots or tables, which are however sub-optimal. The classic representation of Plutchik's wheel follows the principles of proximity and opposition between pairs of emotions: spatial proximity in this model is also a semantic proximity, as adjacent emotions elicit a complex emotion (a primary dyad) when triggered together; spatial opposition is a semantic opposition as well, as positive emotions are opposite to negative emotions. The most common layouts fail to preserve both features, and they also make it hard to compare different corpora at a glance. We introduce PyPlutchik, a Python module specifically designed for the visualisation of Plutchik's emotions in texts or in corpora. The PyPlutchik package is available as a GitHub repository (http://github.com/alfonsosemeraro/pyplutchik) or through the pip and conda installation commands; for any enquiry about usage or installation, feel free to contact the corresponding author. PyPlutchik draws Plutchik's flower with each emotion petal sized according to how strongly that emotion is detected or annotated in the corpus, and it also represents three degrees of intensity for each emotion. Notably, PyPlutchik also allows users to display primary, secondary, tertiary and opposite dyads in a compact, intuitive way. We substantiate our claim that PyPlutchik outperforms other classic visualisations when displaying Plutchik's emotions, and we showcase a few examples of the module's most compelling features.
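A minimal usage sketch follows, assuming the plutchik() entry point suggested by the repository; the exact function name and emotion keys should be checked against the README at http://github.com/alfonsosemeraro/pyplutchik, and the scores here are made up.

```python
# Minimal sketch: draw Plutchik's flower for a set of per-emotion scores.
from pyplutchik import plutchik  # assumed entry point

# One score per basic emotion of Plutchik's wheel (illustrative values).
emotion_scores = {
    "joy": 0.45, "trust": 0.30, "fear": 0.10, "surprise": 0.08,
    "sadness": 0.20, "disgust": 0.12, "anger": 0.25, "anticipation": 0.40,
}

# Each petal is sized after its emotion's score.
plutchik(emotion_scores)
```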


2021 ◽  
Vol 13 (1) ◽  
Author(s):  
Lucy L. Russell ◽  
Caroline V. Greaves ◽  
Rhian S. Convery ◽  
Jennifer Nicholas ◽  
Jason D. Warren ◽  
...  

Background: Current tasks measuring social cognition are usually ‘pen and paper’ tasks, have ceiling effects and include complicated test instructions that may be difficult to understand for those with cognitive impairment. We therefore aimed to develop a set of simple, instructionless, quantitative tasks of emotion recognition using eye tracking, with the subsequent aim of assessing their utility in individuals with behavioural variant frontotemporal dementia (bvFTD). Methods: Using the EyeLink 1000 Plus eye tracker, 18 individuals with bvFTD and 22 controls completed tasks of simple and complex emotion recognition that involved viewing four images (one target face (simple) or pair of eyes (complex), the others non-targets), followed by a target emotion word, and lastly the original four images alongside the emotion word. A dwell time change score was then calculated as the main outcome measure by subtracting the percentage dwell time on the target image before the emotion word appeared from the percentage dwell time on the target image after the emotion word appeared. All participants also underwent a standard cognitive battery and volumetric T1-weighted magnetic resonance imaging. Results: Analysis using a mixed effects model showed that the mean (standard deviation) dwell time change score in the target interest area was 35 (27)% for the control group compared with only 4 (18)% for the bvFTD group (p < 0.05) in the simple emotion recognition task, and 15 (26)% for the control group compared with only 2 (18)% for the bvFTD group (p < 0.05) in the complex emotion recognition task. Worse performance in the bvFTD group correlated with atrophy in the right ventromedial prefrontal and orbitofrontal cortices, brain regions previously implicated in social cognition. Conclusions: Eye tracking is a viable tool for assessing social cognition in individuals with bvFTD, being well tolerated and able to overcome some of the problems associated with standard psychometric tasks.
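The main outcome measure lends itself to a one-line formula; the sketch below restates the dwell time change score described in the Methods, with illustrative variable names.

```python
# Dwell time change score: percentage dwell time on the target image after the
# emotion word appears, minus the percentage before it appears.
def dwell_time_change_score(
    target_before_ms: float, total_before_ms: float,
    target_after_ms: float, total_after_ms: float,
) -> float:
    pct_before = 100.0 * target_before_ms / total_before_ms
    pct_after = 100.0 * target_after_ms / total_after_ms
    return pct_after - pct_before

# Example: 1.0 s of 4.0 s on the target before the word, 2.2 s of 4.0 s after.
print(dwell_time_change_score(1000, 4000, 2200, 4000))  # 30.0
```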


2020 ◽  
Vol 17 (12) ◽  
pp. 1200-1206
Author(s):  
Seo Woo Kim ◽  
Sun-Young Moon ◽  
Wu Jeong Hwang ◽  
Silvia Kyungjin Lho ◽  
Sanghoon Oh ◽  
...  

Objective: Although previous studies have reported impaired performance on the Reading the Mind in the Eyes Test (RMET), which measures complex emotion recognition abilities, in patients with schizophrenia, reports regarding individuals at clinical high risk (CHR) for psychosis have been inconsistent, mainly because of the interacting confounding effects of general cognitive abilities and age. We compared RMET performance across first-episode psychosis (FEP) patients, CHR individuals, and healthy controls (HCs) while controlling for the effects of both general cognitive abilities and age. Methods: A total of 25 FEP, 41 CHR, and 44 HC subjects matched for age participated in this study. RMET performance scores were compared across the groups using analysis of variance with sex and intelligence quotient as covariates. Exploratory Pearson's correlation analyses were performed to reveal potential relationships between RMET scores and clinical symptom severity in the FEP and CHR groups. Results: RMET performance scores were significantly lower among FEP and CHR participants than among HCs, and FEP patients and CHR subjects showed comparable scores. RMET scores were negatively correlated with Positive and Negative Syndrome Scale (PANSS) positive symptom subscale scores in the FEP patients; no significant correlation was identified between RMET scores and other clinical scale scores. Conclusion: Impaired RMET performance is present from the risk stage of psychosis and may be related to positive symptom severity in early psychosis. Longitudinal studies are needed to confirm the stability of complex emotion recognition impairments and their relationship with social functioning in early psychosis patients.
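The exploratory correlation reported above can be sketched as follows; the values are placeholders and this is not the authors' code.

```python
# Pearson's r between RMET scores and PANSS positive symptom scores (FEP group).
from scipy.stats import pearsonr

rmet_scores = [22, 19, 25, 17, 21, 24, 18, 20]       # placeholder data
panss_positive = [14, 18, 11, 21, 16, 12, 19, 15]    # placeholder data

r, p_value = pearsonr(rmet_scores, panss_positive)
print(f"r = {r:.2f}, p = {p_value:.3f}")
```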

