Simultaneous prediction of valence/arousal and emotion categories and its application in an HRC scenario

2021, Vol 12 (1), pp. 57-73
Author(s): Sebastian Handrich, Laslo Dinges, Ayoub Al-Hamadi, Philipp Werner, Frerk Saxen, ...

Abstract. We address the problem of facial expression analysis. The proposed approach predicts both basic emotion categories and valence/arousal values as a continuous measure of the emotional state. Experimental results, including cross-database evaluation on the AffectNet, Aff-Wild, and AFEW datasets, show that our approach predicts emotion categories and valence/arousal values with high accuracy and that the simultaneous learning of discrete categories and continuous values improves the prediction of both. In addition, we use our approach to measure the emotional states of users in a Human-Robot Collaboration (HRC) scenario, show how these emotional states are affected by multiple difficulties that arise for the test subjects, and examine how different feedback mechanisms counteract negative emotions users experience while interacting with a robot system.
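The simultaneous learning of discrete emotion categories and continuous valence/arousal values can be illustrated with a generic multi-task setup. The sketch below is not the authors' architecture; the feature dimension, number of emotion classes, and loss weights are assumptions for illustration (PyTorch).

```python
# Minimal multi-task sketch: one shared feature vector feeds a classification head
# (discrete emotion) and a regression head (valence/arousal). Illustrative only.
import torch
import torch.nn as nn

class MultiTaskEmotionHead(nn.Module):
    def __init__(self, feature_dim=512, num_emotions=8):
        super().__init__()
        self.classifier = nn.Linear(feature_dim, num_emotions)  # discrete categories
        self.regressor = nn.Linear(feature_dim, 2)               # valence, arousal in [-1, 1]

    def forward(self, features):
        logits = self.classifier(features)
        va = torch.tanh(self.regressor(features))
        return logits, va

def joint_loss(logits, va_pred, emotion_target, va_target, alpha=1.0, beta=1.0):
    """Weighted sum of categorical and continuous losses (weights are illustrative)."""
    ce = nn.functional.cross_entropy(logits, emotion_target)
    mse = nn.functional.mse_loss(va_pred, va_target)
    return alpha * ce + beta * mse

# Usage with random features standing in for the output of a face-image backbone:
head = MultiTaskEmotionHead()
feats = torch.randn(4, 512)
logits, va = head(feats)
loss = joint_loss(logits, va, torch.tensor([0, 1, 2, 3]), torch.zeros(4, 2))
loss.backward()
```

Sharing one backbone between both heads is what allows the categorical and continuous objectives to regularize each other, which is the effect the abstract reports.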

2017, Vol 76 (2), pp. 71-79
Author(s): Hélène Maire, Renaud Brochard, Jean-Luc Kop, Vivien Dioux, Daniel Zagar

Abstract. This study measured the effect of emotional states on lexical decision task performance and investigated which underlying components (physiological, attentional orienting, executive, lexical, and/or strategic) are affected. We did this by assessing participants’ performance on a lexical decision task, which they completed before and after an emotional state induction task. The sequence effect, usually produced when participants repeat a task, was significantly smaller in participants who had received one of the three emotion inductions (happiness, sadness, embarrassment) than in control group participants (neutral induction). Using the diffusion model (Ratcliff, 1978) to resolve the data into meaningful parameters that correspond to specific psychological components, we found that emotion induction only modulated the parameter reflecting the physiological and/or attentional orienting components, whereas the executive, lexical, and strategic components were not altered. These results suggest that emotional states have an impact on the low-level mechanisms underlying mental chronometric tasks.
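For readers unfamiliar with the Ratcliff (1978) diffusion model, the sketch below simulates single lexical-decision trials to show which psychological component each parameter is taken to capture. The parameter values are illustrative defaults, not the estimates reported in the study.

```python
# Minimal simulation sketch of the drift-diffusion model; illustrative parameters only.
import numpy as np

def simulate_trial(v=0.25, a=1.0, z=0.5, t_er=0.3, dt=0.001, sigma=1.0, rng=None):
    """One lexical-decision trial: evidence drifts from z*a toward boundary 0 or a.

    v    -- drift rate (quality of word/non-word evidence; lexical component)
    a    -- boundary separation (response caution; strategic component)
    z    -- relative starting point (response bias)
    t_er -- non-decision time (encoding and motor processes, incl. attentional orienting)
    """
    rng = rng or np.random.default_rng()
    x, t = z * a, 0.0
    while 0.0 < x < a:
        x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("word" if x >= a else "nonword"), t + t_er

rts = [simulate_trial()[1] for _ in range(1000)]
print(f"mean simulated RT: {np.mean(rts):.3f} s")
```

Fitting such a model to response-time distributions is what lets the study attribute the emotion effect specifically to the non-decision component rather than to drift rate or boundary separation.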


2021
Author(s): Natalia Albuquerque, Daniel S. Mills, Kun Guo, Anna Wilkinson, Briseida Resende

Abstract. The ability to infer emotional states and their wider consequences requires the establishment of relationships between the emotional display and subsequent actions. These abilities, together with the use of emotional information from others in social decision making, are cognitively demanding and require inferential skills that extend beyond the immediate perception of the current behaviour of another individual. They may include predictions of the significance of the emotional states being expressed. These abilities were previously believed to be exclusive to primates. In this study, we presented adult domestic dogs with a social interaction between two unfamiliar people, which could be positive, negative or neutral. After passively witnessing the actors engaging silently with each other and with the environment, dogs were given the opportunity to approach a food resource that varied in accessibility. We found that the available emotional information was more relevant than the motivation of the actors (i.e. giving something or receiving something) in predicting the dogs’ responses. Thus, dogs were able to access implicit information from the actors’ emotional states and appropriately use the affective information to make context-dependent decisions. The findings demonstrate that a non-human animal can actively acquire information from emotional expressions, infer some form of emotional state and use this functionally to make decisions.


Semiotica, 2021, Vol 0 (0)
Author(s): Amitash Ojha, Charles Forceville, Bipin Indurkhya

Abstract. Both mainstream and art comics often use various flourishes surrounding characters’ heads. These so-called “pictorial runes” (also called “emanata”) help convey the emotional states of the characters. In this paper, using (manipulated) panels from Western and Indian comic albums as well as neutral emoticons and basic shapes in different colors, we focus on the following two issues: (a) whether runes increase comics readers’ awareness of the emotional state of the character; and (b) whether a correspondence can be found between the types of runes (twirls, spirals, droplets, and spikes) and specific emotions. Our results show that runes help communicate emotion. Although no one-to-one correspondence was found between the tested runes and specific emotions, it was found that droplets and spikes indicate generic emotions, spirals indicate negative emotions, and twirls indicate confusion and dizziness.


2022, pp. 164-167
Author(s): N. A. Ofitserova

The article considers the restaurant business from the point of view of not only the entrepreneurial aspect but also the service aspect, which is fundamental. The reasons why people visit restaurants have been revealed: in addition to physical need, restaurants are an element of cognition and a way of experiencing positive emotions. The importance of the restaurant business in shaping people’s positive emotional state has been formulated. Two forms of employees’ emotional labor and the influence of emotional states on work performance have been highlighted. The role of emotional intelligence and communicative competence in customer satisfaction with a restaurant visit has been determined. The importance of developing emotional intelligence has been substantiated, and recommendations for its development have been formulated.


2021
Author(s): Talieh Seyed Tabtabae

Automatic Emotion Recognition (AER) is an emerging research area in the Human-Computer Interaction (HCI) field. As computers become more and more popular every day, the study of interaction between humans (users) and computers is attracting more attention. In order to have a more natural and friendly interface between humans and computers, it would be beneficial to give computers the ability to recognize situations the same way a human does. Equipped with an emotion recognition system, computers would be able to recognize their users' emotional state and react appropriately. In today's HCI systems, machines can recognize the speaker and the content of the speech using speech recognition and speaker identification techniques. If machines are also equipped with emotion recognition techniques, they can know "how it is said," react more appropriately, and make the interaction more natural. One of the most important human communication channels is the auditory channel, which carries speech and vocal intonation. In fact, people can perceive each other's emotional state by the way they talk. Therefore, in this work the speech signals are analyzed in order to set up an automatic system that recognizes the human emotional state. Six discrete emotional states are considered and categorized in this research: anger, happiness, fear, surprise, sadness, and disgust. A set of novel spectral features is proposed in this contribution. Two approaches are applied and the results are compared. In the first approach, all the acoustic features are extracted from consecutive frames along the speech signals, and the statistical values of these features constitute the feature vectors. A Support Vector Machine (SVM), a relatively new approach in the field of machine learning, is used to classify the emotional states. In the second approach, spectral features are extracted from non-overlapping, logarithmically spaced frequency sub-bands. In order to make use of all the extracted information, sequence-discriminant SVMs are adopted. The empirical results show that the employed techniques are very promising.
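The first approach (per-utterance statistics of frame-level features, classified with an SVM) can be sketched as follows. Generic MFCCs computed with librosa stand in for the novel spectral features proposed in this work, and the file names, labels, and SVM parameters are hypothetical.

```python
# Illustrative sketch of the frame-statistics + SVM approach; not the original feature set.
import numpy as np
import librosa
from sklearn.svm import SVC

EMOTIONS = ["anger", "happiness", "fear", "surprise", "sadness", "disgust"]

def utterance_features(wav_path):
    """Frame-level spectral features summarized by per-utterance statistics."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)            # shape (13, n_frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])  # fixed-length vector

# Hypothetical training data: (audio path, emotion label) pairs.
train_items = [("angry_01.wav", "anger"), ("happy_01.wav", "happiness")]  # ...
X = np.stack([utterance_features(path) for path, _ in train_items])
y = [EMOTIONS.index(label) for _, label in train_items]

clf = SVC(kernel="rbf", C=1.0)   # SVM classifier over the six emotional states
clf.fit(X, y)
```

The second approach differs mainly in that features come from logarithmically spaced sub-bands and the SVM operates on frame sequences rather than on pooled statistics.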


Author(s): Penny Baillie, Mark Toleman, Dickson Lukose

Interacting with intelligence in an ever-changing environment calls for exceptional performances from artificial beings. One mechanism explored to produce intuitive-like behavior in artificial intelligence applications is emotion. This chapter examines the engineering of a mechanism that synthesizes and processes an artificial agent’s internal emotional states: the Affective Space. Through use of the affective space, an agent can predict the effect certain behaviors will have on its emotional state and, in turn, decide how to behave. Furthermore, an agent can use the emotions produced from its behavior to update its beliefs about particular entities and events. This chapter explores the psychological theory used to structure the affective space, the way in which the strength of emotional states can be diminished over time, how emotions influence an agent’s perception, and the way in which an agent can migrate from one emotional state to another.
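As a rough illustration (not the chapter's actual Affective Space model), the sketch below shows two of the ideas described: diminishing the strength of emotional states over time, and choosing a behaviour by predicting its emotional effect. The emotion names, decay rate, and candidate behaviours are assumptions.

```python
# Simplified sketch of emotion decay and emotion-driven behaviour selection; illustrative only.
from dataclasses import dataclass, field

@dataclass
class AffectiveState:
    intensities: dict = field(default_factory=lambda: {"joy": 0.0, "distress": 0.0})
    decay_rate: float = 0.1   # fraction of intensity lost per time step (assumed)

    def decay(self):
        """Diminish the strength of emotional states over time."""
        for emotion in self.intensities:
            self.intensities[emotion] *= (1.0 - self.decay_rate)

    def dominant(self):
        return max(self.intensities, key=self.intensities.get)

def choose_behaviour(candidates):
    """Pick the behaviour whose predicted emotional effect is most positive."""
    def predicted_valence(effects):
        return effects.get("joy", 0.0) - effects.get("distress", 0.0)
    return max(candidates, key=lambda b: predicted_valence(candidates[b]))

agent = AffectiveState()
options = {"help_user": {"joy": 0.6}, "ignore_user": {"distress": 0.4}}
choice = choose_behaviour(options)
agent.intensities["joy"] += options[choice].get("joy", 0.0)  # emotion produced by the behaviour
agent.decay()                                                # intensity fades with time
print(choice, agent.dominant(), agent.intensities)
```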


2019, Vol 72 (4), pp. 562-567
Author(s): Volodymyr K. Likhachov, Yanina V. Shymanska, Yulia S. Savelieva, Viktoriya L. Vashchenko, Ludmyla М. Dobrovolska

Introduction: During pregnancy, physiological and psychological changes occur in the body of a healthy woman that contribute to bearing the child and prepare her for future labour and motherhood. In women who experience failure at the stage of fertilization or during pregnancy, prolonged negative emotional states lead to psycho-emotional stress. The aim of the research was to study the psycho-emotional state of women with a history of infertility whose pregnancy resulted from in vitro fertilization (IVF) and to develop methods for reducing their anxiety. Materials and methods: At the first stage, the initial psycho-emotional state of 60 women in the second trimester whose pregnancy resulted from IVF (Group I) was studied; the control group consisted of 20 healthy women with a physiological course of pregnancy (Group II). At the second stage, 10 art therapy exercises, combined with repeat questioning of the pregnant women from Group I, were conducted to improve their psycho-emotional state. Results: Women of Group I had high levels of both situational anxiety (SA) and personal anxiety (PA). The prevalent types of the psychological component of the gestational dominant were the anxious and euphoric types (58.3%). Mild or masked depression was diagnosed in one third of the examined women with a burdened gynecological history. 43 pregnant women from Group I used a method of psychocorrection – art therapy – which included colouring “antistress” pictures on perinatal topics, making paper flowers and creating a collage of dreams. Conclusions: After the art therapy course, the share of women with a high level of SA decreased from 46.5% to 7.0% and with a high level of PA from 48.8% to 32.6%, while the share of the optimal type of the psychological component of the gestational dominant increased from 25.6% to 53.5%. The number of women without depression increased from 62.8% to 93%.


2013, Vol 3 (1), pp. 31-46
Author(s): Andrew Pressey, Laura Salciuviene, Stuart Barnes

The purpose of the study is to examine the effects of emotional states on higher-order need attainment in a computer-mediated environment. Survey data were collected from 404 adult visitors to the virtual world Second Life. The findings suggest that emotional states exert significant effects on the attainment of higher-order needs (i.e. belongingness, esteem and self-actualization); the flow emotional state exerts a greater effect on attaining higher-order needs than the remaining emotional states of anxiety, confusion and apathy. Companies with a presence in Second Life will be able to make more informed decisions when directing their efforts to enhance visitors’ emotional experiences on their virtual islands.


Beverages, 2020, Vol 6 (2), pp. 27
Author(s): Samuel J. Kessler, Funan Jiang, R. Andrew Hurley

In the late 1970s, analysis of facial expressions to unveil emotional states began to grow and flourish along with new technologies and software advances. Researchers have always been able to document what consumers do, but understanding how consumers feel at a specific moment in time is an important part of the product development puzzle. Because of this, biometric testing methods have been used in numerous studies, as researchers have worked to develop a more comprehensive understanding of consumers. Despite the many articles on automated facial expression analysis (AFEA), the literature on food and beverage studies is limited. There are no standards to guide researchers in setting up materials, processing data, or conducting a study, and there are few, if any, compilations of the studies that have been performed to determine whether any methodologies work better than others or what trends have been found. Through a systematic Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) review, 38 articles were found that were relevant to the research goals. The authors identified AFEA study methods that have worked and those that have not been as successful and noted any trends of particular importance. Key takeaways include a listing of commercial AFEA software, experimental methods used within the PRISMA analysis, and a comprehensive explanation of the critical methods and practices of the studies analyzed. Key information was analyzed and compared to determine effects on the study outcomes. Through analyzing the various studies, suggestions and guidance for conducting and analyzing data from AFEA experiments are discussed.


2013, Vol 2013, pp. 1-13
Author(s): Santiago-Omar Caballero-Morales

An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of phoneme-level acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and for the emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). The emotional state of a spoken sentence is then estimated by counting the number of emotion-specific vowels found in the ASR’s output for the sentence. With this approach, an accuracy of 87–100% was achieved for the recognition of the emotional state of Mexican Spanish speech.
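The counting step of this approach can be sketched as below. The emotion-tag convention on recognized vowels (e.g. "a_ang" for an angry /a/) and the example transcription are assumptions for illustration; the HMM acoustic modelling itself is not shown.

```python
# Sketch of the decision step: majority vote over emotion-tagged vowels in the ASR output.
from collections import Counter

EMOTION_TAGS = {"ang": "anger", "hap": "happiness", "neu": "neutral", "sad": "sadness"}

def estimate_emotion(phoneme_sequence):
    counts = Counter()
    for phone in phoneme_sequence:
        if "_" in phone:                      # emotion-specific vowel, e.g. "o_sad"
            _, tag = phone.split("_", 1)
            counts[EMOTION_TAGS.get(tag, tag)] += 1
    return counts.most_common(1)[0][0] if counts else "neutral"

# Hypothetical recognizer output for one spoken sentence:
hyp = ["k", "o_ang", "m", "o_ang", "e", "s", "t", "a_neu", "s"]
print(estimate_emotion(hyp))   # -> "anger"
```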

