Emotion recognition from physiological signals and video games to detect personality traits

Author(s):  
M. Callejas-Cuervo ◽  
L.A. Martínez-Tejada ◽  
A.C. Alarcón-Aldana

This paper presents a system that identifies two values, arousal and valence, which represent the degree of stimulation and the pleasantness of a subject's emotional state, using Russell's circumplex model of affect as a reference. To identify emotions, a step-by-step structure is used: from statistical metrics of the physiological signals, the system generates the representative arousal value (direct correlation), and from the PANAS questionnaire it generates the valence value (inverse correlation), as a first approximation to emotion recognition techniques that do not rely on artificial intelligence. The system gathers information on a subject's arousal activity using the following metrics: beats per minute (BPM), heart rate variability (HRV), the number of galvanic skin response (GSR) peaks in the skin conductance response (SCR), and forearm contraction time, drawn from three physiological signals (electrocardiogram, ECG; galvanic skin response, GSR; electromyography, EMG).
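
The abstract does not give the exact mapping from metrics to scores. As a minimal sketch of what such a rule-based (non-AI) pipeline could look like, the snippet below normalizes each metric and averages them into an arousal index, and maps the PANAS subscales to a valence index; all normalization bounds, metric signs, and the PANAS mapping are illustrative assumptions, not the authors' method.

```python
import numpy as np

def normalize(value, lo, hi):
    """Min-max normalize a metric to [0, 1] given assumed bounds."""
    return float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))

def arousal_score(bpm, hrv_ms, gsr_peaks, emg_time_s):
    """Average of normalized metrics; all bounds are placeholders."""
    components = [
        normalize(bpm, 60, 120),           # heart rate tends to rise with arousal
        1.0 - normalize(hrv_ms, 20, 100),  # HRV (e.g., RMSSD) tends to fall with arousal
        normalize(gsr_peaks, 0, 10),       # SCR peaks in the analysis window
        normalize(emg_time_s, 0, 30),      # forearm contraction time (s)
    ]
    return sum(components) / len(components)

def valence_score(panas_positive, panas_negative):
    """PANAS subscales each range 10-50; map their difference to [0, 1]."""
    return normalize(panas_positive - panas_negative, -40, 40)
```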

Author(s):  
Luma Tabbaa ◽  
Ryan Searle ◽  
Saber Mirzaee Bafti ◽  
Md Moinul Hossain ◽  
Jittrapol Intarasisrisawat ◽  
...  

The paper introduces a multimodal affective dataset named VREED (VR Eyes: Emotions Dataset), in which emotions were triggered using immersive 360° video-based virtual environments (360-VEs) delivered via a virtual reality (VR) headset. Behavioural (eye tracking) and physiological signals (electrocardiogram (ECG) and galvanic skin response (GSR)) were captured, together with self-reported responses, from healthy participants (n=34) experiencing 360-VEs (n=12, 1--3 min each) selected through focus groups and a pilot trial. Statistical analysis confirmed the validity of the selected 360-VEs in eliciting the desired emotions. Preliminary machine learning analysis was carried out, demonstrating performance comparable to the state of the art reported in the affective computing literature for non-immersive modalities. VREED is among the first multimodal VR datasets for emotion recognition using behavioural and physiological signals, and it is made publicly available on Kaggle. We hope this contribution encourages other researchers to use VREED to further understand emotional responses in VR and, ultimately, to enhance the design of VR experiences in applications where emotional elicitation plays a key role, e.g., healthcare, gaming, and education.


Author(s):  
G. Shivakumar ◽  
P.A. Vijaya

It is essential to distinguish between a feigned and a genuine emotion in certain applications. To facilitate this, the number of features is increased by incorporating physiological signals, since physiological changes in the human body cannot be faked. Human emotional behavior changes the heart rate, skin resistance, finger temperature, EEG, etc.; these physiological parameters can be measured and included in the final feature vector. The network is trained with all the feature points as inputs, using a radial basis activation function at the hidden layer and a linear activation function at the output layer. The two physiological parameters that are predominant in determining a person's emotion, galvanic skin response (GSR) and fingertip temperature (FTT), are considered in this chapter. The measurements are transmitted to a LabVIEW add-on card for further data processing and analysis. The results obtained are close to reality, with a good measure of accuracy.
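
As a minimal sketch of the network architecture described (Gaussian radial basis hidden units feeding a linear output layer), the NumPy implementation below solves the output weights by least squares; the centers, width, and training procedure are illustrative assumptions, not the chapter's exact method.

```python
import numpy as np

class RBFNet:
    """Minimal RBF network: Gaussian hidden units, linear output layer."""

    def __init__(self, centers, sigma=1.0):
        self.centers = np.asarray(centers)  # one center per hidden unit
        self.sigma = sigma
        self.weights = None

    def _hidden(self, X):
        # Gaussian radial basis activations, one column per center
        d = np.linalg.norm(X[:, None, :] - self.centers[None, :, :], axis=2)
        return np.exp(-(d ** 2) / (2 * self.sigma ** 2))

    def fit(self, X, y):
        # Linear output layer solved in closed form by least squares
        H = self._hidden(np.asarray(X))
        self.weights, *_ = np.linalg.lstsq(H, np.asarray(y), rcond=None)
        return self

    def predict(self, X):
        return self._hidden(np.asarray(X)) @ self.weights
```

For two-dimensional feature vectors such as (GSR, FTT), the centers could be picked by k-means over the training features, with one output unit per emotion class (one-hot targets).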


Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 770
Author(s):  
Alessandro Tonacci ◽  
Lucia Billeci ◽  
Irene Di Mambro ◽  
Roberto Marangoni ◽  
Chiara Sanmartin ◽  
...  

Wearable sensors are nowadays widely employed to assess physiological signals from the human body without being obtrusive. One of the most intriguing fields of application for such systems is the assessment of physiological responses to sensory stimuli. In this specific regard, the main psychophysiological drivers of olfactory-related pleasantness are not yet known: the current literature has demonstrated a relationship between odor familiarity and odor valence, but has not clarified the direction of the effect between the two domains. Here, we enrolled a group of university students who underwent three months of olfactory training. By analyzing electrocardiogram (ECG) and galvanic skin response (GSR) signals at the beginning and at the end of the training period, we observed different autonomic responses, with a stronger parasympathetically mediated response at the end of the period than at the first evaluation. This suggests that increased familiarity with the proposed stimuli may lead to a higher tendency towards relaxation. These results could suggest potential applications in other domains, including personalized treatments based on odors and foods in neuropsychiatric and eating disorders.
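
The abstract does not name the specific autonomic indices used. As a hedged illustration, two standard parasympathetic (vagal) markers derived from ECG RR intervals are sketched below; the 4 Hz resampling rate and Welch settings are conventional choices, not necessarily the authors'.

```python
import numpy as np
from scipy.signal import welch

def rmssd(rr_ms):
    """RMSSD (ms): root mean square of successive RR-interval differences,
    a standard vagally mediated HRV index."""
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def hf_power(rr_ms, fs=4.0):
    """Power in the HF band (0.15-0.4 Hz) of the RR tachogram, another
    vagal index; RR intervals are resampled to an even grid first."""
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                  # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)     # evenly spaced time axis
    tach = np.interp(grid, t, rr)               # resampled tachogram
    f, pxx = welch(tach - tach.mean(), fs=fs, nperseg=min(256, len(grid)))
    band = (f >= 0.15) & (f <= 0.4)
    return float(np.trapz(pxx[band], f[band]))
```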


Emotion recognition is attracting considerable interest among researchers. Emotions can be detected from facial expressions, speech, gestures, posture, and physiological signals. Physiological signals are a plausible mechanism for recognizing emotion in human-computer interaction. The objective of this paper is to survey the recognition of emotions from physiological signals. Various emotion elicitation protocols, feature extraction techniques, and classification methods that aim at recognizing emotions from physiological signals are discussed. The wrist pulse signal is also discussed, to fill the gaps left by other physiological signals in emotion detection. Addressing non-basic as well as basic human emotions, together with the human-computer interface, will make such systems robust.


Author(s):  
Rama Chaudhary ◽  
Ram Avtar Jaswal

In modern times, human-machine interaction technology has developed considerably for recognizing human emotional states from physiological signals. Emotional states can be recognized from facial expressions, but this does not always give accurate results. For example, a sad facial expression may also reflect frustration, irritation, or anger, so the particular emotion cannot be determined from the expression alone. Emotion recognition using the electroencephalogram (EEG) and electrocardiogram (ECG) has therefore gained attention, because these are based on brain and heart signals, respectively. After analyzing all these factors, we chose to recognize emotional states from EEG using the DEAP dataset, so that better accuracy can be achieved.
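
For reference, the preprocessed Python release of DEAP stores one pickled file per subject (s01.dat to s32.dat), each holding a 'data' array of 40 trials x 40 channels x 8064 samples (128 Hz; the first 32 channels are EEG) and a 'labels' array of 40 x 4 self-ratings (valence, arousal, dominance, liking, each 1-9). A minimal loading sketch follows; the midpoint binarization at 5 is a common convention in the literature, not part of the dataset itself.

```python
import pickle
import numpy as np

# Load one subject from the preprocessed DEAP release.
with open("s01.dat", "rb") as f:
    subject = pickle.load(f, encoding="latin1")

eeg = subject["data"][:, :32, :]        # keep the 32 EEG channels only
valence = subject["labels"][:, 0]       # self-rated valence, 1-9
arousal = subject["labels"][:, 1]       # self-rated arousal, 1-9

# Common binarization for classification: split the ratings at 5.
y_valence = (valence > 5).astype(int)
y_arousal = (arousal > 5).astype(int)
print(eeg.shape, y_valence.sum(), y_arousal.sum())
```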


2013 ◽  
Vol 380-384 ◽  
pp. 3750-3753 ◽  
Author(s):  
Chun Yan Nie ◽  
Rui Li ◽  
Ju Wang

Physiological signals change with human emotions, and emotional fluctuations are in turn reflected in variations of the features of the body's physiological signals. Physiological signals are non-linear, and methods from nonlinear dynamics and biomedical engineering based on chaos theory provide a new way of studying the parameters of these complex signals, which can hardly be described by classical theory. This paper presents a physiological emotion signal recognition system based on chaotic characteristics, and then describes some current applications of the chaotic characteristics of multiple physiological signals to emotion recognition.
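
The abstract does not specify which chaotic features are extracted. Sample entropy is one widely used nonlinear complexity measure for physiological signals; the sketch below uses the conventional defaults m=2 and r = 0.2 x standard deviation, which are generic choices rather than the paper's settings.

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy, a chaos-theory-inspired complexity measure:
    -ln(A/B), where B counts template matches of length m and A counts
    matches of length m+1 (Chebyshev distance < r, self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    n = len(x)

    def count_matches(length):
        # Embed the series into delay vectors of the given length
        emb = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(emb) - 1):
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            count += int(np.sum(d < r))
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```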


Sensors ◽  
2020 ◽  
Vol 20 (14) ◽  
pp. 4037
Author(s):  
Aasim Raheel ◽  
Muhammad Majid ◽  
Majdi Alnowami ◽  
Syed Muhammad Anwar

Emotion recognition has increased the potential of affective computing by obtaining instant feedback from users and thereby enabling a better understanding of their behavior. Physiological sensors have been used to recognize human emotions in response to audio and video content, which engage one (auditory) and two (auditory and visual) human senses, respectively. In this study, human emotions were recognized using physiological signals observed in response to tactile-enhanced multimedia content that engages three human senses (tactile, visual, and auditory). The aim was to give users an enhanced real-world sensation while engaging with multimedia content. To this end, four videos were selected and synchronized with an electric fan and a heater, based on timestamps within the scenes, to generate tactile-enhanced content with cold and hot air effects, respectively. Physiological signals, i.e., electroencephalography (EEG), photoplethysmography (PPG), and galvanic skin response (GSR), were recorded using commercially available sensors while participants experienced these tactile-enhanced videos. The acquired signals are pre-processed with a Savitzky-Golay smoothing filter. Frequency-domain features (rational asymmetry, differential asymmetry, and correlation) are extracted from EEG; time-domain features (variance, entropy, kurtosis, and skewness) from GSR; and heart rate and heart rate variability from PPG. A k-nearest-neighbor classifier is applied to the extracted features to classify four emotions (happy, relaxed, angry, and sad). Our experimental results show that, among individual modalities, PPG-based features give the highest accuracy of 78.57%, compared with EEG- and GSR-based features. The fusion of EEG, GSR, and PPG features further improved the classification accuracy to 79.76% (for four emotions) when interacting with tactile-enhanced multimedia.
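
A minimal sketch of the GSR branch of this pipeline (Savitzky-Golay smoothing, the listed time-domain features, then a k-NN classifier) is shown below; the filter window, polynomial order, histogram binning, and k are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.stats import entropy, kurtosis, skew
from sklearn.neighbors import KNeighborsClassifier

def gsr_features(raw_gsr):
    """Smooth a raw GSR trace, then compute the four time-domain
    features named in the abstract: variance, entropy, kurtosis, skewness."""
    smoothed = savgol_filter(raw_gsr, window_length=51, polyorder=3)
    hist, _ = np.histogram(smoothed, bins=32, density=True)
    return [
        np.var(smoothed),
        entropy(hist + 1e-12),   # Shannon entropy of the amplitude distribution
        kurtosis(smoothed),
        skew(smoothed),
    ]

# Usage sketch: one feature row per trial, labels in {happy, relaxed, angry, sad}
# X_train = np.array([gsr_features(trial) for trial in train_trials])
# knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
# accuracy = knn.score(X_test, y_test)
```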


2020 ◽  
Vol 30 (04) ◽  
pp. 2050013 ◽  
Author(s):  
Mikel Val-Calvo ◽  
José Ramón Álvarez-Sánchez ◽  
Jose Manuel Ferrández-Vicente ◽  
Alejandro Díaz-Morcillo ◽  
Eduardo Fernández-Jover

Emotion estimation systems based on brain and physiological signals such as electroencephalography (EEG), blood-volume pressure (BVP), and galvanic skin response (GSR) have gained special attention in recent years due to the possibilities they offer. The field of human–robot interaction (HRI) could benefit from a broadened understanding of brain and physiological emotion encoding, together with the use of lightweight software and cheap wearable devices, and thus improve the ability of robots to fully engage with users' emotional reactions. In this paper, a previously developed methodology for real-time emotion estimation aimed at HRI is tested under realistic circumstances, using a self-generated database of dynamically evoked emotions. Other state-of-the-art real-time approaches address emotion estimation using constant stimuli to facilitate the analysis of the evoked responses, which remains far from real scenarios, where emotions are dynamically evoked. The proposed approach studies the feasibility of the previously developed emotion estimation methodology under an experimental paradigm that imitates a more realistic scenario, using a dramatic film to dynamically evoke emotions. The methodology proved to perform under real-time constraints while maintaining high accuracy of emotion estimation on the self-produced multi-signal database of dynamically evoked emotions.

