Impact of Physiological Signals Acquisition in the Emotional Support Provided in Learning Scenarios

Sensors ◽  
2019 ◽  
Vol 19 (20) ◽  
pp. 4520 ◽  
Author(s):  
Uria-Rivas ◽  
Rodriguez-Sanchez ◽  
Santos ◽  
Vaquero ◽  
Boticario

Physiological sensors can be used with affective computing to detect changes in the emotional state of users. This has lately been applied in the educational domain, with the aim of better supporting learners during the learning process. For this purpose, we have developed the AICARP (Ambient Intelligence Context-aware Affective Recommender Platform) infrastructure, which detects changes in the emotional state of the user and provides personalized multisensorial support to help manage that state by taking advantage of ambient intelligence features. We have developed a third version of this infrastructure, AICARP.V3, which addresses several problems detected in the data acquisition stage of the second version (i.e., the intrusiveness of the pulse sensor, the poor resolution and low signal-to-noise ratio of the galvanic skin response sensor, and the slow response time of the temperature sensor) and extends the capabilities to integrate new actuators. This improved version incorporates a new acquisition platform (shield) called PhyAS (Physiological Acquisition Shield), which reduces the number of control units to one, and supports both gathering physiological signals with better precision and delivering multisensory feedback with more flexibility, by means of new actuators that can be added or removed on top of that single shield. The improvements in the quality of the acquired signals allow better recognition of emotional states. AICARP.V3 therefore provides more accurate personalized emotional support to the user, based on a rule-based approach that triggers multisensorial feedback when necessary. This represents progress on an open problem: developing systems that perform as effectively as a human expert in a complex task such as the recognition of emotional states.
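As an illustration of the rule-based approach mentioned in this abstract, the following is a minimal Python sketch of how a physiological trigger for multisensorial feedback might look. The thresholds, feature names, and actuator interface are hypothetical; the abstract does not disclose AICARP.V3's actual rules.

```python
# Hypothetical sketch of a rule-based multisensorial feedback trigger.
# Thresholds and signal features are illustrative assumptions, not the
# actual AICARP.V3 rule set.

def detect_stress(gsr_delta, heart_rate, skin_temp_slope):
    """Flag a stress-like state from simple physiological heuristics."""
    rules = [
        gsr_delta > 0.5,         # sudden rise in galvanic skin response
        heart_rate > 100,        # elevated pulse (beats per minute)
        skin_temp_slope < -0.1,  # dropping peripheral skin temperature
    ]
    # Require at least two agreeing signals before intervening.
    return sum(rules) >= 2

def provide_feedback(actuators):
    """Trigger multisensorial support on the available actuators
    (e.g., light or sound modules mounted on the PhyAS shield)."""
    for actuator in actuators:
        actuator.activate()  # hypothetical actuator interface
```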

Sensors ◽  
2020 ◽  
Vol 20 (14) ◽  
pp. 4037 ◽  
Author(s):  
Aasim Raheel ◽  
Muhammad Majid ◽  
Majdi Alnowami ◽  
Syed Muhammad Anwar

Emotion recognition has increased the potential of affective computing by providing instant feedback from users and thereby a better understanding of their behavior. Physiological sensors have been used to recognize human emotions in response to audio and video content that engages single (auditory) and multiple (auditory and visual) human senses, respectively. In this study, human emotions were recognized using physiological signals observed in response to tactile-enhanced multimedia content that engages three human senses (tactile, visual, and auditory). The aim was to give users an enhanced real-world sensation while engaging with multimedia content. To this end, four videos were selected and synchronized with an electric fan and a heater, based on timestamps within the scenes, to generate tactile-enhanced content with cold and hot air effects, respectively. Physiological signals, i.e., electroencephalography (EEG), photoplethysmography (PPG), and galvanic skin response (GSR), were recorded using commercially available sensors while participants experienced these tactile-enhanced videos. The precision of the acquired physiological signals (including EEG, PPG, and GSR) was enhanced by pre-processing with a Savitzky-Golay smoothing filter. Frequency-domain features (rational asymmetry, differential asymmetry, and correlation) were extracted from EEG, time-domain features (variance, entropy, kurtosis, and skewness) from GSR, and heart rate and heart rate variability from PPG data. A K-nearest-neighbor classifier was applied to the extracted features to classify four emotions (happy, relaxed, angry, and sad). Our experimental results show that, among individual modalities, PPG-based features give the highest accuracy of 78.57% compared with EEG- and GSR-based features. The fusion of EEG, GSR, and PPG features further improved the classification accuracy to 79.76% (for four emotions) when interacting with tactile-enhanced multimedia.
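The smoothing and classification steps described here can be sketched as follows; the filter window, histogram binning, and neighbor count are assumptions rather than the authors' exact settings.

```python
# Sketch of the described pipeline: Savitzky-Golay smoothing, simple
# time-domain features from GSR, and a K-nearest-neighbor classifier.
import numpy as np
from scipy.signal import savgol_filter
from scipy.stats import entropy, kurtosis, skew
from sklearn.neighbors import KNeighborsClassifier

def gsr_features(signal):
    # Smooth the raw GSR trace (window/order values are assumptions).
    smoothed = savgol_filter(signal, window_length=51, polyorder=3)
    # Histogram-based distribution for the entropy feature.
    hist, _ = np.histogram(smoothed, bins=32, density=True)
    return [
        np.var(smoothed),       # variance
        entropy(hist + 1e-12),  # distributional entropy
        kurtosis(smoothed),     # tailedness
        skew(smoothed),         # asymmetry
    ]

# X_train/y_train: per-trial feature vectors and emotion labels
# (happy, relaxed, angry, sad), assembled elsewhere.
# knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
# predictions = knn.predict(X_test)
```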


Author(s):  
Luma Tabbaa ◽  
Ryan Searle ◽  
Saber Mirzaee Bafti ◽  
Md Moinul Hossain ◽  
Jittrapol Intarasisrisawat ◽  
...  

The paper introduces a multimodal affective dataset named VREED (VR Eyes: Emotions Dataset) in which emotions were triggered using immersive 360° Video-Based Virtual Environments (360-VEs) delivered via a Virtual Reality (VR) headset. Behavioural (eye tracking) and physiological signals (electrocardiogram (ECG) and galvanic skin response (GSR)) were captured, together with self-reported responses, from healthy participants (n=34) experiencing 360-VEs (n=12, 1–3 min each) selected through focus groups and a pilot trial. Statistical analysis confirmed the validity of the selected 360-VEs in eliciting the desired emotions. Preliminary machine learning analysis was carried out, demonstrating performance comparable to the state of the art reported in the affective computing literature for non-immersive modalities. VREED is among the first multimodal VR datasets for emotion recognition using behavioural and physiological signals. VREED is made publicly available on Kaggle. We hope that this contribution encourages other researchers to use VREED to further understand emotional responses in VR and ultimately enhance the design of VR experiences in applications where emotional elicitation plays a key role, e.g., healthcare, gaming, and education.


2019 ◽  
Vol 2 ◽  
pp. 1-8 ◽  
Author(s):  
Kalliopi Kyriakou ◽  
Bernd Resch

Abstract. Over the last years, we have witnessed an increasing interest in urban health research using physiological sensors. There is a rich repertoire of methods for stress detection using various physiological signals and algorithms. However, most studies focus mainly on the analysis of the physiological signals and disregard the spatial analysis of the extracted geo-located emotions. Methodologically, hotspot maps created through point density analysis dominate in previous studies, but this method may lead to inaccurate or misleading detection of high-intensity stress clusters. This paper proposes a methodology for the spatial analysis of moments of stress (MOS). In the first step, MOS are identified through a rule-based algorithm analysing galvanic skin response and skin temperature measured by low-cost wearable physiological sensors. For the spatial analysis, we introduce a MOS ratio for the geo-located detected MOS. This ratio normalises the MOS detected in nearby areas over all the available records for the area. The MOS ratio is then fed into a hot spot analysis to identify hot and cold spots. To validate our methodology, we carried out two real-world field studies evaluating the accuracy of our approach. We show that the proposed approach is able to identify spatial patterns in urban areas that correspond to self-reported stress.
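A minimal sketch of the MOS-ratio idea on a regular grid follows; the cell size, column names, and neighbourhood definition are assumptions, as the abstract does not specify them.

```python
# Sketch of the MOS ratio: for each grid cell, normalise the number of
# detected moments of stress (MOS) by all records falling in that cell.
import numpy as np
import pandas as pd

def mos_ratio(df, cell_size=0.001):
    """df has columns lon, lat, is_mos (1 if a MOS was detected there).
    cell_size is in degrees (~100 m in latitude); an assumption."""
    df = df.copy()
    df["cell_x"] = np.floor(df["lon"] / cell_size)
    df["cell_y"] = np.floor(df["lat"] / cell_size)
    grouped = df.groupby(["cell_x", "cell_y"])["is_mos"]
    # Ratio of stress moments to all measurements per cell.
    return (grouped.sum() / grouped.count()).rename("mos_ratio")

# The per-cell ratios would then be fed into a hot spot statistic such
# as Getis-Ord Gi* (available, e.g., in the PySAL/esda ecosystem).
```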


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3760 ◽  
Author(s):  
Saad Awadh Alanazi ◽  
Madallah Alruwaili ◽  
Fahad Ahmad ◽  
Alaa Alaerjan ◽  
Nasser Alshammari

The theory of modern organizations considers emotional intelligence to be the metric for tools that enable organizations to create a competitive vision. It also helps corporate leaders enthusiastically adhere to the vision and energize organizational stakeholders to accomplish it. In this study, a one-dimensional convolutional neural network classification model is first employed to interpret and evaluate shifts in emotion over time, by categorizing emotional states that occur at particular moments during mutual interaction using physiological signals. The self-organizing map technique is then implemented to cluster overall organizational emotions to represent organizational competitiveness. The analysis of variance test results indicate no significant difference in age or body mass index for participants exhibiting different emotions. However, a significant mean difference was observed for the blood volume pulse, galvanic skin response, skin temperature, valence, and arousal values, indicating the effectiveness of the chosen physiological sensors and their measures in analyzing emotions for organizational competitiveness. We achieved 99.8% classification accuracy for emotions using the proposed technique. The study precisely identifies the emotions and finds a connection between emotional intelligence and organizational competitiveness (i.e., a positive relationship with employees augments organizational competitiveness).
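A minimal sketch of a one-dimensional CNN classifier of the kind described, written in Keras; layer sizes, window length, and class count are assumptions, since the abstract does not specify the architecture.

```python
# Minimal 1D-CNN sketch for classifying emotional states from windows
# of physiological signals (e.g., blood volume pulse, GSR, skin
# temperature as three input channels). All hyperparameters are
# illustrative assumptions.
import tensorflow as tf

def build_model(window_len=256, n_channels=3, n_classes=4):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window_len, n_channels)),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```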


2013 ◽  
Vol 2013 ◽  
pp. 1-13 ◽  
Author(s):  
Min-Ki Kim ◽  
Miyoung Kim ◽  
Eunmi Oh ◽  
Sung-Phil Kim

A growing body of affective computing research has recently aimed to develop computer systems that can recognize the emotional state of a human user in order to establish affective human-computer interactions. Various measures have been used to estimate emotional states, including self-report, startle response, behavioral response, autonomic measurement, and neurophysiological measurement. Among them, inferring emotional states from electroencephalography (EEG) has received considerable attention, as EEG can directly reflect emotional states with relatively low cost and simplicity. Yet EEG-based emotional state estimation requires well-designed computational methods to extract information from complex and noisy multichannel EEG data. In this paper, we review the computational methods that have been developed to derive EEG indices of emotion, to extract emotion-related features, and to classify EEG signals into one of many emotional states. We also propose using sequential Bayesian inference to estimate the continuous emotional state in real time. We present current challenges for building an EEG-based emotion recognition system and suggest some future directions.
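The sequential Bayesian idea can be illustrated with a linear-Gaussian (Kalman-style) update, one possible instantiation; the model and noise values below are assumptions, not taken from the review.

```python
# Toy sequential Bayesian update for tracking a continuous emotional
# state (e.g., valence) from a noisy scalar EEG-derived feature.
def kalman_step(mean, var, observation, process_var=0.01, obs_var=0.5):
    # Predict: the latent state drifts, adding process noise.
    var += process_var
    # Update: weight the new observation by its relative precision.
    gain = var / (var + obs_var)
    mean += gain * (observation - mean)
    var *= (1 - gain)
    return mean, var

mean, var = 0.0, 1.0  # prior belief over valence
for feature in [0.2, 0.5, 0.4, 0.7]:  # streaming EEG-derived features
    mean, var = kalman_step(mean, var, feature)
```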


2014 ◽  
Vol 25 (4) ◽  
pp. 279-287 ◽  
Author(s):  
Stefan Hey ◽  
Panagiota Anastasopoulou ◽  
André Bideaux ◽  
Wilhelm Stork

Ambulatory assessment of emotional states, as well as of psychophysiological, cognitive, and behavioral reactions, constitutes an approach that is increasingly being used in psychological research. Due to new developments in the field of information and communication technologies and the improved application of mobile physiological sensors, various new systems have been introduced. Experience sampling methods make it possible to assess dynamic changes in subjective evaluations in real time, and new sensor technologies permit the measurement of physiological responses. In addition, new technologies facilitate the interactive assessment of subjective, physiological, and behavioral data in real time. Here, we describe these recent developments from the perspective of engineering science and discuss potential applications in the field of neuropsychology.


2017 ◽  
Vol 76 (2) ◽  
pp. 71-79 ◽  
Author(s):  
Hélène Maire ◽  
Renaud Brochard ◽  
Jean-Luc Kop ◽  
Vivien Dioux ◽  
Daniel Zagar

Abstract. This study measured the effect of emotional states on lexical decision task performance and investigated which underlying components (physiological, attentional orienting, executive, lexical, and/or strategic) are affected. We did this by assessing participants’ performance on a lexical decision task, which they completed before and after an emotional state induction task. The sequence effect, usually produced when participants repeat a task, was significantly smaller in participants who had received one of the three emotion inductions (happiness, sadness, embarrassment) than in control group participants (neutral induction). Using the diffusion model (Ratcliff, 1978) to resolve the data into meaningful parameters that correspond to specific psychological components, we found that emotion induction only modulated the parameter reflecting the physiological and/or attentional orienting components, whereas the executive, lexical, and strategic components were not altered. These results suggest that emotional states have an impact on the low-level mechanisms underlying mental chronometric tasks.
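To make concrete how the diffusion model resolves response data into components, here is a minimal simulation sketch; the parameter values are illustrative, not the authors' fitted estimates.

```python
# Sketch of Ratcliff's diffusion model: evidence accumulates with drift
# rate v and Gaussian noise until it hits one of two boundaries (0 or a),
# yielding a binary decision and its latency.
import random

def diffusion_trial(v=0.3, a=1.0, z=0.5, dt=0.001, s=1.0):
    """v: drift rate, a: boundary separation, z: relative start point,
    dt: step size, s: diffusion coefficient (illustrative values)."""
    x, t = z * a, 0.0  # starting evidence and elapsed time
    while 0.0 < x < a:
        # Euler-Maruyama step of the diffusion process.
        x += v * dt + s * (dt ** 0.5) * random.gauss(0, 1)
        t += dt
    return ("word" if x >= a else "nonword"), t

decision, latency = diffusion_trial()
```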


2021 ◽  
Author(s):  
Natalia Albuquerque ◽  
Daniel S. Mills ◽  
Kun Guo ◽  
Anna Wilkinson ◽  
Briseida Resende

Abstract. The ability to infer emotional states and their wider consequences requires the establishment of relationships between the emotional display and subsequent actions. These abilities, together with the use of emotional information from others in social decision making, are cognitively demanding and require inferential skills that extend beyond the immediate perception of the current behaviour of another individual. They may include predictions of the significance of the emotional states being expressed. These abilities were previously believed to be exclusive to primates. In this study, we presented adult domestic dogs with a social interaction between two unfamiliar people, which could be positive, negative or neutral. After passively witnessing the actors engaging silently with each other and with the environment, dogs were given the opportunity to approach a food resource that varied in accessibility. We found that the available emotional information was more relevant than the motivation of the actors (i.e. giving something or receiving something) in predicting the dogs’ responses. Thus, dogs were able to access implicit information from the actors’ emotional states and appropriately use the affective information to make context-dependent decisions. The findings demonstrate that a non-human animal can actively acquire information from emotional expressions, infer some form of emotional state and use this functionally to make decisions.


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2381 ◽  
Author(s):  
Jaewon Lee ◽  
Hyeonjeong Lee ◽  
Miyoung Shin

Mental stress can lead to traffic accidents by reducing a driver’s concentration or increasing fatigue while driving. In recent years, demand has increased for methods to detect drivers’ stress in advance and prevent dangerous situations. Thus, we propose a novel method for detecting driving stress using nonlinear representations of short-term (30 s or less) physiological signals with multimodal convolutional neural networks (CNNs). Specifically, from hand/foot galvanic skin response (HGSR, FGSR) and heart rate (HR) short-term input signals, we first generate corresponding two-dimensional nonlinear representations called continuous recurrence plots (Cont-RPs). Second, from the Cont-RPs, we use multimodal CNNs to automatically extract representative FGSR, HGSR, and HR features that can effectively differentiate between stressed and relaxed states. Lastly, we concatenate the three extracted features into one integrated representation vector, which is fed to a fully connected layer to perform classification. For the evaluation, we use a public stress dataset collected from actual driving environments. Experimental results show that the proposed method achieves superior performance for 30-s signals, with an overall accuracy of 95.67%, an improvement of approximately 2.5–3% over previous works. Additionally, for 10-s signals, the proposed method achieves 92.33% classification accuracy, which is similar to or better than the performance of other methods using long-term signals (over 100 s).
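A minimal sketch of how a continuous recurrence plot can be built from a 1D physiological signal follows; the embedding dimension and delay are assumptions, as the abstract does not give the authors' settings.

```python
# Sketch of a continuous recurrence plot (Cont-RP): instead of
# thresholding the pairwise distance matrix as in a classic recurrence
# plot, the distances themselves form a 2D image a CNN can consume.
import numpy as np

def continuous_recurrence_plot(signal, dim=3, delay=4):
    # Time-delay embedding of the 1D signal into dim-dimensional states.
    n = len(signal) - (dim - 1) * delay
    embedded = np.stack(
        [signal[i * delay : i * delay + n] for i in range(dim)], axis=1
    )
    # Pairwise Euclidean distances, kept continuous (no binarising
    # threshold), giving an (n, n) image.
    diff = embedded[:, None, :] - embedded[None, :, :]
    return np.linalg.norm(diff, axis=-1)
```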


Semiotica ◽  
2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Amitash Ojha ◽  
Charles Forceville ◽  
Bipin Indurkhya

Abstract. Both mainstream and art comics often use various flourishes surrounding characters’ heads. These so-called “pictorial runes” (also called “emanata”) help convey the emotional states of the characters. In this paper, using (manipulated) panels from Western and Indian comic albums as well as neutral emoticons and basic shapes in different colors, we focus on the following two issues: (a) whether runes increase readers’ awareness of the emotional state of a character; and (b) whether a correspondence can be found between the types of runes (twirls, spirals, droplets, and spikes) and specific emotions. Our results show that runes help communicate emotion. Although no one-to-one correspondence was found between the tested runes and specific emotions, we found that droplets and spikes indicate generic emotions, spirals indicate negative emotions, and twirls indicate confusion and dizziness.

