arousal and valence
Recently Published Documents


TOTAL DOCUMENTS

102
(FIVE YEARS 34)

H-INDEX

19
(FIVE YEARS 2)

Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7466
Author(s):  
Jachin Edward Pousson ◽  
Aleksandras Voicikas ◽  
Valdis Bernhofs ◽  
Evaldas Pipinis ◽  
Lana Burmistrova ◽  
...  

Research on the neural correlates of intentional emotion communication by music performers is still limited. In this study, we evaluated EEG patterns recorded from musicians who were instructed to perform a simple piano score while manipulating their manner of play to express specific contrasting emotions, and to self-rate the emotion they conveyed on scales of arousal and valence. In the emotional playing task, participants were instructed to improvise variations in a manner that communicates the targeted emotion. In contrast, in the neutral playing task, participants were asked to play the same piece precisely as written, providing control data for the general patterns of motor and sensory activation during playing. Spectral analysis of the signal was applied as an initial step to connect the findings to the wider field of music-emotion research. The experimental contrast of emotional vs. neutral playing was employed to probe brain activity patterns differentially involved in distinct emotional states. The two tasks differed considerably in the arousal and valence levels of the emotion intended to be conveyed, and EEG activity differences were observed between distressed/excited and neutral/depressed/relaxed playing.
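The initial spectral-analysis step described above amounts to comparing band-limited EEG power between playing conditions. A minimal sketch of that kind of computation (hypothetical data and parameters, not the authors' actual pipeline) uses an FFT periodogram:

```python
import numpy as np

def band_power(eeg, fs, band):
    """Mean periodogram power of a 1-D EEG segment within [low, high) Hz."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

# Hypothetical contrast: alpha-band power, emotional vs. neutral playing
fs = 250  # assumed sampling rate
rng = np.random.default_rng(0)
emotional, neutral = rng.standard_normal(fs * 10), rng.standard_normal(fs * 10)
alpha_diff = band_power(emotional, fs, (8, 13)) - band_power(neutral, fs, (8, 13))
```

In practice, Welch averaging over windowed segments is usually preferred over a raw periodogram for noisy EEG data.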


2021 ◽  
Author(s):  
Vasileios Skaramagkas ◽  
Emmanouil Ktistakis ◽  
Dimitris Manousos ◽  
Nikolaos S. Tachos ◽  
Eleni Kazantzaki ◽  
...  

2021 ◽  
Vol 7 (2) ◽  
pp. 767-770
Author(s):  
Himanshu Kumar ◽  
Nagarajan Ganapathy ◽  
Subha D. Puthankattil ◽  
Ramakrishnan Swaminathan

Abstract Electroencephalography (EEG)-based emotion recognition is a widely preferred technique due to its noninvasiveness. Frontal-region EEG signals, in particular, have been associated with emotional processing, and feature-reduction-based, optimized machine learning methods can improve their automated analysis. In this work, an attempt is made to classify emotional states using entropy-based features and a Bayesian-optimized random forest. For this, EEG signals from the prefrontal and frontal regions (Fp1, Fp2, Fz, F3, and F4) are obtained from an online public database. The signals are decomposed into five frequency bands, namely delta (1-4 Hz), theta (4-8 Hz), alpha (8-13 Hz), beta (14-30 Hz), and gamma (30-45 Hz). Three entropy features, namely Dispersion Entropy (DE), Sample Entropy (SE), and Permutation Entropy (PE), are extracted and dimensionally reduced using Principal Component Analysis (PCA). The reduced features are then fed to the Bayesian-optimized random forest for classification. The results show that DE in the gamma band and SE in the alpha band exhibit a statistically significant (p < 0.05) difference for classifying arousal and valence emotional states. The features selected by PCA yield an F-measure of 73.24% for arousal and 46.98% for valence; combining all features yields a higher F-measure of 48.13% for valence. The proposed method is capable of handling multicomponent variations of frontal-region EEG signals, and the combination of selected features could in particular be useful to characterize arousal and valence emotional states.
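Of the three entropy features, permutation entropy is the simplest to sketch. A minimal NumPy implementation of the standard Bandt-Pompe definition (parameters are illustrative, not necessarily those used in the paper):

```python
import math
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (0 = fully regular, 1 = random)."""
    n = len(x) - (order - 1) * delay
    # Ordinal pattern (rank order) of each embedded vector
    patterns = np.array([np.argsort(x[i:i + order * delay:delay]) for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)) / math.log2(math.factorial(order)))
```

In a pipeline like the one described above, such band-wise entropy features would then be stacked per channel, reduced with PCA, and passed to a classifier such as the Bayesian-optimized random forest.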


Author(s):  
Eszter Ferentzi ◽  
Luca Vig ◽  
Mats Julin Lindkjølen ◽  
Markus Erling Lien ◽  
Ferenc Köteles

Abstract Our aim was to conceptually replicate the findings of previous empirical studies showing that people with higher cardiac interoceptive accuracy experience more intense emotions. In addition to completing the mental heartbeat tracking task of Schandry, Hungarian (n = 46, 76.0% female, mean age 22.28 ± 2.228) and Norwegian (n = 50, 60.0% female, mean age 24.66 ± 3.048) participants rated the arousal and valence evoked by positive, neutral, and negative pictures. Multivariate repeated-measures analysis of variance (applying both frequentist and Bayesian approaches) did not reveal any connection between heartbeat perception scores and the subjective ratings (i.e., arousal and valence) of the pictures in either group. The lack of the expected association between cardioceptive accuracy and arousal might partly be explained by methodological differences between previous studies and this one; for example, we did not split or preselect the sample based on performance on the Schandry task, and we applied a relatively strict instruction (i.e., participants were encouraged to count only felt heartbeats and to report zero if no sensations were detected).
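The heartbeat perception score from the Schandry task is conventionally computed per counting interval as 1 - |recorded - counted| / recorded, averaged across intervals. A minimal sketch (illustrative values only):

```python
def heartbeat_perception_score(recorded, counted):
    """Schandry-style interoceptive accuracy, averaged over counting intervals.

    1.0 = perfect counting; lower values indicate larger counting errors.
    """
    scores = [1 - abs(r - c) / r for r, c in zip(recorded, counted)]
    return sum(scores) / len(scores)

# Hypothetical participant: three intervals of recorded vs. counted heartbeats
score = heartbeat_perception_score([35, 45, 60], [30, 44, 50])
```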


2021 ◽  
Vol 83 (6) ◽  
pp. 53-61
Author(s):  
Mahfuzah Mustafa ◽  
Zarith Liyana Zahari ◽  
Rafiuddin Abdubrani

Music and humans are closely connected, as music can reduce stress. The state of stress can be measured from the electroencephalogram (EEG) signal, from which arousal and valence index values are derived. Previous studies report a Matthews Correlation Coefficient (MCC) performance accuracy of 85 ± 5%. Arousal indicates the intensity of emotion, while valence indicates its positive or negative degree; together, the arousal and valence values can be used to measure accuracy performance. This research focuses on enhancing the MCC parameter equation based on arousal and valence values to achieve the maximum accuracy percentage in frequency-domain and time-frequency-domain analyses. Twenty-one features were used to improve the significance of the feature extraction results and of the investigated arousal and valence values. The feature extraction involved the alpha, beta, delta, and theta frequency bands in the arousal and valence index formulas. Based on the results, the arousal and valence index is accepted as a parameter in the MCC equations. However, in certain cases, an improvement of the MCC parameter is required to achieve a high accuracy percentage, and this research proposes the Matthews correlation coefficient advanced (MCCA) to improve performance using a six-sigma method. In conclusion, the MCCA equation is established to enhance the existing MCC parameter, improving the accuracy percentage up to 99.9% for the arousal and valence index.
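For reference, the standard binary Matthews Correlation Coefficient that the proposed MCCA extends can be computed as follows; the MCCA modification itself is specific to the paper and is not reproduced here:

```python
import numpy as np

def matthews_corrcoef(y_true, y_pred):
    """Matthews Correlation Coefficient for binary labels in {0, 1}.

    Ranges from -1 (total disagreement) through 0 (chance) to +1 (perfect).
    """
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    # Convention: return 0 when any confusion-matrix margin is empty
    return float(tp * tn - fp * fn) / denom if denom else 0.0
```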


2021 ◽  
Author(s):  
Colin Conwell ◽  
Daniel Graham ◽  
Edward A. Vessel

How well can we predict human affective responses to an image from the purely perceptual response of a machine trained only on canonical computer vision tasks? We address this question with a large-scale survey of deep neural networks deployed to predict aesthetic judgment, arousal, and valence for images from multiple categories (objects, faces, landscapes, artwork) across two distinct datasets. Importantly, we use the features of these models without any additional learning. We find these features sufficient to predict average ratings of aesthetics, arousal, and valence with remarkably high accuracy across the board -- in many cases beyond the ratings presaged by even the most representative human subjects. Across our benchmarked models, which include ImageNet-trained and randomly-initialized convolutional and transformer architectures, as well as the encoders of the Taskonomy project, a few further trends become evident. One, predictive power is not a given: randomly-initialized models categorically fail to predict the same quantities of variance as trained models. Two, object and scene classification training produce the best overall features for prediction. Three, aesthetic judgments are the most predictable of the affective responses, superseding arousal and valence. This last trend, especially, highlights the possibility that aesthetic judgment may be a form of ‘elemental affect’ embedded in the perceptual apparatus -- and directly available from the statistics of natural images. The contribution of such a mechanism could help explain why our otherwise affectless machines predict affect so accurately.
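The core idea, frozen perceptual features with only a linear readout, can be sketched with a stand-in feature extractor (a fixed random projection here; the study uses real pretrained encoders such as ImageNet-trained CNNs) and closed-form ridge regression:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a frozen network: fixed random weights + ReLU, never trained
W = rng.standard_normal((512, 64))

def frozen_features(x):
    return np.maximum(x @ W, 0.0)

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: the only 'learning' is this linear readout."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Hypothetical data: 200 flattened 'images' with synthetic valence ratings
images = rng.standard_normal((200, 512))
ratings = frozen_features(images) @ rng.standard_normal(64) * 0.1
beta = ridge_fit(frozen_features(images), ratings)
pred = frozen_features(images) @ beta
```

In a study of this kind, the cross-validated correlation between predicted and held-out ratings would then be compared across trained and randomly-initialized feature extractors.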


2021 ◽  
pp. 030573562110243
Author(s):  
Ashley Warmbrodt ◽  
Renee Timmers ◽  
Rory Kirk

This study explored how lyrics, participant-selected music, and emotion trajectory impact self-reported emotional (happiness, sadness, arousal, and valence) and physiological (heart, respiration, and skin conductance rates) responses. Participants were matched (based on sex, age, musicianship, and lyric preference) and assigned to a lyric or instrumental group. Each participant experienced one emotion trajectory (happy-sad or sad-happy), with alternating self- and experimenter-selected jazz music. Emotion trajectory had a significant effect on self-reports, where participants in the sad-happy trajectory reported significantly more sadness overall compared to participants in the happy-sad trajectory. There were also several interaction effects between the independent variables, which indicate the relevance of order as well as differences in processing musical emotions depending on whether music is instrumental or contains lyrics.


2021 ◽  
Vol 445 ◽  
pp. 194-205
Author(s):  
Xuguang Zhang ◽  
Xiuxin Yang ◽  
Weiguang Zhang ◽  
Gongfa Li ◽  
Hui Yu

2021 ◽  
Vol 10 (19) ◽  
Author(s):  
Fernando Souza Ferreira ◽  
Gabriela Zubaran de Azevedo Pizzato ◽  
Jocelise Jacques de Jacques ◽  
Júlio Carlos de Souza van der Linden

Ideas to boost the redemocratization of urban spaces are urgent and essential. This study aims to evaluate users' experiences with the app “Hurbanize: share ideas for the city”. The research comprises four stages and divides thirty participants into three groups. Data collection uses the Pick-A-Mood instrument, explanations of emotions, and user-experience ratings. Data analysis draws on the arousal and valence model of mood, appraisal theory, and nine sources of product emotion. The results show an increase in arousal and valence, as well as perceptions of affinity, displeasure, credit, claim, apprehension, and enjoyment towards the Hurbanize app.

