Decoding Music-Evoked Emotions in the Auditory and Motor Cortex

2020 ◽  
Author(s):  
Vesa Putkinen ◽  
Sanaz Nazari-Farsani ◽  
Kerttu Seppälä ◽  
Tomi Karjalainen ◽  
Lihua Sun ◽  
...  

Abstract Music can induce strong subjective experience of emotions, but it is debated whether these responses engage the same neural circuits as emotions elicited by biologically significant events. We examined the functional neural basis of music-induced emotions in a large sample (n = 102) of subjects who listened to emotionally engaging (happy, sad, fearful, and tender) pieces of instrumental music while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). Ratings of the four categorical emotions and liking were used to predict hemodynamic responses in general linear model (GLM) analysis of the fMRI data. Multivariate pattern analysis (MVPA) was used to reveal discrete neural signatures of the four categories of music-induced emotions. To map neural circuits governing non-musical emotions, the subjects were scanned while viewing short emotionally evocative film clips. The GLM revealed that most emotions were associated with activity in the auditory, somatosensory, and motor cortices, cingulate gyrus, insula, and precuneus. Fear and liking also engaged the amygdala. In contrast, the film clips strongly activated limbic and cortical regions implicated in emotional processing. MVPA revealed that activity in the auditory cortex and primary motor cortices reliably discriminated the emotion categories. Our results indicate that different music-induced basic emotions have distinct representations in regions supporting auditory processing, motor control, and interoception but do not strongly rely on limbic and medial prefrontal regions critical for emotions with survival value.
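The MVPA step described above can be illustrated with a minimal sketch on synthetic data (the trial counts, voxel counts, and signal strengths here are illustrative assumptions, not the authors' parameters): a cross-validated linear support vector machine is trained to discriminate four emotion categories from voxel activation patterns, with chance at 25%.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200          # hypothetical design size
emotions = ["happy", "sad", "fearful", "tender"]

# Synthetic per-trial "activation patterns": each emotion category has a
# weak voxel-wise signature buried in noise
signatures = rng.normal(0.0, 1.0, (len(emotions), n_voxels))
labels = np.repeat(np.arange(len(emotions)), n_trials // len(emotions))
X = signatures[labels] + rng.normal(0.0, 2.0, (n_trials, n_voxels))

# Cross-validated linear SVM, a standard choice for fMRI MVPA
scores = cross_val_score(LinearSVC(max_iter=10_000), X, labels, cv=5)
mean_accuracy = scores.mean()         # chance level is 0.25
```

In real studies the rows of `X` would be beta estimates from a first-level GLM rather than simulated draws; the cross-validation logic is the same.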


2020 ◽  
Vol 32 (7) ◽  
pp. 1369-1380 ◽  
Author(s):  
Nicola Binetti ◽  
Alessandro Tomassini ◽  
Karl Friston ◽  
Sven Bestmann

Timing emerges from a hierarchy of computations ranging from early encoding of physical duration (time sensation) to abstract time representations (time perception) suitable for storage and decisional processes. However, the neural basis of the perceptual experience of time remains elusive. To address this, we dissociate brain activity uniquely related to lower-level sensory and higher-order perceptual timing operations, using event-related fMRI. Participants compared subsecond (500 msec) sinusoidal gratings drifting with constant velocity (standard) against two probe stimuli: (1) control gratings drifting at constant velocity or (2) accelerating gratings, which induced illusory shortening of time. We tested two probe intervals: a 500-msec duration (Short) and a longer duration required for an accelerating probe to be perceived as long as the standard (Long—individually determined). On each trial, participants classified the probe as shorter or longer than the standard. This allowed for comparison of trials with an “Objective” (physical) or “Subjective” (perceived) difference in duration, based on participant classifications. Objective duration revealed responses in bilateral early extrastriate areas, extending to higher visual areas in the fusiform gyrus (at more lenient thresholds). By contrast, Subjective duration was reflected by distributed responses in cortical/subcortical areas. This comprised the left superior frontal gyrus and the left cerebellum, and a wider set of common timing areas including the BG, parietal cortex, and posterior cingulate cortex. These results suggest two functionally independent timing stages: early extraction of duration information in sensory cortices and Subjective experience of duration in higher-order cortical–subcortical timing areas.


2021 ◽  
Author(s):  
Isaac David

Emotion and its perception are fundamental psychological faculties for the survival of animals and social interaction. This is recognized by the emergence of whole areas of neuroscience devoted to understanding their neural basis. Although the basic components of such an emotional system have been identified, the segregation of the milieu of affective experiences into different patterns of brain signals remains poorly understood. Recent functional imaging studies have implicated simultaneous distributed activity as a better correlate of emotional state than its univariate counterpart; however, those attempts have still restricted themselves to regions of interest and severely filtered data. In this work we tested whether the visual perception of three basic emotions can be decoded from full-brain activity using multivariate pattern classification, while keeping localizationist and encoding assumptions to a minimum. Beyond stimulus prediction, we also provide proof-of-concept anatomical mapping and discovery of relevant structures. To this end, we ran a face perception experiment on a sample of 16 neurotypical participants while recording their brain activity using fMRI. Per-subject SVM classifiers were trained on the fMRI data so that they could recognize the emotion class participants were presented with. Results were cross-validated and compared against chance performance using resampling techniques, and the whole of our reproducible pipeline was further validated using more trivial contrasts embedded within the main emotional task. Thorough assessment of behavioral data points towards the validity of our task. Results show a robust and distributed representation of (perceived) happiness in humans, but not of negative-valence anger and sadness, contrary to the more optimistic (though less diligent) existing studies. Overall, our approach proved more sensitive and anatomically specific than the classical mass-univariate analysis, amid high-dimensionality concerns.
Group inference of SVM parameters suggests the defining information-bearing pattern emanates from known structures in the ventral visual pathway and emotion-related areas, namely the primary visual cortex (V1) and surroundings, the middle collateral sulcus and parahippocampal gyrus (mCS, mPHG), the amygdala, the medial prefrontal cortex (mPFC), and the anterior cerebellum around the vermis, all in bilateral fashion. Our work paves the way for further multivariate studies to provide a complementary picture of emotions (and other brain functions) according to their macroscale dynamics.
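The comparison against chance via resampling that this abstract describes is commonly implemented as a label-permutation test; a minimal sketch on synthetic three-class data (sizes, signal strength, and permutation count are illustrative assumptions, not the study's settings):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels = 48, 100             # hypothetical per-subject design
y = np.repeat([0, 1, 2], n_trials // 3)  # three emotion classes
signatures = rng.normal(0.0, 1.0, (3, n_voxels))
X = 0.8 * signatures[y] + rng.normal(0.0, 1.0, (n_trials, n_voxels))

clf = LinearSVC(max_iter=10_000)
observed = cross_val_score(clf, X, y, cv=4).mean()

# Null distribution: shuffle the labels and rerun the identical pipeline,
# so the p-value reflects how often chance matches the observed accuracy
null = [cross_val_score(clf, X, rng.permutation(y), cv=4).mean()
        for _ in range(100)]
p_value = (1 + sum(s >= observed for s in null)) / (1 + len(null))
```

The `(1 + …)/(1 + n)` form gives the standard bias-corrected permutation p-value, which can never be exactly zero.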


2019 ◽  
Vol 30 (5) ◽  
pp. 2766-2776 ◽  
Author(s):  
John P Powers ◽  
John L Graner ◽  
Kevin S LaBar

Abstract Distancing is an effective tactic for emotion regulation, which can take several forms depending on the type(s) of psychological distance being manipulated to modify affect. We recently proposed a neurocognitive model of emotional distancing, but it is unknown how its specific forms are instantiated in the brain. Here, we presented healthy young adults (N = 34) with aversive pictures during functional magnetic resonance imaging to directly compare behavioral performance and brain activity across spatial, temporal, and objective forms of distancing. We found emotion regulation performance to be largely comparable across these forms. A conjunction analysis of activity associated with these forms yielded a high degree of overlap, encompassing regions of the default mode and frontoparietal networks as predicted by our model. Multivariate pattern classification further revealed distributed patches of posterior cortical activation that discriminated each form from one another. These findings not only confirm aspects of our overarching model but also elucidate a novel role for cortical regions in and around the parietal lobe in selectively supporting spatial, temporal, and social cognitive processes to distance oneself from an emotional encounter. These regions may provide new targets for brain-based interventions for emotion dysregulation.
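A conjunction analysis of the kind described here can be sketched as a minimum-statistic intersection of thresholded maps; the z-maps, voxel counts, and threshold below are synthetic stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_voxels = 1_000
forms = ("spatial", "temporal", "objective")

# Hypothetical z-maps, one per distancing form, with the first 50 voxels
# acting as shared "default-mode" voxels that respond in all three
z_maps = {form: rng.normal(0.0, 1.0, n_voxels) for form in forms}
for z in z_maps.values():
    z[:50] += 4.0

threshold = 3.1  # roughly p < .001, one-tailed
# Minimum-statistic conjunction: a voxel survives only if it exceeds
# the threshold in every one of the three maps
conjunction = np.all([z > threshold for z in z_maps.values()], axis=0)
n_overlap = int(conjunction.sum())
```

Because the joint false-positive rate is the product of the per-map rates, voxels outside the shared set essentially never survive all three thresholds.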


2021 ◽  
Author(s):  
Clara Alameda ◽  
Daniel Sanabria ◽  
Luis F. Ciria

Flow state is a subjective experience that people report when task performance is experienced as automatic, intrinsically rewarding, optimal and effortless. While this intriguing phenomenon is the subject of a plethora of behavioural studies, only recently researchers have started to look at its neural correlates. Here, we summarize the main findings of a total of 22 studies aimed at inducing or assessing the experience of flow and the concomitant brain activity patterns. In light of the results, we conclude that the current available evidence is sparse and inconclusive, which limits any theoretical debate. We also outline major limitations of this literature and highlight several aspects regarding experimental design and flow measurements that may provide useful avenues for future studies on this topic.


2021 ◽  
Vol 15 ◽  
Author(s):  
Alexander Maÿe ◽  
Tiezhi Wang ◽  
Andreas K. Engel

Hyper-brain studies analyze the brain activity of two or more individuals during some form of interaction. Several studies found signs of inter-subject brain activity coordination, such as power and phase synchronization or information flow. This hyper-brain coordination is frequently studied in paradigms which induce rhythms or even synchronization, e.g., by mirroring movements, turn-based activity in card or economic games, or joint music making. It is therefore interesting to determine to what extent coordinated brain activity may be induced by rhythmicity in the task and/or in the sensory feedback that the partners receive. We therefore studied the EEG brain activity of dyads in a task that required the smooth pursuit of a target and did not involve any extrinsic rhythms. Partners controlled orthogonal axes of the two-dimensional motion of an object that had to be kept on the target. Using several methods for analyzing hyper-brain coupling, we could not detect signs of coordinated brain activity. However, we found several brain regions in which the frequency-specific activity significantly correlated with the objective task performance, with the subjective experience thereof, and with the subjective experience of the collaboration. Activity in these regions has been linked to motor control, sensorimotor integration, executive control, and emotional processing. Our results suggest that neural correlates of intersubjectivity encompass large parts of brain areas that are considered to be involved in sensorimotor control, without these areas necessarily coordinating their activity across agents.
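Phase synchronization between two partners' EEG channels, one of the hyper-brain couplings mentioned above, is often quantified as a phase-locking value (PLV); a minimal sketch on simulated narrow-band signals (sampling rate, frequency, noise level, and phase lag are illustrative assumptions):

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(3)
fs = 250                          # sampling rate in Hz
t = np.arange(0, 4, 1 / fs)       # 4 s of simulated EEG

# One channel per partner, sharing a 10 Hz component with a fixed phase lag
shared = 2 * np.pi * 10 * t
ch_a = np.cos(shared) + rng.normal(0.0, 0.3, t.size)
ch_b = np.cos(shared + 0.4) + rng.normal(0.0, 0.3, t.size)

# Instantaneous phases via the analytic signal (Hilbert transform);
# real recordings would be band-pass filtered first
phase_a = np.angle(hilbert(ch_a))
phase_b = np.angle(hilbert(ch_b))

# PLV: 1 = perfectly locked phase difference, 0 = no phase locking
plv = float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))
```

A constant phase lag (here 0.4 rad) still yields a PLV near 1, which is why PLV measures locking rather than zero-lag identity of the two signals.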


2022 ◽  
Author(s):  
Vesa Juhani Putkinen ◽  
Sanaz Nazari-Farsani ◽  
Tomi Karjalainen ◽  
Severi Santavirta ◽  
Matthew Hudson ◽  
...  

Sex differences in brain activity evoked by sexual stimuli remain elusive despite robust evidence for stronger enjoyment of and interest towards sexual stimuli in men than in women. To test whether visual sexual stimuli evoke different brain activity patterns in men and women, we measured hemodynamic brain activity induced by visual sexual stimuli in two experiments in 91 subjects (46 males). In one experiment, the subjects viewed sexual and non-sexual film clips, and dynamic annotations of nudity in the clips were used to predict their hemodynamic activity. In the second experiment, the subjects viewed sexual and non-sexual pictures in an event-related design. Males showed stronger activation than females in the visual and prefrontal cortices and dorsal attention network in both experiments. Furthermore, using multivariate pattern classification we could accurately predict the sex of the subject on the basis of the brain activity elicited by the sexual stimuli. The classification generalized across the two experiments, indicating that the sex differences were consistent across stimulus types. Eye-tracking data obtained from an independent sample of subjects (N = 110) showed that men looked longer than women at the chest area of the nude female actors in the film clips. These results indicate that visual sexual stimuli evoke discernible brain activity patterns in men and women, which may reflect stronger attentional engagement with sexual stimuli in men than in women.
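Cross-experiment generalization of the kind reported here can be sketched as training a classifier on one experiment's patterns and testing it on the other's; everything below (subject counts, voxel counts, noise levels, the logistic-regression classifier) is an illustrative assumption, not the authors' pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n_subjects, n_voxels = 60, 150            # hypothetical sizes
sex = np.repeat([0, 1], n_subjects // 2)  # 0 = female, 1 = male

# A sex-difference activation pattern assumed shared by both experiments
effect = rng.normal(0.0, 1.0, n_voxels)

def simulate_experiment(noise_sd):
    """Per-subject patterns: shared sex effect plus subject noise."""
    return np.outer(sex, effect) + rng.normal(0.0, noise_sd,
                                              (n_subjects, n_voxels))

X_films, X_pictures = simulate_experiment(2.0), simulate_experiment(2.0)

# Train on the film-clip experiment, test on the picture experiment:
# above-chance test accuracy indicates the pattern generalizes
clf = LogisticRegression(max_iter=5_000).fit(X_films, sex)
cross_experiment_accuracy = clf.score(X_pictures, sex)
```

Testing on an entirely held-out experiment is a stricter check than within-experiment cross-validation, because it rules out stimulus-set-specific confounds.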


2015 ◽  
Vol 29 (4) ◽  
pp. 135-146 ◽  
Author(s):  
Miroslaw Wyczesany ◽  
Szczepan J. Grzybowski ◽  
Jan Kaiser

Abstract. In this study, the neural basis of emotional reactivity was investigated. Reactivity was operationalized as the impact of emotional pictures on the self-reported ongoing affective state and was used to divide the subjects into high- and low-responder groups. Independent sources of brain activity were identified, localized with the DIPFIT method, and clustered across subjects to analyse the visual evoked potentials to affective pictures. Four of the identified clusters revealed effects of reactivity. The earliest two started about 120 ms after stimulus onset and were located in the occipital lobe and the right temporoparietal junction. Another two, with a latency of 200 ms, were found in the orbitofrontal and the right dorsolateral cortices. Additionally, differences in pre-stimulus alpha level over the visual cortex were observed between the groups. Attentional modulation of perceptual processes is proposed as an early source of emotional reactivity, forming an automatic mechanism of affective control. The role of top-down processes in affective appraisal and, finally, in the experience of ongoing emotional states is also discussed.


Author(s):  
Mattson Ogg ◽  
L. Robert Slevc

Music and language are uniquely human forms of communication. What neural structures facilitate these abilities? This chapter reviews music and language processing by following these acoustic signals as they ascend the auditory pathway from the brainstem to auditory cortex and on to more specialized cortical regions. Acoustic, neural, and cognitive mechanisms are identified where processing demands from both domains might overlap, with an eye to examples of experience-dependent cortical plasticity, which are taken as strong evidence for common neural substrates. Following an introduction describing how understanding musical processing informs linguistic and auditory processing more generally, findings regarding the major components (and parallels) of music and language research are reviewed: pitch perception, syntax and harmonic structural processing, semantics, timbre and speaker identification, attending in auditory scenes, and rhythm. Overall, the strongest evidence for neural overlap (and cross-domain, experience-dependent plasticity) currently lies in the brainstem, followed by auditory cortex, with evidence and the potential for overlap becoming less apparent as the mechanisms involved in music and speech perception become more specialized and distinct at higher levels of processing.


2012 ◽  
Vol 24 (9) ◽  
pp. 1867-1883 ◽  
Author(s):  
Bradley R. Buchsbaum ◽  
Sabrina Lemire-Rodger ◽  
Candice Fang ◽  
Hervé Abdi

When we have a rich and vivid memory of a past experience, it often feels as though we are transported back in time to witness the event once again. Indeed, a perfect memory would exactly mimic the experiential quality of direct sensory perception. We used fMRI and multivoxel pattern analysis to map and quantify the similarity between patterns of activation evoked by direct perception of a diverse set of short video clips and the vivid remembering, with closed eyes, of these clips. We found that the patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception. Using whole-brain patterns of activation evoked by perception of the videos, we were able to accurately classify brain patterns that were elicited when participants tried to vividly recall those same videos. A discriminant analysis of the activation patterns associated with each video revealed a high degree of shared representational similarity between perception and memory, explaining over 80% of the variance. These results show that complex, multifeatured memory involves a partial reinstatement of the whole pattern of brain activity that is evoked during initial perception of the stimulus.
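Classifying recall patterns by their similarity to perception patterns can be sketched as a nearest-neighbor correlation match; the video count, voxel count, and "partial reinstatement" model below are illustrative assumptions, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(5)
n_videos, n_voxels = 9, 300       # hypothetical sizes

# One whole-brain pattern per video clip during perception
perception = rng.normal(0.0, 1.0, (n_videos, n_voxels))
# Memory patterns as partial reinstatement: attenuated copy plus noise
memory = 0.5 * perception + rng.normal(0.0, 1.0, (n_videos, n_voxels))

# Correlate every memory pattern with every perception pattern and
# classify each memory as the video whose perception pattern matches best
similarity = np.corrcoef(memory, perception)[:n_videos, n_videos:]
predicted = similarity.argmax(axis=1)
accuracy = float((predicted == np.arange(n_videos)).mean())
```

Even a heavily attenuated reinstatement (here scaled by 0.5, with noise of equal magnitude) supports accurate matching, because correlation is insensitive to overall pattern amplitude.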

