Study of the neural correlates of emotional perception by multi-voxel pattern analysis

2021 ◽  
Author(s):  
Isaac David

Emotion and its perception are psychological faculties fundamental to animal survival and social interaction, as recognized by the emergence of whole areas of neuroscience devoted to understanding their neural basis. Although the basic components of this emotional system have been identified, how the milieu of affective experiences segregates into distinct patterns of brain signals remains poorly understood. Recent functional imaging studies have implicated simultaneous distributed activity as a better correlate of emotional state than its univariate counterpart; however, those attempts have restricted themselves to regions of interest and severely filtered data. In this work we tested whether the visual perception of three basic emotions can be decoded from full-brain activity using multivariate pattern classification, while keeping localizationist and encoding assumptions to a minimum. Beyond stimulus prediction, we also provide proof-of-concept anatomical mapping and discovery of relevant structures. To this end, we ran a face-perception experiment on a sample of 16 neurotypical participants while recording their brain activity using fMRI. Per-subject SVM classifiers were trained on the fMRI data so that they could recognize which emotion class the brain was presented with. Results were cross-validated and compared against chance performance using resampling techniques, and the whole of our reproducible pipeline was further validated using more trivial contrasts embedded within the main emotional task. Thorough assessment of the behavioral data points towards the validity of the task. Results show a robust and distributed representation of (perceived) happiness in humans, but not of negative-valence anger and sadness, contrary to more optimistic (though less diligent) existing studies. Overall, our approach proved more sensitive and anatomically specific than the classical mass-univariate analysis, amidst high-dimensionality concerns.
Group inference of SVM parameters suggests the defining information-bearing pattern emanates from known structures in the ventral visual pathway and emotion-related areas, namely the primary visual cortex (V1) and surroundings, the middle collateral sulcus and parahippocampal gyrus (mCS, mPHG), the amygdala, the medial prefrontal cortex (mPFC) and the anterior cerebellum around the vermis, all of them bilaterally. Our work paves the way for further multivariate studies to provide a complementary picture of emotions (and other brain functions) according to their macroscale dynamics.

2020 ◽  
Author(s):  
Vesa Putkinen ◽  
Sanaz Nazari-Farsani ◽  
Kerttu Seppälä ◽  
Tomi Karjalainen ◽  
Lihua Sun ◽  
...  

Abstract Music can induce strong subjective experience of emotions, but it is debated whether these responses engage the same neural circuits as emotions elicited by biologically significant events. We examined the functional neural basis of music-induced emotions in a large sample (n = 102) of subjects who listened to emotionally engaging (happy, sad, fearful, and tender) pieces of instrumental music while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). Ratings of the four categorical emotions and liking were used to predict hemodynamic responses in general linear model (GLM) analysis of the fMRI data. Multivariate pattern analysis (MVPA) was used to reveal discrete neural signatures of the four categories of music-induced emotions. To map neural circuits governing non-musical emotions, the subjects were scanned while viewing short emotionally evocative film clips. The GLM revealed that most emotions were associated with activity in the auditory, somatosensory, and motor cortices, cingulate gyrus, insula, and precuneus. Fear and liking also engaged the amygdala. In contrast, the film clips strongly activated limbic and cortical regions implicated in emotional processing. MVPA revealed that activity in the auditory cortex and primary motor cortices reliably discriminated the emotion categories. Our results indicate that different music-induced basic emotions have distinct representations in regions supporting auditory processing, motor control, and interoception but do not strongly rely on limbic and medial prefrontal regions critical for emotions with survival value.
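The GLM step described in this abstract amounts to regressing each voxel's hemodynamic time course on the emotion ratings. A minimal single-voxel sketch with synthetic data (the regressor names, noise level, and ground-truth effect are assumptions for illustration):

```python
# Voxel-wise GLM sketch: ordinary least squares of a BOLD time course on
# rating regressors plus an intercept.
import numpy as np

rng = np.random.default_rng(1)
n_vols = 200
ratings = rng.random((n_vols, 4))                # happy, sad, fearful, tender
X = np.column_stack([ratings, np.ones(n_vols)])  # design matrix with intercept

true_beta = np.array([2.0, 0.0, 0.0, 0.0, 1.0])  # voxel driven by "happy" only
bold = X @ true_beta + 0.1 * rng.standard_normal(n_vols)

beta_hat, *_ = np.linalg.lstsq(X, bold, rcond=None)   # estimated effect sizes
```

In a real analysis the ratings would be convolved with a hemodynamic response function and the fit repeated at every voxel; this sketch only shows the core regression.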


2021 ◽  
Author(s):  
Isaac David ◽  
Fernando A Barrios

It's now common to approach questions about information representation in the brain using multivariate statistics and machine learning methods. What is less recognized is that, in the process, the capacity for data-driven discovery and functional localization has diminished. This is because multivariate pattern analysis (MVPA) studies tend to restrict themselves to regions of interest and severely filtered data, and sound parameter-mapping inference is lacking. Here, reproducible evidence is presented that a high-dimensional, brain-wide multivariate linear method can better detect and characterize the occurrence of visual and socio-affective states in a task-oriented functional magnetic resonance imaging (fMRI) experiment, in comparison to the classical localizationist correlation analysis. Classification models for a group of human participants and existing rigorous cluster-inference methods are used to construct group anatomical-statistical parametric maps, which correspond to the most likely neural correlates of each psychological state. This led to the discovery of a multidimensional pattern of brain activity which reliably encodes the perception of happiness in the visual cortex, cerebellum and some limbic areas. We failed to find similar evidence for sadness and anger. Anatomical consistency of discriminating features across subjects and contrasts despite the high number of dimensions, as well as agreement with the wider literature, suggests MVPA is a viable tool for full-brain functional neuroanatomical mapping and not just prediction of psychological states. The present work paves the way for future functional brain imaging studies to provide a complementary picture of brain functions (such as emotion) according to their macroscale dynamics.




2015 ◽  
Vol 29 (4) ◽  
pp. 135-146 ◽  
Author(s):  
Miroslaw Wyczesany ◽  
Szczepan J. Grzybowski ◽  
Jan Kaiser

Abstract. In the study, the neural basis of emotional reactivity was investigated. Reactivity was operationalized as the impact of emotional pictures on the self-reported ongoing affective state, and was used to divide the subjects into high- and low-responder groups. Independent sources of brain activity were identified, localized with the DIPFIT method, and clustered across subjects to analyse the visual evoked potentials to affective pictures. Four of the identified clusters revealed effects of reactivity. The earliest two started about 120 ms after stimulus onset and were located in the occipital lobe and the right temporoparietal junction. Another two, with a latency of 200 ms, were found in the orbitofrontal and the right dorsolateral cortices. Additionally, differences in pre-stimulus alpha level over the visual cortex were observed between the groups. Attentional modulation of perceptual processes is proposed as an early source of emotional reactivity, forming an automatic mechanism of affective control. The role of top-down processes in affective appraisal and, finally, in the experience of ongoing emotional states is also discussed.


2012 ◽  
Vol 24 (9) ◽  
pp. 1867-1883 ◽  
Author(s):  
Bradley R. Buchsbaum ◽  
Sabrina Lemire-Rodger ◽  
Candice Fang ◽  
Hervé Abdi

When we have a rich and vivid memory of a past experience, it often feels as if we are transported back in time to witness the event once again. Indeed, a perfect memory would exactly mimic the experiential quality of direct sensory perception. We used fMRI and multivoxel pattern analysis to map and quantify the similarity between patterns of activation evoked by direct perception of a diverse set of short video clips and the vivid remembering, with closed eyes, of those clips. We found that the patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception. Using whole-brain patterns of activation evoked by perception of the videos, we were able to accurately classify brain patterns that were elicited when participants tried to vividly recall those same videos. A discriminant analysis of the activation patterns associated with each video revealed a high degree of shared representational similarity between perception and memory (explaining over 80% of the variance). These results show that complex, multifeatured memory involves a partial reinstatement of the whole pattern of brain activity evoked during initial perception of the stimulus.
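The classification analysis this abstract describes, training on perception patterns and testing on recall patterns of the same clips, can be sketched as cross-decoding. Everything below is simulated, including the reinstatement effect itself (memory patterns are generated as noisier copies of the perception prototypes):

```python
# Cross-decoding sketch: fit on "perception" trials, score on "memory" trials.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_videos, reps, n_voxels = 4, 20, 100
prototypes = rng.standard_normal((n_videos, n_voxels))   # one pattern per clip

def simulate(noise):
    """Trials = clip prototype + Gaussian noise, `reps` trials per clip."""
    X = np.vstack([prototypes[v] + noise * rng.standard_normal((reps, n_voxels))
                   for v in range(n_videos)])
    y = np.repeat(np.arange(n_videos), reps)
    return X, y

X_percept, y_percept = simulate(noise=1.0)   # patterns during direct viewing
X_memory,  y_memory  = simulate(noise=1.5)   # noisier reinstated patterns

clf = LogisticRegression(max_iter=2000).fit(X_percept, y_percept)
cross_acc = clf.score(X_memory, y_memory)    # above-chance => reinstatement
```

Above-chance cross-decoding accuracy is the signature of reinstatement: the recall patterns must share structure with the perception patterns for a perception-trained classifier to identify them.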


2021 ◽  
Vol 11 (2) ◽  
pp. 196
Author(s):  
Sébastien Laurent ◽  
Laurence Paire-Ficout ◽  
Jean-Michel Boucheix ◽  
Stéphane Argon ◽  
Antonio Hidalgo-Muñoz

The question of the possible impact of deafness on temporal processing remains unanswered, and behavioral findings are contradictory. The goal of the present study is to analyze the brain activity underlying time estimation by using functional near-infrared spectroscopy (fNIRS), which allows examination of the frontal, central and occipital cortical areas. A total of 37 participants (19 deaf) were recruited. The experimental task involved processing a road scene to determine whether the driver had time to safely execute a driving task, such as overtaking. The road scenes were presented in animated format, or as sequences of three static images showing the beginning, mid-point, and end of a situation. The latter presentation requires a clocking mechanism to estimate the time between the samples in order to evaluate vehicle speed. The results show greater frontal-region activity in deaf people, which suggests that more cognitive effort is needed to process these scenes. The central region, which several studies implicate in clocking, was particularly activated by the static presentation in deaf people during the estimation of time lapses. Exploration of the occipital region yielded no conclusive results. Our results on the frontal and central regions encourage further study of the neural basis of time processing and its links with auditory capacity.


2003 ◽  
Vol 89 (5) ◽  
pp. 2516-2527 ◽  
Author(s):  
Laurent Petit ◽  
Michael S. Beauchamp

We used event-related fMRI to measure brain activity while subjects performed saccadic eye, head, and gaze movements to visually presented targets. Two distinct patterns of response were observed. One set of areas was equally active during eye, head, and gaze movements and consisted of the superior and inferior subdivisions of the frontal eye fields, the supplementary eye field, the intraparietal sulcus, the precuneus, area MT in the lateral occipital sulcus and subcortically in basal ganglia, thalamus, and the superior colliculus. These areas have been previously observed in functional imaging studies of human eye movements, suggesting that a common set of brain areas subserves both oculomotor and head movement control in humans, consistent with data from single-unit recording and microstimulation studies in nonhuman primates that have described overlapping eye- and head-movement representations in oculomotor control areas. A second set of areas was active during head and gaze movements but not during eye movements. This set of areas included the posterior part of the planum temporale and the cortex at the temporoparietal junction, known as the parieto-insular vestibular cortex (PIVC). Activity in PIVC has been observed during imaging studies of invasive vestibular stimulation, and we confirm its role in processing the vestibular cues accompanying natural head movements. Our findings demonstrate that fMRI can be used to study the neural basis of head movements and show that areas that control eye movements also control head movements. In addition, we provide the first evidence for brain activity associated with vestibular input produced by natural head movements as opposed to invasive caloric or galvanic vestibular stimulation.


2012 ◽  
Vol 17 (1) ◽  
pp. 5-26
Author(s):  
Hans Goller

Neuroscientists keep telling us that the brain produces consciousness and that consciousness does not survive brain death because it ceases when brain activity ceases. Research findings on near-death experiences during cardiac arrest contradict this widely held conviction. They raise perplexing questions with regard to our current understanding of the relationship between consciousness and brain function. Reports of veridical perceptions during out-of-body experiences suggest that consciousness may be experienced independently of a functioning brain and that self-consciousness may continue even after the termination of brain activity. Data from studies of near-death experiences could be an incentive to develop alternative theories of the body-mind relation as seen in contemporary neuroscience.


2015 ◽  
Vol 2015 ◽  
pp. 1-9 ◽  
Author(s):  
Qi Liu ◽  
Peihai Zhang ◽  
Junjie Pan ◽  
Zhengjie Li ◽  
Jixin Liu ◽  
...  

Background. Pattern differentiation is the foundation of traditional Chinese medicine (TCM) treatment for erectile dysfunction (ED). This study aims to investigate the differences in cerebral activity in ED patients with different TCM patterns. Methods. 27 psychogenic ED patients and 27 healthy subjects (HS) were enrolled in this study. Each participant underwent a resting-state fMRI scan. The fractional amplitude of low-frequency fluctuation (fALFF) was used to detect brain-activity changes in ED patients with different patterns. Results. Compared to HS, ED patients showed increased cerebral activity in the bilateral cerebellum, insula, globus pallidus, parahippocampal gyrus, orbitofrontal cortex (OFC), and middle cingulate cortex (MCC). Compared to the patients with liver-qi stagnation and spleen deficiency pattern (LSSDP), the patients with kidney-yang deficiency pattern (KDP) showed increased activity in the bilateral brainstem, cerebellum, hippocampus, and the right insula, thalamus, and MCC, and decreased activity in the bilateral putamen, medial frontal gyrus, temporal pole, and the right caudate nucleus, OFC, anterior cingulate cortex, and posterior cingulate cortex (P<0.005). Conclusions. ED patients with different TCM patterns showed different brain activities. The differences in cerebral activity between LSSDP and KDP were mainly in emotion-related regions, including the prefrontal cortex and cingulate cortex.
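For reference, the fALFF measure used in this study is the fraction of a voxel's spectral amplitude that falls in the low-frequency band (commonly 0.01-0.08 Hz in resting-state work). A single-voxel sketch with an assumed TR and a planted 0.05 Hz oscillation (parameters are conventions, not this study's exact settings):

```python
# fALFF sketch for one voxel time series: low-frequency amplitude / total.
import numpy as np

TR = 2.0                         # assumed repetition time in seconds
n_vols = 240
t = np.arange(n_vols) * TR

rng = np.random.default_rng(4)
# Synthetic BOLD signal: a 0.05 Hz oscillation buried in white noise
bold = np.sin(2 * np.pi * 0.05 * t) + 0.5 * rng.standard_normal(n_vols)

freqs = np.fft.rfftfreq(n_vols, d=TR)
amp = np.abs(np.fft.rfft(bold - bold.mean()))   # amplitude spectrum

band = (freqs >= 0.01) & (freqs <= 0.08)        # conventional fALFF band
falff = amp[band].sum() / amp[1:].sum()         # exclude the DC bin
```

Because fALFF is a ratio, it is less sensitive to nonspecific broadband noise than the raw low-frequency amplitude (ALFF) alone.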


2017 ◽  
Author(s):  
Sarah L. Dziura ◽  
James C. Thompson

Abstract Social functioning involves learning about the social networks in which we live and interact; knowing not just our friends, but also who is friends with our friends. Here we utilized a novel incidental learning paradigm and representational similarity analysis (RSA), a functional MRI multivariate pattern analysis technique, to examine the relationship between learning social networks and the brain's response to the faces within those networks. We found that the accuracy of learning face-pair relationships through observation is correlated with neural similarity patterns for those pairs in the left temporoparietal junction (TPJ), the left fusiform gyrus, and the subcallosal ventromedial prefrontal cortex (vmPFC), all areas previously implicated in social cognition. This model was also significant in portions of the cerebellum and thalamus. These results show that the similarity of neural patterns represents how accurately we understand the closeness of any two faces within a network, regardless of their true relationship. Our findings indicate that these areas of the brain not only process knowledge and understanding of others, but also support learning relations between individuals in groups. Significance Statement Knowledge of the relationships between people is an important skill that helps us interact in a highly social world. While much is known about how the human brain represents the identity, goals, and intentions of others, less is known about how we represent knowledge about social relationships between others. In this study, we used functional neuroimaging to demonstrate that patterns in human brain activity represent memory for recently learned social connections.
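RSA, as used in this study, compares a model dissimilarity structure with a neural one. A minimal sketch on synthetic data, where a hypothetical one-dimensional "closeness" feature generates both the model and the neural patterns (all names and noise levels are illustrative assumptions):

```python
# RSA sketch: rank-correlate a model RDM with a neural pattern RDM.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
n_items, n_voxels = 12, 60

# Hypothetical learned "closeness" per item; the model RDM is the pairwise
# distance on that feature.
closeness = rng.random(n_items)
model_rdm = pdist(closeness[:, None])

# Neural patterns built so that items with similar closeness have similar
# patterns: a shared axis scaled by the feature, plus noise.
axis = rng.standard_normal(n_voxels)
patterns = closeness[:, None] * axis + 0.1 * rng.standard_normal((n_items, n_voxels))
neural_rdm = pdist(patterns)                 # euclidean neural dissimilarities

rho, p = spearmanr(model_rdm, neural_rdm)    # RSA: rank correlation of RDMs
```

Spearman rank correlation is the common choice here because it assumes only a monotonic, not linear, relationship between model and neural dissimilarities.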

