Circular Causality of Emotions in Moving Pictures

2021 ◽  
Vol 20 (1) ◽  
pp. 86-110
Author(s):  
Mircea Valeriu Deaca

Abstract: In the framework of predictive coding, as explained by Giovanni Pezzulo in his article Why do you fear the bogeyman? An embodied predictive coding model of perceptual inference (2014), humans construct instances of emotion through a double arrow of explanation of stimuli. Top-down cognitive models explain, in a predictive fashion, the emotional value of stimuli; at the same time, feelings and emotions depend on the perception of internal changes in the body. When confronted with uncertain auditory and visual information, a multimodal internal state assigns more weight to interoceptive information, such as visceral and autonomic states like hunger or thirst (motivational conditions), than to the auditory and visual input. In short, an emotional mood can constrain the construction of a particular instance of emotion. This observation suggests that the dynamics of the generative processes of Bayesian inference contain a mechanism of bidirectional linkage between perceptual and cognitive inference on the one hand and feelings and emotions on the other. In other words, “subjective feeling states and emotions influence perceptual and cognitive inference, which in turn produce new subjective feeling states and emotions” as a self-fulfilling prophecy (Pezzulo 2014, 908). This article focuses on the short introductory scene of Steven Spielberg’s Jaws (1975), claiming that the emotions of fear and sadness emerge from the circular causal coupling instantiated between cinematic bottom-up mood cues and top-down cognitive explanations.
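The circular coupling described above can be illustrated with a minimal numerical sketch. Everything below (variable names, weights, precisions) is an illustrative assumption rather than Pezzulo's or Deaca's actual formalism: a fearful mood acts as the prior in a precision-weighted inference about an ambiguous audiovisual cue, and the resulting threatening percept feeds back to intensify the mood.

# Minimal sketch of circular causality between mood and perceptual inference,
# implemented as precision-weighted Bayesian averaging. Illustrative values only.

def posterior_threat(prior_threat, sensory_evidence, prior_precision, sensory_precision):
    """Precision-weighted combination of a top-down prior and bottom-up evidence."""
    w_prior = prior_precision / (prior_precision + sensory_precision)
    return w_prior * prior_threat + (1.0 - w_prior) * sensory_evidence

mood = 0.6       # initial fearful mood: prior belief that the scene is threatening
evidence = 0.5   # ambiguous audiovisual cue, uninformative on its own

for step in range(5):
    # Perceptual inference: the mood serves as a strongly weighted prior over the cue.
    percept = posterior_threat(mood, evidence, prior_precision=4.0, sensory_precision=1.0)
    # Circular step: perceiving more threat than a neutral baseline raises the mood.
    mood = min(1.0, mood + 0.1 * (percept - 0.5))
    print(f"step {step}: percept = {percept:.3f}, mood = {mood:.3f}")

Because the mood-to-percept and percept-to-mood links reinforce each other, the loop drifts toward greater perceived threat even though the sensory evidence itself stays neutral, which is the self-fulfilling dynamic the article locates in the Jaws opening.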

2019 ◽  
Author(s):  
Yuru Song ◽  
Mingchen Yao ◽  
Helen Kemprecos ◽  
Áine Byrne ◽  
Zhengdong Xiao ◽  
...  

Abstract: Pain is a complex, multidimensional experience that involves dynamic interactions between sensory-discriminative and affective-emotional processes. Pain experiences have a high degree of variability depending on their context and prior anticipation. Viewing pain perception as a perceptual inference problem, we use a predictive coding paradigm to characterize both evoked and spontaneous pain. We record local field potentials (LFPs) from the primary somatosensory cortex (S1) and the anterior cingulate cortex (ACC) of freely behaving rats—two regions known to encode the sensory-discriminative and affective-emotional aspects of pain, respectively. We further propose a predictive coding framework to investigate the temporal coordination of oscillatory activity between S1 and ACC. Specifically, we develop a high-level, empirical and phenomenological model to describe the macroscopic dynamics of bottom-up and top-down activity. Supported by recent experimental data, we also develop a mechanistic mean-field model to describe the mesoscopic neuronal population dynamics of the S1 and ACC populations in both naive and chronic pain-treated animals. Our proposed predictive coding models not only replicate important experimental findings but also provide new mechanistic insight into the uncertainty of expectation, placebo and nocebo effects, and chronic pain.

Author Summary: Pain perception in the mammalian brain is encoded through multiple brain circuits. The experience of pain is often associated with brain rhythms or neuronal oscillations at different frequencies. Understanding the temporal coordination of neural oscillatory activity from different brain regions is important for dissecting pain circuit mechanisms and revealing differences between distinct pain conditions. Predictive coding is a general computational framework for understanding perceptual inference by integrating bottom-up sensory information and top-down expectation. Supported by experimental data, we propose a predictive coding framework for pain perception and develop empirical and biologically constrained computational models to characterize the oscillatory dynamics of neuronal populations from two cortical circuits—one for the sensory-discriminative experience and the other for the affective-emotional experience, and further characterize their temporal coordination under various pain conditions. Our computational study of a biologically constrained neuronal population model reveals important mechanistic insights into pain perception, placebo analgesia, and chronic pain.
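The authors' mean-field model is not reproduced in the abstract, but the bottom-up/top-down exchange it formalizes can be sketched as a generic two-node predictive coding loop. The region labels, time constants, and stimulus below are illustrative assumptions, not the paper's parameters.

import numpy as np

# Generic two-node predictive coding sketch (not the authors' mean-field model):
# s1_error stands in for bottom-up prediction-error activity in S1,
# acc_prediction for the top-down expectation held in ACC.

dt, tau = 0.001, 0.05                                     # time step and time constant (s)
t = np.arange(0.0, 1.0, dt)
pain_input = np.where((t > 0.3) & (t < 0.5), 1.0, 0.0)    # brief noxious stimulus

acc_prediction, s1_error = 0.0, 0.0
err_trace, pred_trace = [], []

for x in pain_input:
    # Bottom-up: S1 signals the rectified mismatch between input and prediction.
    s1_error += dt / tau * (-s1_error + max(x - acc_prediction, 0.0))
    # Top-down: ACC slowly revises its expectation in the direction of the error.
    acc_prediction += dt / (4 * tau) * (-0.2 * acc_prediction + s1_error)
    err_trace.append(s1_error)
    pred_trace.append(acc_prediction)

print(f"peak S1 error: {max(err_trace):.2f}, final ACC expectation: {pred_trace[-1]:.2f}")

In this toy loop the error rises at stimulus onset and then shrinks as the expectation node catches up; adjusting its gain or leak is one crude way to caricature strong-expectation (placebo-like) or sensitized (chronic-pain-like) regimes, without any claim to the paper's actual parameterization.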


2018 ◽  
Author(s):  
Laura Crucianelli ◽  
Yannis Paloyelis ◽  
Lucia Ricciardi ◽  
Paul M Jenkinson ◽  
Aikaterini Fotopoulou

Abstract: Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant’s hidden hand and a visible rubber hand creates illusory bodily ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision or salience of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present double-blind, placebo-controlled, crossover study (N = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased the embodied version of the SWI (quantified as weight estimation error). These findings suggest that oxytocin might modulate processes of visuo-tactile multisensory integration by increasing the precision of top-down signals against bottom-up sensory input.


2019 ◽  
Vol 31 (4) ◽  
pp. 592-606 ◽  
Author(s):  
Laura Crucianelli ◽  
Yannis Paloyelis ◽  
Lucia Ricciardi ◽  
Paul M. Jenkinson ◽  
Aikaterini Fotopoulou

Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size–weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present, double-blind, placebo-controlled, crossover study (n = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin might modulate processes of visuotactile multisensory integration by increasing the precision of top–down signals against bottom–up sensory input.
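The Bayesian account invoked here can be made concrete with a toy precision-weighted fusion of proprioceptive and visual estimates of hand position. The numbers and the two-cue Gaussian setup below are assumptions for illustration, not the authors' model; they simply show how attenuating somatosensory precision pulls the fused estimate toward the seen rubber hand.

# Toy precision-weighted cue fusion (illustrative only, not the authors' model).
# Positions in cm along the axis from the real hand (0 cm) to the rubber hand (15 cm).

def fuse(mu_proprio, prec_proprio, mu_visual, prec_visual):
    """Posterior mean of two Gaussian cues, weighted by their precisions."""
    total = prec_proprio + prec_visual
    return (prec_proprio * mu_proprio + prec_visual * mu_visual) / total

real_hand, rubber_hand = 0.0, 15.0

# Equal precisions: the felt position sits halfway between the two cues.
print(fuse(real_hand, 1.0, rubber_hand, 1.0))   # 7.5 cm

# Attenuated somatosensory precision (the hypothesized state during the illusion,
# and the direction in which IN-OT is argued to shift the balance):
print(fuse(real_hand, 0.2, rubber_hand, 1.0))   # 12.5 cm, close to the rubber hand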


2019 ◽  
Vol 116 (28) ◽  
pp. 13897-13902 ◽  
Author(s):  
Pierpaolo Iodice ◽  
Giuseppina Porciello ◽  
Ilaria Bufalari ◽  
Laura Barca ◽  
Giovanni Pezzulo

Interoception, or the sense of the internal state of the body, is key to the adaptive regulation of our physiological needs. Recent theories contextualize interoception within a predictive coding framework, according to which the brain both estimates and controls homeostatic and physiological variables, such as hunger, thirst, and effort levels, by orchestrating sensory, proprioceptive, and interoceptive signals from inside the body. This framework suggests that providing false interoceptive feedback may induce misperceptions of physiological variables, or “interoceptive illusions.” Here we ask whether it is possible to produce an illusory perception of effort by giving participants false acoustic feedback about their heart-rate frequency during an effortful cycling task. We found that participants reported higher levels of perceived effort when their heart-rate feedback was faster compared with when they cycled at the same intensity with veridical feedback. However, participants did not report lower effort when their heart-rate feedback was slower, which is reassuring, given that failing to notice one’s own effort is dangerous in ecologically valid conditions. Our results demonstrate that false cardiac feedback can produce interoceptive illusions. Furthermore, our results pave the way for novel experimental manipulations that use illusions to study interoceptive processing.


2018 ◽  
Author(s):  
Josipa Alilović ◽  
Bart Timmermans ◽  
Leon C. Reteig ◽  
Simon van Gaal ◽  
Heleen A. Slagter

Abstract: Predictive coding models propose that predictions (stimulus likelihood) reduce sensory signals as early as primary visual cortex (V1), and that attention (stimulus relevance) can modulate these effects. Indeed, both prediction and attention have been shown to modulate V1 activity, albeit with fMRI, which has low temporal resolution. This leaves it unclear whether these effects reflect a modulation of the first feedforward sweep of visual information processing and/or later, feedback-related activity. In two experiments, we used EEG and orthogonally manipulated spatial predictions and attention to address this issue. Although clear top-down biases were found, as reflected in pre-stimulus alpha-band activity, we found no evidence for top-down effects on the earliest visual cortical processing stage (<80 ms post-stimulus), as indexed by the amplitude of the C1 ERP component and multivariate pattern analyses. These findings indicate that initial visual afferent activity may be impenetrable to top-down influences by spatial prediction and attention.


1995 ◽  
Vol 7 (2) ◽  
pp. 21-23 ◽  
Author(s):  
S. Daan

The analysis of motivational systems underlying temporal organisation in animal behaviour has relied primarily on two conceptual functional frameworks: homeostasis and biological clocks. Homeostasis is one of the most general and influential concepts in physiology. Walter Cannon introduced homeostasis as a universal regulatory principle that animals employ to maintain the constancy of their ‘internal milieu’ in the face of challenges and perturbations from the external environment. Cannon spoke of “The Wisdom of the Body”, the collection of responses designed to defend the ideal internal state against those perturbations.


2013 ◽  
Vol 36 (3) ◽  
pp. 227-228 ◽  
Author(s):  
Anil K. Seth ◽  
Hugo D. Critchley

Abstract: The Bayesian brain hypothesis provides an attractive unifying framework for perception, cognition, and action. We argue that the framework can also usefully integrate interoception, the sense of the internal physiological condition of the body. Our model of “interoceptive predictive coding” entails a new view of emotion as interoceptive inference and may account for a range of psychiatric disorders of selfhood.


2018 ◽  
Author(s):  
Simona Monaco ◽  
Giulia Malfatti ◽  
Alessandro Zendron ◽  
Elisa Pellencin ◽  
Luca Turella

Abstract: Predictions of upcoming movements are based on several types of neural signals that span the visual, somatosensory, motor, and cognitive systems. Thus far, pre-movement signals have been investigated while participants viewed the object to be acted upon. Here, we studied the contribution of information other than vision to the classification of preparatory signals for action, even in the absence of online visual information. We used functional magnetic resonance imaging (fMRI) and multivoxel pattern analysis (MVPA) to test whether the neural signals evoked by visual, memory-based and somato-motor information can be reliably used to predict upcoming actions in areas of the dorsal and ventral visual streams during the preparatory phase preceding the action, while participants were lying still. Nineteen human participants (nine women) performed one of two actions towards an object with their eyes open or closed. Despite the well-known role of ventral stream areas in visual recognition tasks and the specialization of dorsal stream areas in somato-motor processes, we decoded action intention in areas of both streams based on visual, memory-based and somato-motor signals. Interestingly, we could reliably decode action intention in the absence of visual information based on neural activity evoked when visual information was available, and vice versa. Our results show a similar visual, memory and somato-motor representation of action planning in dorsal and ventral visual stream areas that allows action intention to be predicted across domains, regardless of the availability of visual information.
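The cross-domain decoding reported above can be sketched with a generic MVPA pipeline. scikit-learn is assumed, the data are synthetic placeholders, and the classifier choice is illustrative rather than the authors' exact pipeline; the key move is training on preparatory-phase patterns from eyes-open trials and testing on eyes-closed trials.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Synthetic stand-ins for preparatory-phase activity patterns in one region of
# interest: rows = trials, columns = voxels; labels code the intended action (0 or 1).
rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200
X_open, y_open = rng.normal(size=(n_trials, n_voxels)), rng.integers(0, 2, n_trials)
X_closed, y_closed = rng.normal(size=(n_trials, n_voxels)), rng.integers(0, 2, n_trials)

# Cross-domain decoding: fit on eyes-open trials, test on eyes-closed trials.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_open, y_open)
print(f"cross-domain decoding accuracy: {clf.score(X_closed, y_closed):.2f} (chance = 0.50)")

With real trial-wise activity estimates in place of the random arrays, above-chance accuracy in this train-on-one-domain, test-on-the-other scheme is what supports the claim of a shared action-planning representation across visual and memory-driven conditions.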


2018 ◽  
Author(s):  
Christian D. Márton ◽  
Makoto Fukushima ◽  
Corrie R. Camalier ◽  
Simon R. Schultz ◽  
Bruno B. Averbeck

Abstract: Predictive coding is a theoretical framework that provides a functional interpretation of top-down and bottom-up interactions in sensory processing. The theory suggests that specific frequency bands relay bottom-up and top-down information (e.g. “γ up, β down”), but it remains unclear whether this notion generalizes to cross-frequency interactions. Furthermore, most of the evidence so far comes from visual pathways. Here we examined cross-frequency coupling across four sectors of the auditory hierarchy in the macaque. We computed two measures of cross-frequency coupling, phase-amplitude coupling (PAC) and amplitude-amplitude coupling (AAC). Our findings revealed distinct patterns of bottom-up and top-down information processing among cross-frequency interactions. Both top-down and bottom-up processing made prominent use of low frequencies: low-to-low frequency (θ, α, β) couplings and low frequency-to-high γ couplings were predominantly top-down, while low frequency-to-low γ couplings were predominantly bottom-up. These patterns were largely preserved across coupling types (PAC and AAC) and across stimulus types (natural and synthetic auditory stimuli), suggesting they are a general feature of information processing in auditory cortex. Moreover, low-frequency PAC alternated between being predominantly top-down and predominantly bottom-up over time. Altogether, this suggests sensory information need not be propagated along separate frequencies upwards and downwards; rather, information can be unmixed by having low frequencies couple to distinct frequency ranges in the target region, and by alternating top-down and bottom-up processing over time.

Significance: The brain consists of highly interconnected cortical areas, yet the patterns of directional cortical communication are not fully understood, in particular with regard to interactions between different signal components across frequencies. We employed a unified, computationally advantageous Granger-causal framework to examine bi-directional cross-frequency interactions across four sectors of the auditory cortical hierarchy in macaques. Our findings extend the view of cross-frequency interactions in auditory cortex, suggesting they also play a prominent role in top-down processing. Our findings also suggest that information need not be propagated along separate channels up and down the cortical hierarchy, with important implications for theories of information processing in the brain such as predictive coding.
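Phase-amplitude coupling, one of the two coupling measures used here, can be estimated in a few lines. The sketch below uses the mean-vector-length method on Hilbert-transformed, band-filtered signals with synthetic data; it does not reproduce the authors' Granger-causal framework, and the frequency bands and signals are illustrative assumptions.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pac_mvl(phase_sig, amp_sig, fs, phase_band, amp_band):
    """Mean-vector-length estimate of phase-amplitude coupling between two signals."""
    phase = np.angle(hilbert(bandpass(phase_sig, *phase_band, fs)))
    amp = np.abs(hilbert(bandpass(amp_sig, *amp_band, fs)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

# Synthetic example: theta phase in a "lower" area modulating gamma amplitude
# in a "higher" area (purely illustrative signals, 10 s at 1 kHz).
fs = 1000
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
gamma = (1 + 0.5 * theta) * np.sin(2 * np.pi * 70 * t)
lfp_low = theta + 0.1 * np.random.randn(t.size)
lfp_high = gamma + 0.1 * np.random.randn(t.size)

print(pac_mvl(lfp_low, lfp_high, fs, phase_band=(4, 8), amp_band=(60, 80)))

Computing such a coupling value in both directions (lower-area phase to higher-area amplitude and the reverse) and across band pairs is the basic operation behind the directional coupling patterns summarized above.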

