The emulation theory of representation: Motor control, imagery, and perception

2004 ◽  
Vol 27 (3) ◽  
pp. 377-396 ◽  
Author(s):  
Rick Grush

The emulation theory of representation is developed and explored as a framework that can revealingly synthesize a wide variety of representational functions of the brain. The framework is based on constructs from control theory (forward models) and signal processing (Kalman filters). The idea is that in addition to simply engaging with the body and environment, the brain constructs neural circuits that act as models of the body and environment. During overt sensorimotor engagement, these models are driven by efference copies in parallel with the body and environment, in order to provide expectations of the sensory feedback and to enhance and process sensory information. These models can also be run off-line in order to produce imagery, estimate outcomes of different actions, and evaluate and develop motor plans. The framework is initially developed within the context of motor control, where it has been shown that inner models running in parallel with the body can mitigate the effects of feedback delays. The same mechanisms can account for motor imagery as the off-line driving of the emulator via efference copies. The framework is extended to account for visual imagery as the off-line driving of an emulator of the motor-visual loop. I also show how such systems can provide for amodal spatial imagery. Perception, including visual perception, results from such models being used to form expectations of, and to interpret, sensory input. I close by briefly outlining other cognitive functions that might also be synthesized within this framework, including reasoning, theory of mind phenomena, and language.
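The forward-model/Kalman-filter scheme the abstract describes can be sketched in a few lines. The code below is a minimal scalar illustration under assumed parameters, not Grush's model: the dynamics (a, b) and noise terms (q, r) are hypothetical, the "emulator" is the predict step driven by an efference copy u, and the Kalman gain blends its prediction with the noisy sensory signal z. `run_offline` shows the same model driven by efference copies alone, as in imagery.

```python
# A minimal scalar sketch of the forward-model / Kalman-filter idea
# (dynamics a, b and noise terms q, r are hypothetical, not from the paper).

def kalman_step(x_est, p, u, z, a=1.0, b=0.5, q=0.01, r=0.25):
    """One predict/correct cycle: the 'emulator' predicts, the gain corrects."""
    # Predict: the internal model is driven by the efference copy u,
    # in parallel with the real body/environment.
    x_pred = a * x_est + b * u
    p_pred = a * p * a + q
    # Correct: compare the predicted sensory feedback with the observation z.
    k = p_pred / (p_pred + r)           # Kalman gain
    x_new = x_pred + k * (z - x_pred)   # innovation-weighted update
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

def run_offline(x_est, efference_copies, a=1.0, b=0.5):
    """'Off-line' mode: drive the emulator by efference copies alone (imagery)."""
    trajectory = []
    for u in efference_copies:
        x_est = a * x_est + b * u
        trajectory.append(x_est)
    return trajectory
```

During overt engagement the loop runs `kalman_step` each tick; cutting the sensory input and iterating only the predict step is the off-line mode the abstract attributes to imagery and motor planning.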

Author(s):  
Daya Gupta ◽  
Andreas Bahmer

Perception and motor interaction with physical surroundings can be analyzed through the changes in probability laws governing two possible outcomes of neuronal activity, namely the presence or absence of spikes (binary states). Perception and motor interaction with the physical environment are accounted for partly by the reduction in entropy within the probability distributions of binary states of neurons in distributed neural circuits, given knowledge about the characteristics of stimuli in the physical surroundings. This reduction in the total entropy of multiple pairs of circuits in networks, by an amount equal to the increase of mutual information among them, occurs as sensory information is processed successively from lower to higher cortical areas, or between areas at the same hierarchical level that belong to different networks. The increase in mutual information is partly accounted for by temporal coupling as well as by synaptic connections, as proposed by Bahmer and Gupta [1]. We propose that robust increases in mutual information, measuring the association between the characteristics of sensory inputs and neural circuit connectivity patterns, are partly responsible for perception and successful motor interactions with physical surroundings. It is also argued that perception from a sensory input results from the networking of many circuits to a common circuit that primarily processes the given sensory input.
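The entropy-reduction claim can be made concrete with a small calculation. Below is a generic plug-in estimate of the mutual information between the binary (spike / no-spike) states of two circuits, computed from jointly observed state pairs via I(X;Y) = H(X) + H(Y) − H(X,Y); the data and estimator are illustrative, not the authors' analysis.

```python
from collections import Counter
from math import log2

# Plug-in mutual-information estimate between the binary (spike / no-spike)
# states of two circuits, from a list of jointly observed state pairs.
# Data and usage here are illustrative, not the authors' method.

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(pairs):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) over observed (x, y) pairs."""
    n = len(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    pxy = Counter(pairs)
    hx = entropy([c / n for c in px.values()])
    hy = entropy([c / n for c in py.values()])
    hxy = entropy([c / n for c in pxy.values()])
    return hx + hy - hxy
```

Perfectly coupled circuits give 1 bit of mutual information; independent ones give 0, so the joint entropy H(X,Y) drops by exactly the amount of information the circuits share.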


2021 ◽  
pp. 73-140
Author(s):  
Michael A. Arbib

Architects design spaces that offer perceptual cues, affordances, for our various effectivities. Lina Bo Bardi’s São Paulo Museum demonstrates how praxic and contemplative actions are interleaved—space is effective and affective. Navigation often extends beyond wayfinding to support ongoing behavior. Scripts set out the general rules for a particular kind of behavior, and may suggest places that a building must provide. Cognitive maps support wayfinding. Other maps in the brain represent sensory or motor patterns of activity. Juhani Pallasmaa’s reflections on The Thinking Hand lead into a view of how the brain mediates that thinking, modeling hand–eye coordination at two levels. The first coordinates perceptual and motor schemas. The body schema is an adaptable collage of perceptual and motor skills. The second coordinates the ventral “what” pathway that can support planning of actions, and the dorsal “how” pathway that links affordance-related details to motor control. A complementary challenge is understanding how schemas in the head relate to social schemas. Finally, the chapter compares the cognitive challenges in designing a building and in developing a computational brain model of cognitive processes.


2017 ◽  
Author(s):  
Olivia K Faull ◽  
Anja Hayen ◽  
Kyle T S Pattinson

Breathlessness debilitates millions of people with chronic illness. Mismatch between breathlessness severity and objective disease markers is common and poorly understood. Traditionally, sensory perception was conceptualised as a stimulus-response relationship, although this cannot explain how conditioned symptoms may occur in the absence of physiological signals from the lungs or airways. A Bayesian model is now proposed in which the brain generates sensations based on expectations learned from past experiences (priors), which are then checked against incoming afferent signals. In this model, psychological factors may act as moderators: they may either alter priors, or change the relative attention towards incoming sensory information, leading to more variable interpretation of an equivalent afferent input. In the present study we conducted a preliminary test of this model in a supplementary analysis of previously published data (Hayen 2017). We hypothesised that individual differences in psychological traits (anxiety, depression, anxiety sensitivity) would correlate with the variability of subjective evaluation of equivalent breathlessness challenges. To better understand the resulting inferential leap in the brain, we explored whether these behavioural measures correlated with activity in areas governing either prior generation or sensory afferent input. Behaviourally, anxiety sensitivity was found to positively correlate with each subject's variability of intensity and unpleasantness ratings during mild breathlessness, and with unpleasantness ratings during strong breathlessness. In the brain, anxiety sensitivity was found to positively correlate with activity in the anterior insula during mild breathlessness, and to negatively correlate with activity in parietal sensorimotor areas during strong breathlessness. Our findings suggest that anxiety sensitivity may reduce the robustness of this Bayesian sensory perception system, increasing the variability of breathlessness perception and possibly susceptibility to symptom misinterpretation. These preliminary findings in healthy individuals demonstrate how differences in psychological function influence the way we experience bodily sensations, which might direct us towards better understanding of symptom mismatch in clinical populations.
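The precision-weighting at the heart of such Bayesian models can be illustrated with two Gaussian sources. The sketch below is a textbook posterior-mean computation with made-up numbers, not the analysis in the study: shifting precision (e.g. attention) between the prior and the afferent input changes how strongly an identical input is pulled toward the prior.

```python
# Posterior mean of two Gaussian sources, weighted by precision (1/variance).
# All values are made up for illustration; this is not the study's analysis.

def bayesian_percept(prior_mean, prior_precision, input_value, input_precision):
    """Precision-weighted blend of a learned prior and the afferent input."""
    total = prior_precision + input_precision
    return (prior_precision * prior_mean + input_precision * input_value) / total
```

Lowering `input_precision` (attention shifted away from the body) makes the percept track the prior, so identical afferent signals can yield more prior-dominated, and across occasions more variable, interpretations.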


2016 ◽  
Vol 121 (3) ◽  
pp. 760-770 ◽  
Author(s):  
C. N. Gambelli ◽  
D. Theisen ◽  
P. A. Willems ◽  
B. Schepens

Landing on the ground on one's feet implies that the energy gained during the fall be dissipated. The aim of this study was to assess human motor control of landing under different conditions of fall initiation, simulated gravity, and sensory input. Six participants performed drop landings using a trapdoor system and landings from self-initiated counter-movement jumps, with gravity conditions simulated in a weightlessness environment by pull-down forces of 1, 0.6, 0.4, and 0.2 g. External forces applied to the body, the orientation of the lower limb segments, and the activity of six lower limb muscles were recorded synchronously. Our results show that 1) subjects are able to land and stabilize in all experimental conditions; 2) prelanding muscular activity is always present, emphasizing the capacity of the central nervous system to estimate the instant of touchdown; 3) kinetics and muscular activity are adjusted to the amount of energy gained during the fall; and 4) landing seems less finely controlled in drop landings, as suggested by higher impact forces and loading rates and lower mechanical work done during landing for a given amount of energy to be dissipated. In conclusion, humans seem able to adapt the control of landing according to the amount of energy to be dissipated in an environment where sensory information is altered, even under conditions of non-self-initiated falls.
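The "energy gained during the fall" scales linearly with the effective gravity level, which is why the pull-down force conditions matter. A back-of-the-envelope sketch with hypothetical mass and drop height: E = m·g·h, with touchdown speed v = √(2gh) for a passive drop.

```python
# Hypothetical numbers; the physics is just E = m*g*h and v = sqrt(2*g*h).

def energy_to_dissipate(mass_kg, g_ms2, drop_height_m):
    """Potential energy converted during the fall, to be absorbed at landing."""
    return mass_kg * g_ms2 * drop_height_m

def impact_velocity(g_ms2, drop_height_m):
    """Touchdown speed after a passive drop."""
    return (2.0 * g_ms2 * drop_height_m) ** 0.5
```

Halving the effective gravity halves the energy to dissipate for the same drop height, consistent with the finding that kinetics and muscle activity scale with fall energy.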


Author(s):  
Erik Böhm ◽  
Daniela Brunert ◽  
Markus Rothermel

Basal forebrain modulation of central circuits is associated with active sensation, attention, and learning. While cholinergic modulation has been studied extensively, the effect of non-cholinergic basal forebrain subpopulations on sensory processing remains largely unclear. Here, we directly compare the effects of optogenetic manipulation of two major basal forebrain subpopulations on principal neuron activity in an early sensory processing area, i.e., mitral/tufted cells (MTCs) in the olfactory bulb. In contrast to cholinergic projections, which consistently increased MTC firing, activation of GABAergic fibers from the basal forebrain to the olfactory bulb leads to differential modulation effects: while spontaneous MTC activity is mainly inhibited, odor-evoked firing is predominantly enhanced. Moreover, sniff-triggered averages revealed an enhancement of the maximal sniff-evoked firing amplitude and an inhibition of firing rates outside the maximal sniff phase. These findings demonstrate that GABAergic neuromodulation affects MTC firing in a bimodal, sensory-input-dependent way, suggesting that GABAergic basal forebrain modulation could be an important factor in attention-mediated filtering of sensory information to the brain.


2018 ◽  
Vol 29 (4) ◽  
pp. 496-503 ◽  
Author(s):  
Erika H. Siegel ◽  
Jolie B. Wormwood ◽  
Karen S. Quigley ◽  
Lisa Feldman Barrett

Affective realism, the phenomenon whereby affect is integrated into an individual's experience of the world, is a normal consequence of how the brain processes sensory information from the external world in the context of sensations from the body. In the present investigation, we provide empirical evidence that affective realism involves changes in visual perception (i.e., affect changes how participants see neutral stimuli). In two studies, we used an interocular suppression technique, continuous flash suppression, to present affective images outside of participants' conscious awareness. In both studies, we demonstrated that seen neutral faces are perceived as more smiling when paired with unseen affectively positive stimuli. Study 2 also demonstrated that seen neutral faces are perceived as more scowling when paired with unseen affectively negative stimuli. These findings have implications for real-world situations and challenge beliefs that affect is a distinct psychological phenomenon that can be separated from cognition and perception.


1992 ◽  
Vol 2 (4) ◽  
pp. 307-322
Author(s):  
James R. Lackner

Human sensory-motor control and orientation involve the correlation of sensory information from many modalities with motor information about ongoing patterns of voluntary and reflexive activation of the body musculature. The vestibular system represents only one of the acceleration-sensitive receptor systems of the body conveying spatial information. Touch- and pressure-dependent receptors, somatosensory and interoceptive, as well as proprioceptive receptors contribute, along with visual and auditory signals specifying relative motion between self and surround. Control of body movement and orientation is dynamically adapted to the 1G force background of Earth. Exposure to non-1G environments such as in space travel produces a variety of sensory-motor disturbances, and often motion sickness, until adaptation is achieved. Exposure to virtual environments in which body movements are not accompanied by normal patterns of inertial and sensory feedback can also lead to control errors and elicit motion sickness.


2020 ◽  
Vol 117 (13) ◽  
pp. 7510-7515 ◽  
Author(s):  
Tessel Blom ◽  
Daniel Feuerriegel ◽  
Philippa Johnson ◽  
Stefan Bode ◽  
Hinze Hogendoorn

The transmission of sensory information through the visual system takes time. As a result of these delays, the visual information available to the brain always lags behind the timing of events in the present moment. Compensating for these delays is crucial for functioning within dynamic environments, since interacting with a moving object (e.g., catching a ball) requires real-time localization of the object. One way the brain might achieve this is via prediction of anticipated events. Using time-resolved decoding of electroencephalographic (EEG) data, we demonstrate that the visual system represents the anticipated future position of a moving object, showing that predictive mechanisms activate the same neural representations as afferent sensory input. Importantly, this activation is evident before sensory input corresponding to the stimulus position is able to arrive. Finally, we demonstrate that, when predicted events do not eventuate, sensory information arrives too late to prevent the visual system from representing what was expected but never presented. Taken together, we demonstrate how the visual system can implement predictive mechanisms to preactivate sensory representations, and argue that this might allow it to compensate for its own temporal constraints, allowing us to interact with dynamic visual environments in real time.
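The delay-compensation idea can be stated as simple linear extrapolation. The sketch below assumes a constant-velocity object and a hypothetical transmission latency; it illustrates the computational problem, not the EEG decoding analysis.

```python
# Constant-velocity extrapolation over a hypothetical transmission latency.
# This illustrates the computational problem, not the paper's EEG analysis.

def extrapolated_position(last_sensed_pos, velocity, latency_s):
    """Where the object will be once the delayed signal is finally 'current'."""
    return last_sensed_pos + velocity * latency_s
```

A system that preactivates the representation at the extrapolated position, rather than the last sensed one, can localize a moving object in real time; when the object unexpectedly stops, that preactivation is exactly the "expected but never presented" representation the study observed.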


2018 ◽  
Vol 120 (5) ◽  
pp. 2164-2181
Author(s):  
Kristin M. Quick ◽  
Jessica L. Mischel ◽  
Patrick J. Loughlin ◽  
Aaron P. Batista

Everyday behaviors require that we interact with the environment, using sensory information in an ongoing manner to guide our actions. Yet, by design, many of the tasks used in primate neurophysiology laboratories can be performed with limited sensory guidance. As a consequence, our knowledge about the neural mechanisms of motor control is largely limited to the feedforward aspects of the motor command. To study the feedback aspects of volitional motor control, we adapted the critical stability task (CST) from the human performance literature (Jex H, McDonnell J, Phatak A. IEEE Trans Hum Factors Electron 7: 138–145, 1966). In the CST, our monkey subjects interact with an inherently unstable (i.e., divergent) virtual system and must generate sensory-guided actions to stabilize it about an equilibrium point. The difficulty of the CST is determined by a single parameter, which allows us to quantitatively establish the limits of performance in the task for different sensory feedback conditions. Two monkeys learned to perform the CST with visual or vibrotactile feedback. Performance was better under visual feedback, as expected, but both monkeys were able to use vibrotactile feedback alone to perform the CST successfully. We also observed changes in behavioral strategy as the task became more challenging. The CST will have value for basic science investigations of the neural basis of sensory-motor integration during ongoing actions, and it may also prove valuable for the design and testing of bidirectional brain-computer interface systems. NEW & NOTEWORTHY Currently, most behavioral tasks used in motor neurophysiology studies require primates to make short-duration, stereotyped movements that do not necessitate sensory feedback. To improve our understanding of sensorimotor integration, and to engineer meaningful artificial sensory feedback systems for brain-computer interfaces, it is crucial to have a task that requires sensory feedback for good control. The critical stability task demands that sensory information be used to guide long-duration movements.
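The character of the CST can be conveyed with a toy simulation. The sketch below is not the authors' task code; dynamics, gain, and delay are all hypothetical. The plant diverges as dx/dt = λx, the controller only sees the state after a feedback delay, and raising the instability parameter λ beyond what the delayed feedback can cancel makes stabilization fail.

```python
# Toy version of an unstable controlled element with delayed feedback.
# Dynamics, gain, and delay are hypothetical, not the task's actual code.

def simulate_cst(lam, delay, gain=3.0, dt=0.01, steps=2000, x0=0.1):
    """Return the final |x|; a small value means the delayed controller coped."""
    xs = [x0] * (delay + 1)             # state history, oldest first
    for _ in range(steps):
        u = -gain * xs[-1 - delay]      # act on stale sensory feedback
        x = xs[-1] + dt * (lam * xs[-1] + u)
        xs.append(x)
        if abs(x) > 1e6:                # state escaped: task failure
            break
    return abs(xs[-1])
```

Because difficulty is swept by a single parameter (here λ), the largest λ a subject can still stabilize gives a quantitative performance limit for each feedback condition.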


2016 ◽  
Vol 114 (2) ◽  
pp. 412-417 ◽  
Author(s):  
Neil W. Roach ◽  
Paul V. McGraw ◽  
David J. Whitaker ◽  
James Heron

To enable effective interaction with the environment, the brain combines noisy sensory information with expectations based on prior experience. There is ample evidence showing that humans can learn statistical regularities in sensory input and exploit this knowledge to improve perceptual decisions and actions. However, fundamental questions remain regarding how priors are learned and how they generalize to different sensory and behavioral contexts. In principle, maintaining a large set of highly specific priors may be inefficient and restrict the speed at which expectations can be formed and updated in response to changes in the environment. However, priors formed by generalizing across varying contexts may not be accurate. Here, we exploit rapidly induced contextual biases in duration reproduction to reveal how these competing demands are resolved during the early stages of prior acquisition. We show that observers initially form a single prior by generalizing across duration distributions coupled with distinct sensory signals. In contrast, they form multiple priors if distributions are coupled with distinct motor outputs. Together, our findings suggest that rapid prior acquisition is facilitated by generalization across experiences of different sensory inputs but organized according to how that sensory information is acted on.
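The contrast between one generalized prior and several context-specific priors can be sketched as two bookkeeping strategies over the same experience. The numbers and context labels below are hypothetical, and real models would track full distributions rather than running means.

```python
# Two bookkeeping strategies over the same experience (numbers hypothetical):
# pool samples across contexts into one prior, or keep one prior per context.

def pooled_prior(samples_by_context):
    """Single generalized prior: mean of all samples, ignoring context."""
    all_samples = [s for ss in samples_by_context.values() for s in ss]
    return sum(all_samples) / len(all_samples)

def specific_priors(samples_by_context):
    """Context-specific priors: one running mean per context."""
    return {c: sum(ss) / len(ss) for c, ss in samples_by_context.items()}
```

The study's finding maps onto choosing the pooled strategy when contexts differ only in sensory signal, and the specific strategy when they differ in the motor output required.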

