From embodying tool to embodying alien limb: sensory-motor modulation of personal and extrapersonal space

2021, Vol 22 (S1), pp. 121-126
Author(s): Anna Berti

Abstract: Years ago, it was demonstrated (e.g., Rizzolatti et al. in Handbook of neuropsychology, Elsevier Science, Amsterdam, 2000) that the brain does not encode the space around us in a homogeneous way, but through neural circuits that map space according to the distance of objects of interest from the body. In monkeys, relatively discrete neural systems, characterized by neurons with specific neurophysiological responses, seem to be dedicated either to representing the space that can be reached by the hand (near/peripersonal space) or to distant space (far/extrapersonal space). It was also shown that the encoding of these spaces is dynamic, because they can be remapped by the use of tools that trigger different actions (e.g., Iriki et al. 1998). In this latter case, the effect of the tool depends on the modulation of personal space, that is, the space of our body. In this paper, I review and discuss selected research which demonstrated that, also in humans: (1) spaces are encoded in a dynamic way; (2) this encoding can be modulated by the use of tools that the system comes to consider as parts of one's own body; (3) body representations are not fixed, but are fragile and subject to change, to the point that we can incorporate not only the tools necessary for action, but even limbs belonging to other people. What the embodiment of tools and of alien limbs tells us about body representations is then briefly discussed.

2004, Vol 27 (3), pp. 377-396
Author(s): Rick Grush

The emulation theory of representation is developed and explored as a framework that can revealingly synthesize a wide variety of representational functions of the brain. The framework is based on constructs from control theory (forward models) and signal processing (Kalman filters). The idea is that in addition to simply engaging with the body and environment, the brain constructs neural circuits that act as models of the body and environment. During overt sensorimotor engagement, these models are driven by efference copies in parallel with the body and environment, in order to provide expectations of the sensory feedback, and to enhance and process sensory information. These models can also be run off-line in order to produce imagery, estimate outcomes of different actions, and evaluate and develop motor plans. The framework is initially developed within the context of motor control, where it has been shown that inner models running in parallel with the body can reduce the effects of feedback delay problems. The same mechanisms can account for motor imagery as the off-line driving of the emulator via efference copies. The framework is extended to account for visual imagery as the off-line driving of an emulator of the motor-visual loop. I also show how such systems can provide for amodal spatial imagery. Perception, including visual perception, results from such models being used to form expectations of, and to interpret, sensory input. I close by briefly outlining other cognitive functions that might also be synthesized within this framework, including reasoning, theory of mind phenomena, and language.
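To make the emulator idea concrete, here is a minimal one-dimensional sketch, assuming linear dynamics and Gaussian noise rather than anything specific to Grush's proposal: a forward model is driven by efference copies in parallel with the "body", and a Kalman filter combines its prediction with noisy sensory feedback. All parameter values are illustrative assumptions.

import numpy as np

# Minimal 1-D sketch of an "emulator": a forward model driven by efference
# copies, corrected by noisy sensory feedback via a Kalman filter.
# Dynamics and noise values are illustrative assumptions only.

A, B, C = 1.0, 0.1, 1.0      # state transition, motor gain, observation model
Q, R = 0.01, 0.25            # process and measurement noise variances

x_true = 0.0                 # actual state of the "body"
x_est, P = 0.0, 1.0          # emulator's estimate and its uncertainty

rng = np.random.default_rng(0)
for t in range(50):
    u = np.sin(0.2 * t)                       # motor command (efference copy)

    # Plant: the body/environment evolves and returns noisy feedback.
    x_true = A * x_true + B * u + rng.normal(0.0, np.sqrt(Q))
    y = C * x_true + rng.normal(0.0, np.sqrt(R))

    # Emulator prediction step: run the forward model on the efference copy.
    x_pred = A * x_est + B * u
    P_pred = A * P * A + Q

    # Correction step: compare predicted feedback with actual feedback.
    K = P_pred * C / (C * P_pred * C + R)     # Kalman gain
    x_est = x_pred + K * (y - C * x_pred)
    P = (1 - K * C) * P_pred

    # Running the prediction step alone, without the correction, would
    # correspond to "off-line" operation of the emulator, i.e., imagery.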


2018
Author(s): Axel Davies Vittersø, Monika Halicka, Gavin Buckingham, Michael J Proulx, Mark Wilson, ...

Representations of the body and peripersonal space can be distorted for people with some chronic pain conditions. Experimental pain induction can give rise to similar, but transient, distortions in healthy individuals. However, spatial and bodily representations are dynamic, and constantly update as we interact with objects in our environment. It is unclear whether induced pain disrupts the mechanisms involved in updating these representations. In the present study, we sought to investigate the effect of induced pain on the updating of peripersonal space and body representations during and following tool-use. We compared performance under three conditions (pain, active placebo, neutral) on a visuotactile crossmodal congruency task and a tactile distance judgement task to measure updating of peripersonal space and body representations, respectively. We induced pain by applying 1% capsaicin cream to the arm, and for placebo we used a gel that induced non-painful warming. Consistent with previous findings, the difference in crossmodal interference between visual distractors in the same and in the opposite visual field to the tactile target was smaller when the tools were crossed than when they were uncrossed. This suggests an extension of peripersonal space to incorporate the tips of the tools. Also consistent with previous findings, estimates of the felt distance between two points (tactile distance judgements) decreased after active tool-use. In contrast to our predictions, however, we found no evidence that pain interfered with performance on either task when compared to the control conditions. This suggests that the updating of peripersonal space and body representations is not disrupted by induced pain. Therefore, acute pain does not account for the distorted representations of the body and peripersonal space that can endure in people with chronic pain conditions.
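As a rough illustration of the congruency measure referred to above, the sketch below reduces the analysis to a difference in mean reaction times between same-field and opposite-field distractors, computed separately for uncrossed and crossed tools; this is a simplified, assumed reduction of the task, not the authors' actual pipeline, and all reaction times are invented.

import numpy as np

# Invented reaction times (ms) to tactile targets with a visual distractor in
# the same or the opposite visual field, for uncrossed and crossed tools.
rt = {
    ("uncrossed", "same"):     np.array([545, 560, 538, 552]),
    ("uncrossed", "opposite"): np.array([495, 505, 488, 500]),
    ("crossed", "same"):       np.array([520, 528, 515, 524]),
    ("crossed", "opposite"):   np.array([498, 506, 492, 503]),
}

def interference(tools):
    """RT cost of same-field relative to opposite-field distractors (ms)."""
    return rt[(tools, "same")].mean() - rt[(tools, "opposite")].mean()

# A smaller same-vs-opposite cost when the tools are crossed is interpreted
# as peripersonal space having been remapped onto the tool tips.
for tools in ("uncrossed", "crossed"):
    print(f"{tools}: interference = {interference(tools):.1f} ms")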


2018
Author(s): Justine Cléry, Olivier Guipponi, Soline Odouard, Claire Wardak, Suliann Ben Hamed

Abstract: While extra-personal space is often erroneously considered a single entity, early neuropsychological studies report a dissociation between near and far space processing, both in humans and in monkeys. Here, we use functional MRI in a naturalistic 3D environment to describe the non-human primate near and far space cortical networks. We describe the co-occurrence of two extended functional networks, respectively dedicated to near and far space processing. Specifically, far space processing involves occipital, temporal, parietal, posterior cingulate as well as orbitofrontal regions not activated by near space, possibly subserving the processing of the shape and identity of objects. In contrast, near space processing involves temporal, parietal and prefrontal regions not activated by far space, possibly subserving the preparation of an arm/hand-mediated action in this proximal space. Interestingly, this network also involves somatosensory regions, suggesting a cross-modal anticipation of touch by a nearby object. Last, we also describe cortical regions that process both far and near space with a preference for one or the other. This suggests a continuous encoding of relative distance to the body, in the form of a far-to-near gradient. We propose that these cortical gradients in space representation subserve the physically delineable peripersonal spaces described in numerous psychology and psychophysics studies.
Highlights:
- Near space processing involves temporal, parietal and prefrontal regions.
- Far space activates occipital, temporal, parietal, cingulate and orbitofrontal areas.
- Most regions process both far and near space, with a preference for one or the other.
- A far-to-near gradient may subserve behavioral changes in peripersonal space size.


2021, pp. 315-330
Author(s): Michael S.A. Graziano

The brain evolved to give special representation to the space immediately around the body. One of the most obvious adaptive uses of that peripersonal space is self-protection. It is a safety buffer zone, and intrusions can trigger a suite of protective behaviours. Perhaps less obvious is the possible relationship between that complex protective mechanism and social signalling. Standing tall, cringing, power poses and handshakes, even coquettish tilts of the head that expose the neck, may all relate in some manner to that safety buffer, signalling to others that one’s protective mechanisms are heightened (when anxious) or reduced (when confident). Here I propose that some of our most fundamental human emotional expressions such as smiling, laughing, and crying may also have a specific evolutionary relationship to the buffer zone around the body, deriving ultimately from the reflexive actions that protect us.


2012, Vol 25 (0), pp. 88
Author(s): Annick De Paepe, Valéry Legrain, Geert Crombez

Localizing pain requires not only a somatotopic representation of the body, but also knowledge of limb position (i.e., proprioception) and a visual localization of the pain source in external space. Therefore, nociceptive events are remapped into a multimodal representation of the body and the space nearby (i.e., a peripersonal schema of the body). We investigated the influence of visual cues presented either in peripersonal or in extrapersonal space on the localization of nociceptive stimuli in a temporal order judgement (TOJ) task. Twenty-four psychology students made TOJs concerning which of two nociceptive stimuli (one applied to each hand) had been presented first (or last). A spatially non-predictive visual cue (i.e., the lighting of an LED) preceded the nociceptive stimuli by 80 ms. This cue was presented randomly either on the hand of the participant (in peripersonal space) or 70 cm in front of the hand (in extrapersonal space), and either on the left or on the right side of space. Biases in spatial attention are reflected by the point of subjective simultaneity (PSS). The results revealed that TOJs were more strongly biased towards visual cues in peripersonal space than towards visual cues in extrapersonal space. This study provides evidence for the crossmodal integration of visual and nociceptive stimuli in a peripersonal schema of the body. Future research with this paradigm will explore crossmodal attention deficits in chronic pain populations.
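For readers unfamiliar with the PSS, the sketch below shows one common way of estimating it, assumed here rather than taken from the authors' methods: fit a cumulative Gaussian to the proportion of "right hand first" judgements across stimulus onset asynchronies and read off the SOA at which both orders are equally likely. All data values are invented.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical TOJ data: stimulus onset asynchronies (ms; negative = left hand
# first) and proportion of "right hand first" responses, invented for
# illustration only.
soa = np.array([-200, -90, -30, 0, 30, 90, 200], dtype=float)
p_right_first = np.array([0.05, 0.20, 0.40, 0.55, 0.70, 0.90, 0.97])

def psychometric(x, pss, sigma):
    """Cumulative Gaussian: probability of judging the right stimulus first."""
    return norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_right_first, p0=(0.0, 50.0))

# The PSS is the SOA at which both orders are reported equally often (p = 0.5).
# A shift of the PSS towards one side indexes a bias in spatial attention,
# e.g., one induced by a visual cue near that hand.
print(f"PSS = {pss:.1f} ms, slope (sigma) = {sigma:.1f} ms")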


2000, Vol 355 (1404), pp. 1685-1754
Author(s): I.M.L. Donaldson

This article sets out to present a fairly comprehensive review of our knowledge about the functions of the receptors that have been found in the extraocular muscles – the six muscles that move each eye of vertebrates in its orbit – of all the animals in which they have been sought, including Man. Since their discovery at the beginning of the 20th century, these receptors have, at various times, been credited with important roles in the control of eye movement and the construction of extrapersonal space and have also been denied any function whatsoever. Experiments intended to study the actions of eye muscle receptors and, even more so, opinions (and indeed polemic) derived from these observations have been influenced by the changing fashions and beliefs about the more general question of how limb position and movement are detected by the brain and which signals contribute to those aspects of this that are perceived (kinaesthesis). But the conclusions drawn from studies on the eye have also influenced beliefs about the mechanisms of kinaesthesis and, arguably, this influence has been even larger than that in the converse direction. Experimental evidence accumulated over rather more than a century is set out and discussed. It supports the view that, at the beginning of the 21st century, there are excellent grounds for believing that the receptors in the extraocular muscles are indeed proprioceptors, that is to say that the signals that they send into the brain are used to provide information about the position and movement of the eye in the orbit. It seems that this information is important in the control of eye movements of at least some types, and in the determination by the brain of the direction of gaze and the relationship of the organism to its environment. In addition, signals from these receptors in the eye muscles are seen to be necessary for the development of normal mechanisms of visual analysis in the mammalian visual cortex and for both the development and maintenance of normal visuomotor behaviour. Man is among those vertebrates to whose brains eye muscle proprioceptive signals provide information apparently used in normal sensorimotor functions; these include various aspects of perception, and of the control of eye movement. It is possible that abnormalities of the eye muscle proprioceptors and their signals may play a part in the genesis of some types of human squint (strabismus); conversely, studies of patients with squint in the course of their surgical or pharmacological treatment have yielded much interesting evidence about the central actions of the proprioceptive signals from the extraocular muscles. The results of experiments on the eye have played a large part in the historical controversy, now in at least its third century, about the origin of signals that inform the brain about movement of parts of the body. Some of these results, and more of the interpretations of them, now need to be critically re-examined. The re-examination presented here, in the light of recent experiments, does not support many of the conclusions confidently drawn in the past and leads to both new insights and fresh questions about the roles of information from motor signals flowing out of the brain and of signals from peripheral receptors flowing into it. There remain many lacunae in our knowledge, and filling some of these will, it is contended, be essential to advance our understanding further.
It is argued that such understanding of eye muscle proprioception is a necessary part of the understanding of the physiology and pathophysiology of eye movement control and that it is also essential to an account of how organisms, including Man, build and maintain knowledge of their relationship to the external visual world. The eye would seem to provide a uniquely favourable system in which to study the way in which information derived within the brain about motor actions may interact with signals flowing in from peripheral receptors. The review is constructed in relatively independent sections that deal with particular topics. It ends with a fairly brief piece in which the author sets out some personal views about what has been achieved recently and what most immediately needs to be done. It also suggests some lines of study that appear to the author to be important for the future.


2015, Vol 112 (29), pp. 9118-9122
Author(s): Andrew S. Fox, Jonathan A. Oler, Alexander J. Shackman, Steven E. Shelton, Muthuswamy Raveendran, ...

Understanding the heritability of neural systems linked to psychopathology is not sufficient to implicate them as intergenerational neural mediators. By closely examining how individual differences in neural phenotypes and psychopathology cosegregate as they fall through the family tree, we can identify the brain systems that underlie the parent-to-child transmission of psychopathology. Although research has identified genes and neural circuits that contribute to the risk of developing anxiety and depression, the specific neural systems that mediate the inborn risk for these debilitating disorders remain unknown. In a sample of 592 young rhesus monkeys that are part of an extended multigenerational pedigree, we demonstrate that metabolism within a tripartite prefrontal-limbic-midbrain circuit mediates some of the inborn risk for developing anxiety and depression. Importantly, although brain volume is highly heritable early in life, it is brain metabolism—not brain structure—that is the critical intermediary between genetics and the childhood risk to develop stress-related psychopathology.


2001, Vol 13 (2), pp. 181-189
Author(s): Sandeep Vaishnavi, Jesse Calhoun, Anjan Chatterjee

Behavioral and neurophysiological studies suggest that the brain constructs different representations of space. Among these representations are personal and peripersonal space. Personal space refers to the space occupied by our bodies. Peripersonal space refers to the space surrounding our bodies, which can be reached by our limbs. How these two representations are bound to give a unified sense of space in which humans act is not clear. We tested 10 patients with tactile extinction to investigate this issue. Tactile extinction is an attentional disorder in which patients are unaware of being touched on their contralesional limb if they are also touched simultaneously on their ipsilesional limb. We hypothesized that mechanisms that bind personal and peripersonal representations would improve these patients' awareness of being touched on their contralesional limbs. Visual-tactile integration and intentional movements were considered candidate mechanisms. Patients were more likely to be aware of contralesional touch when looking towards their contralesional limb than when looking towards their ipsilesional limb, and when actively moving onto tactile probes than when receiving tactile stimuli passively. The improved awareness of being touched on the contralesional limb under these conditions suggests that cross-sensory and sensorimotor integration help bind personal and peripersonal space.


2021, pp. 152-180
Author(s): Matej Hoffmann

Humans and animals excel at combining information from multiple sensory modalities, controlling their complex bodies, adapting to growth or failures, and using tools. The key foundation is an internal representation of the body that the agent (human, animal, or robot) has developed. In the biological realm, evidence has been accumulating in diverse disciplines, giving rise to the concepts of body image, body schema, and others. In robotics, a model of the robot is an indispensable component that makes it possible to control the machine. This chapter compares the character of body representations in biology with their robotic counterparts and relates this comparison to the differences in performance observed. Conclusions are drawn about how robots can inform the biological sciences dealing with body representations, and about which features of the 'body in the brain' should be transferred to robots, giving rise to more adaptive and resilient self-calibrating machines.
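As a toy illustration of the kind of internal body model that robot control relies on (a sketch of the general idea, not anything taken from the chapter), the code below uses the forward kinematics of a two-link planar arm, plus its Jacobian, to drive the hand towards a target; link lengths, gains, and the target are arbitrary assumptions.

import numpy as np

# Toy "body model" of a 2-link planar arm: forward kinematics serve as the
# internal model that control relies on. All values are illustrative.

L1, L2 = 0.3, 0.25          # link lengths (m), assumed

def forward_kinematics(q):
    """End-effector position for joint angles q = (shoulder, elbow)."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    """Sensitivity of the hand position to small joint-angle changes."""
    j11 = -L1 * np.sin(q[0]) - L2 * np.sin(q[0] + q[1])
    j12 = -L2 * np.sin(q[0] + q[1])
    j21 = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    j22 = L2 * np.cos(q[0] + q[1])
    return np.array([[j11, j12], [j21, j22]])

# Simple model-based reaching: iteratively adjust the joints so the modelled
# hand position approaches a target. If the model is miscalibrated (e.g.,
# after growth or damage), this loop fails -- hence the interest in
# self-calibrating body representations.
q = np.array([0.3, 0.8])
target = np.array([0.35, 0.25])
for _ in range(100):
    error = target - forward_kinematics(q)
    q = q + 0.5 * np.linalg.pinv(jacobian(q)) @ error

print("reached:", np.round(forward_kinematics(q), 3), "target:", target)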


2021
Author(s): Belle Liu, Arthur Hong, Fred Rieke, Michael B. Manookin

Successful behavior relies on the ability to use information obtained from past experience to predict what is likely to occur in the future. A salient example of predictive encoding comes from the vertebrate retina, where neural circuits encode information that can be used to estimate the trajectory of a moving object. Predictive computations should be a general property of sensory systems, but the features needed to identify these computations across neural systems are not well understood. Here, we identify several properties of predictive computations in the primate retina that likely generalize across sensory systems. These features include calculating the derivative of incoming signals, sparse signal integration, and delayed response suppression. These findings provide a deeper understanding of how the brain carries out predictive computations and identify features that can be used to recognize these computations throughout the brain.
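As a toy illustration of the first of these features, differentiation of incoming signals, the sketch below adds a fraction of the temporal derivative to a smoothly varying input, producing a response that peaks before the input itself and thus anticipates its trajectory; all parameters are invented rather than taken from the paper.

import numpy as np

# Toy illustration of predictive encoding via a derivative term: a response
# built from the signal plus its temporal derivative peaks earlier than the
# signal itself, anticipating the trajectory of a smoothly varying input.
# All parameters are illustrative assumptions.

dt = 1e-3                                   # 1 ms time step
t = np.arange(0.0, 1.0, dt)
stimulus = np.exp(-((t - 0.5) ** 2) / (2 * 0.05 ** 2))   # bump-like input

derivative = np.gradient(stimulus, dt)      # rate of change of the input
tau = 0.02                                  # seconds of effective "lead"
predictive = np.clip(stimulus + tau * derivative, 0.0, None)  # rectified drive

lead_ms = (t[np.argmax(stimulus)] - t[np.argmax(predictive)]) * 1e3
print(f"Predictive response peaks {lead_ms:.0f} ms before the stimulus peak")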

