The Effects of Optical Illusions in Perception and Action in Peripersonal and Extrapersonal Space

Perception ◽  
2017 ◽  
Vol 46 (9) ◽  
pp. 1118-1126 ◽  
Author(s):  
Jaeho Shim ◽  
John van der Kamp

While the two visual system hypothesis tells a fairly compelling story about perception and action in peripersonal space (i.e., within arm’s reach), its validity for extrapersonal space is far less established and remains controversial. Hence, the present purpose was to assess whether the perception–action differences found in peripersonal space hold in extrapersonal space and are modulated by the same factors. To this end, the effects of an optical illusion on perception and action in both peripersonal and extrapersonal space were compared in three groups that threw balls toward a distant target under different target eccentricity (i.e., target fixated vs. in the peripheral field), viewing (i.e., binocular vs. monocular), and delay (i.e., immediate vs. delayed action) conditions. The illusory bias was smaller in action than in perception in peripersonal space, but this difference was significantly reduced in extrapersonal space, primarily because of a weakening of the bias in perception. No systematic modulation by target eccentricity, viewing, or delay arose. The findings suggest that the two visual system hypothesis is also valid for extrapersonal space.

2021 ◽  
Vol 22 (S1) ◽  
pp. 121-126
Author(s):  
Anna Berti

Years ago, it was demonstrated (e.g., Rizzolatti et al. in Handbook of Neuropsychology, Elsevier Science, Amsterdam, 2000) that the brain does not encode the space around us in a homogeneous way, but through neural circuits that map space relative to the distance of objects of interest from the body. In monkeys, relatively discrete neural systems, characterized by neurons with specific neurophysiological responses, seem to be dedicated either to representing the space that can be reached by the hand (near/peripersonal space) or to distant space (far/extrapersonal space). It was also shown that the encoding of these spaces has dynamic aspects, because they can be remapped by the use of tools that trigger different actions (e.g., Iriki et al. 1998). In this latter case, the effect of the tool depends on the modulation of personal space, that is, the space of our body. In this paper, I review and discuss selected research demonstrating that, also in humans: (1) spaces are encoded in a dynamic way; (2) this encoding can be modulated by the use of tools that the system comes to consider as parts of one's own body; (3) body representations are not fixed, but are fragile and subject to change, to the point that we can incorporate not only the tools necessary for action, but even limbs belonging to other people. What the embodiment of tools and of alien limbs tells us about body representations is then briefly discussed.


Author(s):  
Frédérique de Vignemont ◽  
Andrea Serino ◽  
Hong Yu Wong ◽  
Alessandro Farnè

Research in cognitive neuroscience indicates that we process the space surrounding our body in a specific way, both for protecting our body from immediate danger and for interacting with the environment. This research has direct implications for philosophical issues as diverse as self-location, sensorimotor theories of perception, and affective perception. This chapter briefly describes the overall directions that some of these discussions might take. Beforehand, however, it is important to fully grasp what the notion of peripersonal space involves. One of the most difficult questions that the field has had to face these past 30 years is how to define peripersonal space. Although it bears some relation to the social notion of personal space, to the sensorimotor notion of reaching space, and to the spatial notion of egocentric space, there is something unique about peripersonal space and the special way we represent it. One of the main challenges is thus to offer a satisfactory definition of peripersonal space that is specific enough to account for its peculiar spatial, multisensory, plastic, and motor properties. Emphasis can be put on perception or on action, but also on impact prediction or defence preparation. However, each new definition brings with it new methods to experimentally investigate peripersonal space. There is then the risk of losing the unity of the notion of peripersonal space within this multiplicity of conceptions and methods. This chapter offers an overview of the key notions in the field, the way they have been operationalized, and the questions they leave open.


2002 ◽  
Vol 25 (1) ◽  
pp. 103-104 ◽  
Author(s):  
Denise D. J. de Grave ◽  
Jeroen B. J. Smeets ◽  
Eli Brenner

Norman tries to link the ecological and constructivist approaches to the dorsal and ventral pathways of the visual system. Such a link implies that the distinction is not only one of approach, but that different issues are studied. Norman identifies these issues as perception and action. The influence of contextual illusions is critical for Norman's arguments. We point out that fast (dorsal) actions can be fooled by contextual illusions while (ventral) perceptual judgements can be insensitive to them. We conclude that both approaches can, in principle, be used to study visual information processing in both pathways.


2001 ◽  
Vol 13 (6) ◽  
pp. 569-574
Author(s):  
Masanori Idesawa

Human beings obtain a large amount of information about the external world through their visual system. Automated systems such as robots must likewise be provided with visual functions for flexible operation in 3-D environments. To realize visual function artificially, we would do well to learn from the human visual mechanism. Optical illusions are a pure reflection of that mechanism and can therefore be used to investigate it. New types of optical illusion under binocular viewing are introduced and investigated.


1998 ◽  
Vol 10 (5) ◽  
pp. 581-589 ◽  
Author(s):  
Elisabetta Làdavas ◽  
Giuseppe di Pellegrino ◽  
Alessandro Farnè ◽  
Gabriele Zeloni

Current interpretations of extinction suggest that the disorder is due to an unbalanced competition between ipsilesional and contralesional representations of space. The question addressed in this study is whether the competition between left and right representations of space in one sensory modality (i.e., touch) can be reduced or exacerbated by the activation of an intact spatial representation in a different modality that is functionally linked to the damaged representation (i.e., vision). This hypothesis was tested in 10 right-hemisphere lesioned patients who suffered from reliable tactile extinction. We found that a visual stimulus presented near the patient's ipsilesional hand (i.e., in visual peripersonal space) inhibited the processing of a tactile stimulus delivered to the contralesional hand (cross-modal visuotactile extinction) to the same extent as did an ipsilesional tactile stimulation (unimodal tactile extinction). It was also found that a visual stimulus presented near the contralesional hand improved the detection of a tactile stimulus applied to the same hand. In striking contrast, the modulatory effects of vision on touch perception were much weaker when the visual stimulus was presented far from the space immediately around the patient's hand (i.e., in extrapersonal space). This study clearly demonstrates the existence of a visual peripersonal space centered on the hand in humans and its modulatory effects on tactile perception. These findings are explained by referring to the activity of bimodal neurons in the premotor and parietal cortex of the macaque, which have tactile receptive fields on the hand and corresponding visual receptive fields in the space immediately adjacent to the tactile fields.


2018 ◽  
Author(s):  
Justine Cléry ◽  
Olivier Guipponi ◽  
Soline Odouard ◽  
Claire Wardak ◽  
Suliann Ben Hamed

While extrapersonal space is often erroneously considered a single entity, early neuropsychological studies report a dissociation between near and far space processing, both in humans and in monkeys. Here, we use functional MRI in a naturalistic 3D environment to describe the non-human primate near and far space cortical networks. We describe the co-occurrence of two extended functional networks respectively dedicated to near and far space processing. Specifically, far space processing involves occipital, temporal, parietal, posterior cingulate, as well as orbitofrontal regions not activated by near space, possibly subserving the processing of the shape and identity of objects. In contrast, near space processing involves temporal, parietal, and prefrontal regions not activated by far space, possibly subserving the preparation of an arm/hand-mediated action in this proximal space. Interestingly, this network also involves somatosensory regions, suggesting a cross-modal anticipation of touch by a nearby object. Last, we also describe cortical regions that process both far and near space with a preference for one or the other. This suggests a continuous encoding of relative distance to the body, in the form of a far-to-near gradient. We propose that these cortical gradients in space representation subserve the physically delineable peripersonal spaces described in numerous psychology and psychophysics studies.

Highlights:
Near space processing involves temporal, parietal and prefrontal regions.
Far space activates occipital, temporal, parietal, cingulate and orbitofrontal areas.
Most regions process both far and near space, with a preference for one or the other.
A far-to-near gradient may subserve behavioral changes in peripersonal space size.


Author(s):  
Barbara Gillam

The geometrical optical illusions, such as the Müller-Lyer and the Poggendorff, are simple line drawings, which demonstrate errors as large as 25% when people are asked to match their properties such as size, angles, and line collinearity. They have been tantalizing people for at least 150 years and are still not really understood. Illusion figures have been used to probe the consistency of different perceptual properties and also of perception and action with implications for the theory of two visual systems. Explanations of geometrical illusions tend to invoke either physiological processes or the functional role illusion responses may have when viewing a 3D scene. This chapter examines all of these theoretical issues, discussing evidence for and against the major theories.


2021 ◽  
pp. 101-116
Author(s):  
Catherine L. Reed ◽  
George D. Park

Human perceptual and attentional systems operate to help us perform functional and adaptive actions in the world around us. In this review, we consider different regions of peripersonal space—peri-hand space, reachable space, and tool space when used in both peri- and extrapersonal space. Focusing on behavioural and electrophysiology/event-related potentials (EEG/ERP) studies using comparable target detection paradigms, we examine how visuospatial attention is facilitated or differentiated due to the current proximity and functional capabilities of our hands and the tools we hold in them. The functionality of the hand and tool is defined by the action goals of the user and the available functional affordances or parts available to achieve the goals. Finally, we report recent tool-use studies examining how the distribution of attention to tool space can change as a result of tool functionality and directional action crossing peripersonal and extrapersonal space boundaries. We propose that the functional capabilities of the hand and tools direct attention to action-relevant regions of peripersonal space. Although neural mechanisms such as bimodal neurons may enhance the processing of visual information presented in near-hand regions of peripersonal space, functional experience and the relevance of the space for upcoming actions more strongly direct attention within regions of peripersonal space. And, while some aspects of functionality can be extended into extrapersonal space, the multimodal nature of peripersonal space allows it to be more modifiable in the service of action.


Author(s):  
Michael J. Proulx ◽  
David J. Brown ◽  
Achille Pasqualotto

Vision is the default sensory modality for normal spatial navigation in humans. Touch is restricted to providing information about peripersonal space, whereas detecting and avoiding obstacles in extrapersonal space is key for efficient navigation. Hearing is restricted to the detection of objects that emit noise, yet many obstacles such as walls are silent. Sensory substitution devices provide a means of translating distal visual information into a form that visually impaired individuals can process through either touch or hearing. Here we will review findings from various sensory substitution systems for the processing of visual information that can be classified as what (object recognition), where (localization), and how (perception for action) processing. Different forms of sensory substitution excel at some tasks more than others. Spatial navigation brings together these different forms of information and provides a useful model for comparing sensory substitution systems, with important implications for rehabilitation, neuroanatomy, and theories of cognition.

