Multiple Spatial Coordinates Influence the Prediction of Tactile Events Facilitated by Approaching Visual Stimuli

2021 ◽  
pp. 1-21
Author(s):  
Tsukasa Kimura

Abstract Interaction with other sensory information is important for the prediction of tactile events. Recent studies have reported that visual information approaching the body facilitates the prediction of subsequent tactile events. However, the processing of tactile events is influenced by multiple spatial coordinates, and it remains unclear how this approach effect operates across different spatial coordinates, i.e., spatial reference frames. We investigated the relationship between the prediction of a tactile stimulus via this approach effect and spatial coordinates by comparing event-related potentials (ERPs). Participants placed their arms on a desk and were required to respond to tactile stimuli, which were presented to the left (or right) index finger with high probability (80%) or to the opposite index finger with low probability (20%). Before each tactile stimulus, visual stimuli approached sequentially toward the hand receiving the high-probability tactile stimulus. In the uncrossed condition, each hand was placed on its corresponding side. In the crossed condition, the hands were crossed so that each was placed on the opposite side, i.e., the left (right) hand on the right (left) side. Thus, the spatial locations of the tactile stimulus and the hand were consistent in the uncrossed condition and inconsistent in the crossed condition. The results showed that N1 amplitudes elicited by high-probability tactile stimuli decreased only in the uncrossed condition. These results suggest that the prediction of a tactile stimulus facilitated by approaching visual information is influenced by multiple spatial coordinates.
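For concreteness, the probability manipulation described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' stimulus code; the trial count and hand labels are assumptions.

```python
# Illustrative sketch of the 80/20 probability manipulation (assumed
# trial count and hand labels; not the authors' stimulus code).
import random

random.seed(1)
N_TRIALS = 200
HIGH_PROB_HAND = "left"   # hand receiving the high-probability stimulus
LOW_PROB_HAND = "right"

# Draw each trial's stimulated hand with an 80/20 split.
trials = [HIGH_PROB_HAND if random.random() < 0.8 else LOW_PROB_HAND
          for _ in range(N_TRIALS)]

print(trials.count(HIGH_PROB_HAND) / N_TRIALS)  # approximately 0.8
```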

2013 ◽  
Vol 25 (5) ◽  
pp. 790-801 ◽  
Author(s):  
Chiara Renzi ◽  
Patrick Bruns ◽  
Kirstin-Friederike Heise ◽  
Maximo Zimerman ◽  
Jan-Frederik Feldheim ◽  
...  

Previous studies have suggested that the putative human homologue of the ventral intraparietal area (hVIP) is crucially involved in the remapping of tactile information into external spatial coordinates and in the realignment of tactile and visual maps. It is unclear, however, whether hVIP is critical for the remapping process during audio-tactile cross-modal spatial interactions. The audio-tactile ventriloquism effect, in which the perceived location of a sound is shifted toward the location of a synchronous but spatially disparate tactile stimulus, was used to probe spatial interactions in audio-tactile processing. Eighteen healthy volunteers were asked to report the perceived location of brief auditory stimuli presented from three different locations (left, center, and right). Auditory stimuli were presented either alone (unimodal stimuli) or concurrently with a spatially discrepant tactile stimulus applied to the left or right index finger (bimodal stimuli), with the hands adopting either an uncrossed or a crossed posture. Single pulses of TMS were delivered over the hVIP or a control site (primary somatosensory cortex, SI) 80 msec after trial onset. TMS to the hVIP, compared with the control SI-TMS, interfered with the remapping of touch into external space, suggesting that hVIP is crucially involved in transforming spatial reference frames across audition and touch.
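As background, ventriloquism-type shifts are commonly described by a reliability-weighted average of the unimodal location estimates. The following is a standard textbook formulation, with symbols of our choosing rather than the paper's:

```latex
% Reliability-weighted combination of auditory (A) and tactile (T) location
% estimates; symbols illustrative, not taken from the paper.
\[
\hat{s} = w_A s_A + w_T s_T, \qquad
w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_T^2}, \qquad
w_T = 1 - w_A
\]
```

On this account, computing the tactile weight presupposes that the tactile location has been remapped into the same external frame as audition, which is the step that hVIP-TMS appears to disrupt.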


1998 ◽  
Vol 10 (6) ◽  
pp. 680-690 ◽  
Author(s):  
H.-O. Karnath ◽  
M. Fetter ◽  
M. Niemeier

Previous studies in neglect patients using rotation of the body around the roll axis revealed neglect of visual stimuli not only in the egocentric, body-centered left but also in the environmental left. The latter has been taken as evidence for a gravity-based, environment-centered component of neglect that occurs independently of the subject's actual body orientation. However, because these studies used visual stimuli in a normally lit room, they confounded the gravitational upright with the visible upright of the surround. Thus, it is possible that the visible upright of the environment served the role of the gravitational upright relative to which neglect occurred. The present experiment evaluated the influence of gravity on contralesional neglect when no visual information was presented. In complete darkness, neglect patients' exploratory eye movements were recorded in five experimental conditions: body in the normal upright position, body tilted 30° to the left and 30° to the right, and body pitched 30° backward and 30° forward. In the upright orientation, the patients with neglect showed a bias of ocular exploration toward the ipsilesional right side. In egocentric body coordinates, we found no significant differences between the orientations of the biased search field across the experimental conditions, showing that the search field shifted with the orientation of the body. No significant decrease or enhancement of neglect was observed when body orientation was varied. In conclusion, the present results revealed that varying the orientation of the body relative to gravity had no significant influence on the exploratory bias of these patients. When visual information was excluded and only graviceptive information was available, the patients' failure to explore the contralesional part of space appeared purely body-centered. The results argue against a disturbed representation of space in neglect that encodes locations in a gravity-based reference system.


2021 ◽  
pp. 1-29
Author(s):  
Lisa Lorentz ◽  
Kaian Unwalla ◽  
David I. Shore

Abstract Successful interaction with our environment requires accurate tactile localization. Although we seem to localize tactile stimuli effortlessly, the processes underlying this ability are complex. This is evidenced by the crossed-hands deficit, in which tactile localization performance suffers when the hands are crossed. The deficit results from conflict between an internal reference frame, based in somatotopic coordinates, and an external reference frame, based in external spatial coordinates. Previous evidence in favour of the integration model employed manipulations of the external reference frame (e.g., blindfolding participants), which reduced the deficit by reducing conflict between the two reference frames. The present study extends this finding by asking blindfolded participants to visually imagine their crossed arms as uncrossed. This imagery manipulation further decreased the magnitude of the crossed-hands deficit by bringing the information in the two reference frames into alignment. The manipulation affected males and females differently, consistent with the previously observed sex difference in this effect: females tend to show a larger crossed-hands deficit than males, and they were also more strongly affected by the imagery manipulation. Results are discussed in terms of the integration model of the crossed-hands deficit.
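The integration model can be summarized in weight form; this is our schematic notation, not the authors':

```latex
% Schematic of the integration model: the felt location is a weighted mix of
% internal (somatotopic) and external estimates; notation ours.
\[
\hat{\ell} = w\,\ell_{\mathrm{int}} + (1 - w)\,\ell_{\mathrm{ext}}
\]
```

With crossed hands the two estimates conflict; blindfolding or imagining the arms uncrossed plausibly down-weights or realigns the external estimate, which is consistent with the reduced deficit.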


2017 ◽  
Vol 118 (3) ◽  
pp. 1650-1663 ◽  
Author(s):  
Jan Churan ◽  
Johannes Paul ◽  
Steffen Klingenhoefer ◽  
Frank Bremmer

In the natural world, self-motion always stimulates several sensory modalities. Here we investigated the interplay between a visual optic-flow stimulus simulating self-motion and a tactile stimulus (air flow resulting from self-motion) while human observers were engaged in a distance-reproduction task. We found that adding congruent tactile information (i.e., the speed of the air flow and the speed of visual motion were directly proportional) to the visual information significantly improved the precision of the actively reproduced distances. This improvement, however, was smaller than predicted for an optimal integration of visual and tactile information. In contrast, incongruent tactile information (i.e., the speed of the air flow and the speed of visual motion were inversely proportional) did not improve subjects' precision, indicating that incongruent tactile information and visual information were not integrated. One possible interpretation of the results is a link to properties of neurons in the ventral intraparietal area, which have been shown to have spatially and action-congruent receptive fields for visual and tactile stimuli.

NEW & NOTEWORTHY This study shows that tactile and visual information can be integrated to improve estimates of the parameters of self-motion. This, however, happens only if the two sources of information are congruent, as they are in a natural environment. In contrast, an incongruent tactile stimulus is still used as a source of information about self-motion, but it is not integrated with visual information.
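The optimal-integration benchmark mentioned above is the standard maximum-likelihood prediction for combining independent cues; in our notation:

```latex
% Maximum-likelihood (optimal) cue-integration prediction for the variance of
% the combined visual-tactile estimate; notation ours.
\[
\sigma_{VT}^2 = \frac{\sigma_V^2\,\sigma_T^2}{\sigma_V^2 + \sigma_T^2}
\;\le\; \min\!\left(\sigma_V^2,\, \sigma_T^2\right)
\]
```

Under this prediction, the combined estimate should be at least as precise as the better single cue; the observed improvement with congruent air flow fell short of this bound, and incongruent air flow produced no improvement at all.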


2017 ◽  
Author(s):  
Miriam L. R. Meister ◽  
Elizabeth A. Buffalo

Abstract Primates predominantly rely on vision to gather information from the environment, and neurons representing visual space and gaze position are found in many brain areas. Within the medial temporal lobe, a brain region critical for memory, neurons in the entorhinal cortex of macaque monkeys exhibit spatial selectivity for gaze position. Specifically, the firing rate of single neurons reflects fixation location within a visual image (Killian et al., 2012). In rodents, entorhinal cells such as grid cells, border cells, and head direction cells show spatial representations aligned to visual environmental features rather than the body (Hafting et al., 2005; Sargolini et al., 2006; Solstad et al., 2008; Diehl et al., 2017). However, it is not known whether similar allocentric representations exist in primate entorhinal cortex. Here, we recorded neural activity in the entorhinal cortex of two male rhesus monkeys during a naturalistic, free-viewing task. Our data reveal that a majority of entorhinal neurons represent gaze position and that simultaneously recorded neurons exhibit distinct spatial reference frames, with some neurons aligning to the visual image and others aligning to the monkey's head position. Our results also show that entorhinal neural activity can be used to predict gaze position with a high degree of accuracy. These findings demonstrate that visuospatial representation is a fundamental property of entorhinal neurons in primates and suggest that entorhinal cortex may support relational memory and motor planning by coding attentional locus in distinct, behaviorally relevant frames of reference.

Significance Statement The entorhinal cortex, a brain area important for memory, shows striking spatial activity in rodents through grid cells, border cells, head direction cells, and nongrid spatial cells. The majority of entorhinal neurons signal the location of a rodent relative to visual environmental cues, representing the animal's location relative to the world rather than the body. Recently, our laboratory found that entorhinal neurons can signal the location of gaze while a monkey visually explores images. Here, we report that spatial entorhinal neurons are widespread in the monkey and that these neurons can exhibit a world-based spatial reference frame locked to the bounds of explored images. These results help connect the extensive findings in rodents to the primate.
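The gaze-decoding result can be illustrated with a generic population-decoding pipeline. The sketch below uses synthetic data, a linear (ridge) decoder, and cross-validation; it is a hedged illustration of the approach, not the authors' actual analysis.

```python
# Illustrative sketch of decoding gaze position from population spike counts
# (generic approach with synthetic data; NOT the authors' exact analysis).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical data: spike counts for n_neurons in n_bins time bins, and the
# gaze position (x, y) in image coordinates for each bin.
n_bins, n_neurons = 2000, 50
gaze = rng.uniform(0, 1, size=(n_bins, 2))    # true gaze (x, y)
tuning = rng.normal(size=(2, n_neurons))      # assumed linear spatial tuning
rates = gaze @ tuning + rng.normal(scale=0.5, size=(n_bins, n_neurons))

# Linear decoder with 5-fold cross-validation: predict gaze from activity.
gaze_hat = cross_val_predict(Ridge(alpha=1.0), rates, gaze, cv=5)

# Decoding accuracy as correlation between true and predicted positions.
r_x = np.corrcoef(gaze[:, 0], gaze_hat[:, 0])[0, 1]
r_y = np.corrcoef(gaze[:, 1], gaze_hat[:, 1])[0, 1]
print(f"decoding r: x={r_x:.2f}, y={r_y:.2f}")
```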


2017 ◽  
Vol 117 (6) ◽  
pp. 2262-2268 ◽  
Author(s):  
Hanna Gertz ◽  
Dimitris Voudouris ◽  
Katja Fiehler

Tactile stimuli on moving limbs are typically attenuated during reach planning and execution. This phenomenon has been related to internal forward models that predict the sensory consequences of a movement. Tactile suppression is considered to occur due to a match between the actual and predicted sensory consequences of a movement, which might free capacities to process novel or task-relevant sensory signals. Here, we examined whether and how tactile suppression depends on the relevance of somatosensory information for reaching. Participants reached with their left or right index finger to the unseen index finger of their other hand (body target) or to an unseen pad on a screen (external target). In the body-target condition, somatosensory signals from the static hand were available for localizing the reach target. Vibrotactile stimuli were presented on the moving index finger before or during reaching, or in a separate no-movement baseline block, and participants indicated whether they detected a stimulus. As expected, detection thresholds before and during reaching were higher compared with baseline. Tactile suppression was also stronger for reaches to body targets than to external targets, as reflected by higher detection thresholds and lower precision of detectability. Moreover, detection thresholds were higher when reaching with the left hand than with the right hand. Our results suggest that tactile suppression is modulated by position signals from the target limb that are required to reach successfully to one's own body. Moreover, limb dominance seems to affect tactile suppression, presumably due to differing uncertainty of the feedback signals from the moving limb.

NEW & NOTEWORTHY Tactile suppression on a moving limb has been suggested to release computational resources for processing other relevant sensory events. In the current study, we show that tactile sensitivity on the moving limb decreases more when reaching to body targets than to external targets. This indicates that tactile perception can be modulated by allocating processing capacities to movement-relevant somatosensory information at the target location. Our results contribute to understanding tactile processing and predictive mechanisms in the brain.
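One conventional way to express these threshold results is as a shift of the psychometric function; the parameterization below is illustrative, not taken from the paper:

```latex
% Detection probability for a vibrotactile stimulus of intensity s, modeled
% as a cumulative Gaussian; parameterization illustrative.
\[
P(\text{detect} \mid s) = \Phi\!\left(\frac{s - \mu}{\sigma}\right)
\]
```

In these terms, suppression corresponds to a rise in the threshold \mu during reaching, larger for body targets than for external targets, while lower precision of detectability corresponds to a larger \sigma (a shallower psychometric slope).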


2010 ◽  
Vol 179 (3) ◽  
pp. 253-258 ◽  
Author(s):  
Shahrzad Mazhari ◽  
Johanna C. Badcock ◽  
Flavie A. Waters ◽  
Milan Dragović ◽  
David R. Badcock ◽  
...  

2011 ◽  
Vol 11 ◽  
pp. 199-213 ◽  
Author(s):  
Chiara F. Sambo ◽  
Bettina Forster

Sustained attention to a body location results in enhanced processing of tactile stimuli presented at that location compared to an unattended location. In this paper, we review studies investigating the neural correlates of sustained spatial attention in touch. These studies consistently show that activity within modality-specific somatosensory areas (SI and SII) is modulated by sustained tactile-spatial attention. Recent evidence suggests that these somatosensory areas may be recruited as part of a larger cortical network, also including higher-level multimodal regions involved in spatial selection across modalities. We discuss, in turn, the following multimodal effects in sustained tactile-spatial attention tasks. First, cross-modal attentional links between touch and vision, reflected in enhanced processing of task-irrelevant visual stimuli at tactually attended locations, are mediated by common (multimodal) representations of external space. Second, vision of the body modulates activity underlying sustained tactile-spatial attention, facilitating attentional modulation of tactile processing in between-hand selection tasks (when the hands are sufficiently far apart) and impairing it in within-hand selection tasks. Finally, body posture influences mechanisms of sustained tactile-spatial attention, relying, at least in part, on the remapping of tactile stimuli into external, visually defined spatial coordinates. Taken together, the findings reviewed in this paper indicate that sustained spatial attention in touch is subserved by both modality-specific and multimodal mechanisms. The interplay between these mechanisms allows flexible and efficient spatial selection within and across sensory modalities.


2013 ◽  
Vol 26 (1-2) ◽  
pp. 19-51 ◽  
Author(s):  
Yushi Jiang ◽  
Lihan Chen

Intra-modal apparent motion has been shown to be affected, or 'captured', by information from another, task-irrelevant modality, as shown in the cross-modal dynamic capture effect. Here we created inter-modal apparent motion between visual and tactile stimuli and investigated whether there are mutual influences between auditory apparent motion and inter-modal visual/tactile apparent motion. Moreover, we examined whether and how the spatial remapping of tactile events between somatotopic and external reference frames affects the cross-modal capture between auditory apparent motion and inter-modal visual/tactile apparent motion, by introducing two arm postures: arms uncrossed and arms crossed. In Experiment 1, we used auditory stimuli (auditory apparent motion) as distractors and inter-modal visual/tactile stimuli (inter-modal apparent motion) as targets, while in Experiment 2 we reversed the distractors and targets. In Experiment 1, we found a general detrimental influence of the arms-crossed posture on discriminating the direction of the visual/tactile stream; in Experiment 2, the arms-uncrossed posture played a significant role in modulating the capture of auditory apparent motion by the inter-modal visual/tactile stimuli. In both experiments, synchronously presented motion streams led to a noticeable directional congruency effect in judging the target motion. Among the different modality combinations, tactile-to-tactile apparent motion (TT) and visual-to-visual apparent motion (VV) were the two signatures revealing the asymmetric congruency effects: when the auditory stimuli were targets, the congruency effect was largest with VV distractors and lowest with TT distractors; the pattern was reversed when the auditory stimuli were distractors. In addition, across both experiments the congruency effect in visual-to-tactile (VT) and tactile-to-visual (TV) apparent motion was intermediate between the effect sizes in VV and TT. We replicated the above findings with a block-wise design (Experiment 3). In Experiment 4, we introduced static distractor events (a visual or tactile stimulus) and found that the modulation by spatial remapping of distractors upon auditory apparent motion was reduced. These findings suggest a mutual but robustly asymmetric influence between intra-modal auditory apparent motion and inter-modal visual/tactile apparent motion. We propose that the relative reliabilities of directional information between distractor and target streams, combined with a remapping process between the two spatial reference frames, determine this asymmetric influence.


Perception ◽  
2018 ◽  
Vol 47 (5) ◽  
pp. 507-520 ◽  
Author(s):  
Valéry Legrain ◽  
Louise Manfron ◽  
Marynn Garcia ◽  
Lieve Filbrich

How we perceive our body is shaped by sensory experiences with our surrounding environment, as witnessed by participants' poor performance when judging, with their hands crossed, the temporal order of two somatosensory stimuli, one applied to each hand. This suggests that somatosensory stimuli are processed not only according to a somatotopic representation but also according to a spatiotopic representation of the body. We investigated whether the perception of stimuli occurring in external space, such as visual stimuli, can also be influenced by body posture and somatosensory stimuli. Participants performed temporal order judgements on pairs of visual stimuli, one on each side of space, with their hands uncrossed or crossed. In Experiment 1, participants' hands were placed either near or far from the visual stimuli. In Experiment 2, the visual stimuli were preceded, by either 60 ms or 360 ms, by tactile stimuli applied to the hands placed near the visual stimuli. Manipulating the time interval was intended to activate either a somatotopic or a spatiotopic representation of the somatic inputs. We did not obtain any evidence for an influence of body posture on visual temporal order judgments, suggesting that body perception is less relevant for processing extrabody stimuli than the reverse.

