tactile cues
Recently Published Documents


TOTAL DOCUMENTS: 187 (FIVE YEARS: 58)
H-INDEX: 19 (FIVE YEARS: 3)

2021 ◽  
Author(s):  
Caitlin Elisabeth Naylor ◽  
David Harris ◽  
Samuel James Vine ◽  
Jack Brookes ◽  
Faisal Mushtaq ◽  
...  

The integration of visual and tactile cues can enhance perception. However, the nature of this integration, and the subsequent benefits to perception and action execution, are context-dependent. Here, we examined how visual-tactile integration can influence performance on a complex motor task using virtual reality. We asked participants to wear a VR head-mounted display while using a tracked physical putter to make golf putts on a VR golf course in two conditions. In the ‘tactile’ condition, putter contact with the virtual golf ball coincided with contact with a physical ball. In a second ‘no tactile’ condition, no physical ball was present, so the putter contacted only the virtual ball. In contrast to our pre-registered prediction that performance would benefit from the integration of visual and tactile cues, we found that golf putting accuracy was higher in the no tactile condition than in the tactile condition. Participants exhibited higher lateral error variance and more over/undershooting when the physical ball was present. These differences in performance between the conditions suggest, first, that tactile cues, when available, were integrated with visual cues, and second, that this integration is not necessarily beneficial to performance. We suggest that the decreased performance caused by the addition of a physical ball may have been due to minor incongruencies between the virtual visual cues and the physical tactile cues. We discuss the implications of these results for the use of VR in sports training and highlight that the absence of matched tactile cues in VR can result in sub-optimal learning and performance.
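To make the accuracy measures above concrete, here is a minimal Python sketch (all names and numbers are hypothetical illustrations, not the authors' analysis) of how lateral error variance and over/undershooting might be computed from recorded putt endpoints:

```python
import numpy as np

def putting_metrics(endpoints, hole):
    """Summarise putt endpoints relative to the hole.

    endpoints : (n, 2) array of ball resting positions in metres,
                x = lateral axis, y = along the putt line.
    hole      : (2,) hole position.
    """
    errors = endpoints - hole                             # per-putt error vectors
    radial_error = np.linalg.norm(errors, axis=1).mean()  # mean distance from hole
    lateral_variance = np.var(errors[:, 0], ddof=1)       # spread across the line
    overshoot = errors[:, 1].mean()                       # positive = long, negative = short
    return radial_error, lateral_variance, overshoot

# Five simulated putts toward a hole 3 m down the putt line
putts = np.array([[0.05, 3.10], [-0.12, 2.80], [0.02, 3.25],
                  [0.20, 2.95], [-0.08, 3.05]])
print(putting_metrics(putts, hole=np.array([0.0, 3.0])))
```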


Behaviour ◽  
2021 ◽  
pp. 1-13
Author(s):  
Isamara Mendes-Silva ◽  
Drielly Queiroga ◽  
Eduardo S. Calixto ◽  
Helena M. Torezan-Silingardi ◽  
Kleber Del-Claro

Abstract Predatory social wasps are well studied in several respects; however, their foraging behaviour, especially behaviour that takes place away from the nest at often unpredictable locations, and the specialized behaviours they use to find and subdue prey, are not well understood. In the Brazilian tropical savanna, the Polistinae wasp Brachygastra lecheguana specializes in preying on endophytic weevil larvae that develop inside floral buds. We hypothesized that these wasps use a combination of mechanisms, namely visual, chemical (odour) and possibly tactile cues, to find the weevil larvae. Using a combination of experimental manipulations (visual; chemical; visual/chemical), we tested the wasp’s ability to detect the endophytic larvae in the field. Additionally, we tested the ability of this wasp to detect vibrations produced by the weevils inside the buds. Our results suggest that B. lecheguana uses a sequence of eco-physiological mechanisms to find the endophytic larva inside floral buds: sight, smell, and perhaps touch. The use of multiple cues by this wasp produces such a high rate of predation on endophytic beetles that the wasp may benefit the host plant’s future reproduction by reducing weevil infestation.


Author(s):  
Caitlin Elisabeth Naylor ◽  
Michael J Proulx ◽  
Gavin Buckingham

Abstract The material-weight illusion (MWI) demonstrates how our past experience with material and weight can create expectations that influence the perceived heaviness of an object. Here we used mixed reality to place touch and vision in conflict, to investigate whether the modality through which materials are presented to a lifter could influence the top-down perceptual processes driving the MWI. University students lifted equally weighted polystyrene, cork and granite cubes whilst viewing computer-generated images of the cubes in virtual reality (VR). This allowed the visual and tactile material cues to be altered, whilst all other object properties were kept constant. Representation of the objects’ material in VR was manipulated to create four sensory conditions: visual-tactile matched, visual-tactile mismatched, visual differences only and tactile differences only. A robust MWI was induced across all sensory conditions, whereby the polystyrene object felt heavier than the granite object. The strength of the MWI differed across conditions, with tactile material cues having a stronger influence on perceived heaviness than visual material cues. We discuss how these results suggest a mechanism whereby multisensory integration directly impacts how top-down processes shape perception.
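To picture what "strength of the MWI" means here, a small sketch (condition names follow the abstract, but the ratings are invented purely for illustration) computing illusion strength as the mean heaviness-rating gap between the polystyrene and granite cubes in each sensory condition:

```python
import numpy as np

# Invented heaviness ratings (0-100 scale) for equally weighted cubes;
# the values are illustrative only, not data from the paper.
conditions = {
    "visual-tactile matched":    {"polystyrene": [72, 65, 70], "granite": [40, 45, 38]},
    "visual-tactile mismatched": {"polystyrene": [60, 58, 63], "granite": [50, 48, 52]},
    "visual differences only":   {"polystyrene": [58, 55, 60], "granite": [49, 50, 47]},
    "tactile differences only":  {"polystyrene": [68, 64, 66], "granite": [42, 46, 44]},
}

# MWI strength: how much heavier the polystyrene cube feels than the
# granite cube despite identical mass (larger gap = stronger illusion).
for name, r in conditions.items():
    strength = np.mean(r["polystyrene"]) - np.mean(r["granite"])
    print(f"{name:26s} illusion strength = {strength:+5.1f}")
```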


2021 ◽  
Vol 118 (49) ◽  
pp. e2109109118
Author(s):  
Laurence Willemet ◽  
Khoubeib Kanzari ◽  
Jocelyn Monnoyer ◽  
Ingvars Birznieks ◽  
Michaël Wiertlewski

Humans efficiently estimate the grip force necessary to lift a variety of objects, including slippery ones. The regulation of grip force starts with the initial contact and takes into account the surface properties, such as friction. This estimation of the frictional strength has been shown to depend critically on cutaneous information. However, the physical and perceptual mechanism that provides such early tactile information remains elusive. In this study, we developed a friction-modulation apparatus to elucidate the effects of the frictional properties of objects during initial contact. We found a correlation between participants’ conscious perception of friction and radial strain patterns of skin deformation. The results provide insights into the tactile cues made available by contact mechanics to the sensorimotor regulation of grip, as well as to the conscious perception of the frictional properties of an object.
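The "radial strain patterns" referenced here describe how much the skin surface stretches or compresses along the radius of the contact area. A rough numpy sketch of the idea (a simplified estimate assuming tracked surface markers; this is not the authors' apparatus or analysis pipeline):

```python
import numpy as np

def radial_strain(before, after, center):
    """Per-marker radial strain of the skin surface around a contact.

    before, after : (n, 2) tracked marker positions pre/post loading (mm).
    center        : (2,) contact centre.
    Returns (r1 - r0) / r0 per marker: negative values mean the skin is
    compressed toward the centre, positive values mean it stretches outward.
    """
    r0 = np.linalg.norm(before - center, axis=1)
    r1 = np.linalg.norm(after - center, axis=1)
    return (r1 - r0) / r0

# Illustrative markers displaced 3 % toward the contact centre,
# i.e. uniform compressive (negative) radial strain.
before = np.array([[1.0, 0.0], [0.0, 2.0], [-1.5, 0.5]])
after = 0.97 * before
print(radial_strain(before, after, center=np.zeros(2)))
```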


PLoS ONE ◽  
2021 ◽  
Vol 16 (11) ◽  
pp. e0259988
Author(s):  
Annie A. Butler ◽  
Lucy S. Robertson ◽  
Audrey P. Wang ◽  
Simon C. Gandevia ◽  
Martin E. Héroux

Passively grasping an unseen artificial finger induces ownership over this finger and an illusory coming together of one’s index fingers: a grasp illusion. Here we determine how interoceptive ability and attending to the upper limbs influence this illusion. Participants passively grasped an unseen artificial finger with their left index finger and thumb for 3 min while their right index finger, located 12 cm below, was lightly clamped. Experiment 1 (n = 30) investigated whether the strength of the grasp illusion (perceived index finger spacing and perceived ownership) is related to a person’s level of interoceptive accuracy (modified heartbeat counting task) and sensibility (Noticing subscale of the Multidimensional Assessment of Interoceptive Awareness). Experiment 2 (n = 30) investigated the effect of providing verbal or tactile cues to guide participants’ attention to their upper limbs. On their own, neither interoceptive accuracy and sensibility nor verbal and tactile cueing had an effect on the grasp illusion. However, verbal cueing increased the strength of the grasp illusion in individuals with lower interoceptive ability. Across the observed range of interoceptive accuracy and sensibility, verbal cueing decreased perceived index finger spacing by 5.6 cm [1.91 to 9.38] (mean [95% CI]), and perceived ownership by ∼3 points on a 7-point Likert scale (slope −0.93 [−1.72 to −0.15]). Thus, attending to the upper limbs via verbal cues increases the strength of the grasp illusion in a way that is inversely related to a person’s level of interoceptive accuracy and sensibility.
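The reported slope can be read as an ordinary linear regression of the cueing effect on interoceptive score. A toy sketch with invented data (the real analysis and values are in the paper; only the negative-slope pattern is reproduced here):

```python
import numpy as np

# Invented per-participant values: interoceptive score (predictor) and the
# verbal-cueing effect on perceived ownership (change on a 7-point scale).
interoception = np.array([0.20, 0.35, 0.50, 0.60, 0.75, 0.90])
cueing_effect = np.array([2.8, 2.1, 1.6, 1.0, 0.4, -0.1])

# Ordinary least squares: cueing_effect ~ slope * interoception + intercept
slope, intercept = np.polyfit(interoception, cueing_effect, deg=1)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
# A negative slope mirrors the paper's pattern: the boost from verbal
# cueing shrinks as interoceptive accuracy/sensibility increases.
```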


2021 ◽  
pp. 204-223
Author(s):  
John M. Reynolds

In a current academic architectural design culture often characterized by parametricism, cybernetics, and virtual reality, the visceral and haptic dimensions of architectural design have assumed an inferior position in design ideation and process. Whether through intentional dismissal or benign neglect, the design process pursued in many undergraduate architectural programs has assumed an ocularcentric modality. Students ideate in a weightless, immaterial design landscape of simulation, devoid of the shifting qualities of color, light and shadow, the nuance of olfactory and tactile cues, and material resonance. Rather than advance a nostalgic, anti-digitally mediated position, the design practices described here deploy haptic means to tender a design process that advances a tactile tectonic grounded in the DNA, or patterns, of human experience and nature.


2021 ◽  
Author(s):  
Erik Pescara ◽  
Anton Stubenbord ◽  
Tobias Röddiger ◽  
Likun Fang ◽  
Michael Beigl

2021 ◽  
pp. 1-22
Author(s):  
Brandy Murovec ◽  
Julia Spaniol ◽  
Jennifer L. Campos ◽  
Behrang Keshavarz

Abstract A critical component of many immersive experiences in virtual reality (VR) is vection, defined as the illusion of self-motion. Traditionally, vection has been described as a visual phenomenon, but more recent research suggests that vection can be influenced by a variety of senses. The goal of the present study was to investigate the role of multisensory cues in vection by manipulating the availability of visual, auditory, and tactile stimuli in a VR setting. To achieve this, 24 adults (mean age = 25.04 years) were presented with a rotating stimulus designed to induce circular vection. All participants completed trials that included a single sensory cue, a combination of two cues, or all three cues presented together. The size of the field of view (FOV) was manipulated across four levels (no-visuals, small, medium, full). Participants rated vection intensity and duration verbally after each trial. Results showed that all three sensory cues induced vection when presented in isolation, with visual cues eliciting the highest intensity and longest duration. The presence of auditory and tactile cues further increased vection intensity and duration compared to conditions where these cues were absent. These findings support the idea that vection can be induced via multiple types of sensory input and is intensified when multiple sensory inputs are combined.
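One plausible way to picture the trial matrix implied by this design (a sketch under assumptions; the paper may have crossed the factors differently):

```python
from itertools import combinations

# Cue sets: each cue alone, every pair, and all three together.
cues = ["visual", "auditory", "tactile"]
cue_sets = [set(c) for k in (1, 2, 3) for c in combinations(cues, k)]
fov_levels = ["no-visuals", "small", "medium", "full"]

trials = []
for cue_set in cue_sets:
    for fov in fov_levels:
        # Assumption: FOV size only applies when a visual cue is shown;
        # non-visual cue sets get the "no-visuals" level and vice versa.
        if ("visual" in cue_set) != (fov == "no-visuals"):
            trials.append((sorted(cue_set), fov))

print(len(trials), "conditions, e.g.", trials[:3])
```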

