Directional biases in whole hand motion perception revealed by mid-air tactile stimulation

2020 ◽  
Author(s):  
Marlou N Perquin ◽  
Mason Taylor ◽  
Jarred Lorusso ◽  
James Kolasinski

Human-machine interfaces are increasingly designed to reduce our reliance on the dominant senses of vision and audition. Many emerging technologies are attempting to convey complex spatiotemporal information via tactile percepts shown to be effective in the visual domain, such as shape and motion. Despite the intuitive appeal of touch as a method of feedback, we do not know to what extent the hand can substitute for the retina in this way. Here we ask whether the tactile system can be used to perceive complex whole-hand motion stimuli, and whether it exhibits the same kind of established perceptual biases as reported in the visual domain. Using ultrasound stimulation, we were able to project complex moving dot percepts onto the palm in mid-air, over 30 cm above an emitter device. We generated dot kinetogram stimuli involving motion in three different directional axes (‘Horizontal’, ‘Vertical’, and ‘Oblique’) on the ventral surface of the hand. We found clear evidence that participants were able to discriminate tactile motion direction. Furthermore, there was a marked directional bias in motion perception: participants were better and more confident at discriminating motion in the vertical and horizontal axes of the hand than motion along the oblique axis. This pattern directly mirrors the perceptual biases robustly reported in the visual domain, termed the ‘Oblique Effect’. These data show the existence of biases in motion perception that transcend sensory modality. Furthermore, we extend the Oblique Effect to a whole-hand scale, using motion stimuli presented on the broad and relatively low-acuity surface of the palm, away from the densely innervated and much-studied fingertips. These findings also highlight targeted ultrasound stimulation as a versatile means of conveying potentially complex spatial and temporal information without the need for a user to wear or touch a device. This ability is particularly attractive as a feedback mechanism for contact-free human-machine interfaces.
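To make the stimulus concrete: a dot kinetogram assigns a fraction of dots coherent motion along one axis while the remaining dots move at random. The sketch below is a rough illustration of that update rule, not the authors' stimulus code; the axis angles, coherence value, and function names are our assumptions.

```python
import numpy as np

# Minimal dot-kinetogram sketch: a fraction `coherence` of dots moves
# along the chosen axis; the rest move in random directions.
AXES_DEG = {"horizontal": 0.0, "vertical": 90.0, "oblique": 45.0}

def step_dots(xy, axis="horizontal", coherence=0.8, speed=1.0, rng=None):
    """Advance dot positions (N x 2 array) by one frame."""
    rng = np.random.default_rng() if rng is None else rng
    n = xy.shape[0]
    angles = np.full(n, np.deg2rad(AXES_DEG[axis]))
    incoherent = rng.random(n) > coherence          # ~20% noise dots
    angles[incoherent] = rng.uniform(0, 2 * np.pi, incoherent.sum())
    return xy + speed * np.column_stack([np.cos(angles), np.sin(angles)])

rng = np.random.default_rng(0)
dots = rng.uniform(-1, 1, size=(100, 2))            # dots on a unit patch
dots = step_dots(dots, axis="oblique", coherence=0.8, rng=rng)
```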

2021 ◽  
pp. 174702182199545
Author(s):  
Emily M Crowe ◽  
Sander A Los ◽  
Louise Schindler ◽  
Christopher Kent

How quickly participants respond to a “go” after a “warning” signal is partly determined by the time between the two signals (the foreperiod) and the distribution of foreperiods. According to Multiple Trace Theory of Temporal Preparation (MTP), participants use memory traces of previous foreperiods to prepare for the upcoming go signal. If the processes underlying temporal preparation reflect general encoding and memory principles, transfer effects (the carryover effect of a previous block’s distribution of foreperiods to the current block) should be observed regardless of the sensory modality in which signals are presented. Despite convincing evidence for transfer effects in the visual domain, only weak evidence for transfer effects has been documented in the auditory domain. Three experiments were conducted to examine whether such differences in results are due to the modality of the stimulus or other procedural factors. In each experiment, two groups of participants were exposed to different foreperiod distributions in the acquisition phase and to the same foreperiod distribution in the transfer phase. Experiment 1 used a choice-reaction time (RT) task, and the warning signal remained on until the go signal, but there was no evidence for transfer effects. Experiments 2 and 3 used a simple- and choice-RT task, respectively, and there was silence between the warning and go signals. Both experiments revealed evidence for transfer effects, which suggests that transfer effects are most evident when there is no auditory stimulation between the warning and go signals.
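As a rough illustration of the trace-based idea behind MTP, the sketch below treats preparation at elapsed time t as a recency-weighted count of stored foreperiods that have not yet elapsed. The decay constant and readout rule are illustrative assumptions, not the published model.

```python
import numpy as np

# Trace-based preparation sketch: recent trials leave heavier traces,
# and preparation at time t reflects traces whose foreperiod exceeds t.
def preparation(t, past_foreperiods, decay=0.95):
    """Weighted evidence that the go signal is still to come at time t."""
    fps = np.asarray(past_foreperiods)
    weights = decay ** np.arange(len(fps))[::-1]   # most recent trial = weight 1
    still_pending = fps >= t
    return weights[still_pending].sum() / weights.sum()

history = [0.4, 1.6, 0.4, 0.4, 1.6]   # foreperiods (s) from an acquisition block
for t in (0.2, 0.8, 1.4):
    print(f"t={t:.1f}s  preparation={preparation(t, history):.2f}")
```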


2010 ◽  
Vol 104 (6) ◽  
pp. 2913-2921 ◽  
Author(s):  
Jinsung Wang ◽  
J. Toby Mordkoff ◽  
Robert L. Sainburg

Bilateral interference, referring to the tendency of movements of one arm to disrupt the intended movements made simultaneously with the other arm, is often observed in a task that involves differential planning of each arm movement during sensorimotor adaptation. In the present study, we examined two questions: 1) how does the compatibility between visuomotor adaptation tasks performed with both arms affect bilateral interference during bimanual performance? and 2) how do variations in bilateral interference affect transfer of visuomotor adaptation between bilateral and unilateral conditions? To examine these questions, we manipulated visuomotor compatibility using two kinematic variables (direction of required hand motion, direction of an imposed visual rotation). Experiment 1 consisted of two conditions in which the direction of visual rotations for both arms was either in the same or opposing directions, whereas the target direction for both arms was always the same. In experiment 2, we examined the pattern of generalization between the bilateral and unilateral conditions when both the target and rotation directions were opposing between the arms. In both experiments, subjects first adapted to a 30° visual rotation with one arm (preunilateral), then with both arms (bilateral), and finally with the arm that was not used in the first session (postunilateral). Our results show that bilateral interference was smallest when both variables were the same between the arms. Our data also show extensive transfer of visuomotor adaptation between bilateral and unilateral conditions, regardless of degree of bilateral interference.
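The core manipulation in such studies is straightforward to state in code: the cursor shown to the subject is the hand position rotated by a fixed angle about the start location. A minimal sketch follows (function names are ours; the 30° value follows the abstract).

```python
import numpy as np

# Visuomotor rotation sketch: rotate the displayed cursor about the
# start location by a fixed angle, so vision and proprioception conflict.
def rotate_cursor(hand_xy, start_xy, angle_deg=30.0):
    """Return the visually displayed cursor for a given hand position."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return start_xy + rot @ (hand_xy - start_xy)

start = np.array([0.0, 0.0])
hand = np.array([10.0, 0.0])           # hand moves straight right
print(rotate_cursor(hand, start))      # cursor appears rotated 30 deg CCW
```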


1993 ◽  
Vol 33 (18) ◽  
pp. 2747-2756 ◽  
Author(s):  
Nancy J. Coletta ◽  
Padhmalatha Segu ◽  
Carlo L.M. Tiana


Author(s):
Xudong Zhang ◽  
Don B. Chaffin

In this paper we describe a new scheme for empirically investigating the effects of task factors on three-dimensional (3D) dynamic postures during seated reaching movements. The scheme relies on an underlying model that integrates two statistical procedures: (a) a regression description of the relationship between the time-varying hand location and postural angles to characterize the movement data and (b) a series of analyses of variance to test the hypothesized task effects using representative instantaneous postures. The use of this scheme is illustrated by an experiment that examines two generic task factors: hand motion direction and motion completion time. Results suggest that hand motion direction is a significant task factor in determining instantaneous postures, whereas a distinctive difference in the time to complete a motion does not appear to have a significant effect. We discuss the concept of an instantaneous posture and its utility in dynamic studies of movements, some insights into human reaching movement control strategy, and implications for the development of a 3D dynamic posture prediction model.
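Step (a) of the scheme amounts to a linear regression of each postural angle on the time-varying hand location. Below is a minimal sketch with simulated data; the array shapes, the number of postural angles, and all names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

# Step (a) sketch: least-squares regression of postural angles on
# 3D hand location, yielding instantaneous postures for any hand position.
rng = np.random.default_rng(1)
hand_xyz = rng.uniform(0, 0.6, size=(200, 3))        # hand location over time (m)
angles = (hand_xyz @ rng.normal(size=(3, 7))
          + 0.05 * rng.normal(size=(200, 7)))        # 7 postural angles (rad)

X = np.column_stack([np.ones(len(hand_xyz)), hand_xyz])   # add intercept
coef, *_ = np.linalg.lstsq(X, angles, rcond=None)         # 4 x 7 coefficients

predicted = X @ coef    # predicted instantaneous postures
```

Step (b) would then submit representative instantaneous postures drawn from such a fit to analyses of variance across the task conditions.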


2020 ◽  
Vol 20 (1) ◽  
pp. 415-424 ◽  
Author(s):  
Shufan Wang ◽  
Limin Zhang ◽  
Fei Sun ◽  
Zining Dong ◽  
Feng Yan ◽  
...  

2019 ◽  
Vol 30 (4) ◽  
pp. 2659-2673
Author(s):  
Shaun L Cloherty ◽  
Jacob L Yates ◽  
Dina Graf ◽  
Gregory C DeAngelis ◽  
Jude F Mitchell

Visual motion processing is a well-established model system for studying neural population codes in primates. The common marmoset, a small New World primate, offers unparalleled opportunities to probe these population codes in key motion processing areas, such as cortical areas MT and MST, because these areas are accessible for imaging and recording at the cortical surface. However, little is currently known about the perceptual abilities of the marmoset. Here, we introduce a paradigm for studying motion perception in the marmoset and compare their psychophysical performance with that of human observers. We trained two marmosets to perform a motion estimation task in which they provided an analog report of their perceived direction of motion with an eye movement to a ring that surrounded the motion stimulus. Marmosets and humans exhibited similar trade-offs between speed and accuracy: errors were larger and reaction times were longer as the strength of the motion signal was reduced. Reverse correlation on the temporal fluctuations in motion direction revealed that both species exhibited short integration windows; however, marmosets had substantially less nondecision time than humans. Our results provide the first quantification of motion perception in the marmoset and demonstrate several advantages to using analog estimation tasks.
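The reverse-correlation step can be illustrated compactly: frame-by-frame fluctuations in stimulus direction are regressed against the reported direction to recover a temporal weighting kernel. The sketch below uses simulated data and is not the authors' analysis code.

```python
import numpy as np

# Reverse-correlation sketch: recover the temporal weighting of
# direction fluctuations from trial-by-trial analog reports.
rng = np.random.default_rng(2)
n_trials, n_frames = 2000, 20
fluct = rng.normal(0, 10, size=(n_trials, n_frames))   # deg around the mean

true_kernel = np.exp(-np.arange(n_frames) / 4.0)       # early frames weigh more
reports = (fluct @ true_kernel / true_kernel.sum()
           + rng.normal(0, 5, size=n_trials))          # decision noise

# Least-squares estimate of the temporal kernel from stimulus and report
kernel_hat, *_ = np.linalg.lstsq(fluct, reports, rcond=None)
```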


Vision ◽  
2019 ◽  
Vol 3 (4) ◽  
pp. 64
Author(s):  
Martin Lages ◽  
Suzanne Heron

Like many predators, humans have forward-facing eyes that are set a short distance apart so that an extensive region of the visual field is seen from two different points of view. The human visual system can establish a three-dimensional (3D) percept from the projection of images into the left and right eye. How the visual system integrates local motion and binocular depth in order to accomplish 3D motion perception is still under investigation. Here, we propose a geometric-statistical model that combines noisy velocity constraints with a spherical motion prior to solve the aperture problem in 3D. In two psychophysical experiments, it is shown that instantiations of this model can explain how human observers disambiguate 3D line motion direction behind a circular aperture. We discuss the implications of our results for the processing of motion and dynamic depth in the visual system.
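In spirit, the model's estimate can be written as a MAP solution that trades a noisy constraint on the velocity component normal to the line against a prior favoring slow motion. The sketch below simplifies this to an isotropic Gaussian prior in 3D (the paper's spherical motion prior is more structured), and all parameter names are assumptions.

```python
import numpy as np

# MAP sketch for the 3D aperture problem: combine a noisy constraint
# n . v = c (only the normal velocity component is measured) with a
# zero-mean Gaussian prior on 3D velocity.
def map_velocity(n, c, sigma_like=0.1, sigma_prior=1.0):
    """MAP 3D velocity given constraint n . v = c (n: unit normal)."""
    n = np.asarray(n, float)
    A = np.outer(n, n) / sigma_like**2 + np.eye(3) / sigma_prior**2
    b = (c / sigma_like**2) * n
    return np.linalg.solve(A, b)

n = np.array([1.0, 0.0, 0.0])       # constraint normal (unit vector)
print(map_velocity(n, c=0.5))       # estimate pulled toward slow motion
```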

