slant perception
Recently Published Documents

TOTAL DOCUMENTS: 80 (five years: 5)
H-INDEX: 20 (five years: 1)

2020 · Vol. 20 (11) · pp. 1004
Author(s): Zhongting Chen, Ping Yang, Jeffrey Saunders

2020 · Vol. 123 (4) · pp. 1407-1419
Author(s): Evan Cesanek, Jordan A. Taylor, Fulvio Domini

Visually guided movements can show surprising accuracy even when the perceived three-dimensional (3D) shape of the target is distorted. One explanation of this paradox is that an evolutionarily specialized “vision-for-action” system provides accurate shape estimates by relying selectively on stereo information and ignoring less reliable sources of shape information like texture and shading. However, the key support for this hypothesis has come from studies that analyze average behavior across many visuomotor interactions where available sensory feedback reinforces stereo information. The present study, which carefully accounts for the effects of feedback, shows that visuomotor interactions with slanted surfaces are actually planned using the same cue-combination function as slant perception and that apparent dissociations can arise from two distinct supervised learning processes: sensorimotor adaptation and cue reweighting. In two experiments, we show that when a distorted slant cue biases perception (e.g., surfaces appear flattened by a fixed amount), sensorimotor adaptation rapidly adjusts the planned grip orientation to compensate for this constant error. However, when the distorted slant cue is unreliable, leading to variable errors across a set of objects (i.e., some slants are overestimated, others underestimated), relative cue weights are gradually adjusted to reduce the misleading effect of the unreliable cue, consistent with previous perceptual studies of cue reweighting. The speed and flexibility of these two forms of learning provide an alternative explanation of why perception and action are sometimes found to be dissociated in experiments where some 3D shape cues are consistent with sensory feedback while others are faulty.

NEW & NOTEWORTHY: When interacting with three-dimensional (3D) objects, sensory feedback is available that could improve future performance via supervised learning. Here we confirm that natural visuomotor interactions lead to sensorimotor adaptation and cue reweighting, two distinct learning processes uniquely suited to resolve errors caused by biased and noisy 3D shape cues. These findings explain why perception and action are often found to be dissociated in experiments where some cues are consistent with sensory feedback while others are faulty.
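
The "cue-combination function" invoked here is often formalized as reliability-weighted averaging, in which each cue's weight is proportional to the inverse of its variance, and "cue reweighting" as a gradual adjustment of those weights when one cue proves misleading. The sketch below illustrates only that standard textbook model; the function names, variances, error signals, and learning rate are illustrative assumptions, not quantities or code from this study.

```python
import numpy as np

def weights_from_variances(var_stereo, var_texture):
    """Maximum-likelihood cue weights: each weight is proportional to 1/variance."""
    inv = np.array([1.0 / var_stereo, 1.0 / var_texture])
    return inv / inv.sum()

def combine_slant_cues(stereo_slant, texture_slant, w_stereo, w_texture):
    """Reliability-weighted average of two slant estimates (degrees)."""
    return w_stereo * stereo_slant + w_texture * texture_slant

def reweight(w_stereo, stereo_err, texture_err, rate=0.05):
    """Toy cue-reweighting rule: shift weight toward the cue whose prediction
    disagreed less with feedback (e.g., haptic contact with the surface)."""
    shift = rate * (abs(texture_err) - abs(stereo_err))
    w_stereo = float(np.clip(w_stereo + shift, 0.0, 1.0))
    return w_stereo, 1.0 - w_stereo

# Illustrative values (assumptions, not data from the study):
w_stereo, w_texture = weights_from_variances(var_stereo=4.0, var_texture=16.0)
perceived = combine_slant_cues(stereo_slant=30.0, texture_slant=20.0,
                               w_stereo=w_stereo, w_texture=w_texture)
print(f"weights = ({w_stereo:.2f}, {w_texture:.2f}), perceived slant = {perceived:.1f} deg")
```

Sensorimotor adaptation, by contrast, would be modeled as an additive correction to the planned grip orientation rather than a change in these weights, which is the distinction the abstract draws between the two learning processes.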


2019 · Vol. 63 (6) · pp. 60409-1-60409-11
Author(s): Jonathan Tong, Robert S. Allison, Laurie M. Wilcox

Abstract: Modern virtual reality (VR) headsets use lenses that distort the visual field, typically with distortion increasing with eccentricity. While content is pre-warped to counter this radial distortion, residual image distortions remain. Here we examine the extent to which such residual distortion impacts the perception of surface slant. In Experiment 1, we presented slanted surfaces in a head-mounted display and observers estimated the local surface slant at different locations. In Experiments 2 (slant estimation) and 3 (slant discrimination), we presented stimuli on a mirror stereoscope, which allowed us to control viewing and distortion parameters more precisely. Taken together, our results show that radial distortion has a significant impact on perceived surface attitude, even following correction. Of the distortion levels we tested, 5% distortion results in significantly underestimated and less precise slant estimates relative to distortion-free surfaces. In contrast, Experiment 3 reveals that a level of 1% distortion is insufficient to produce significant changes in slant perception. Our results highlight the importance of adequately modeling and correcting lens distortion to improve the VR user experience.
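
Residual radial distortion of the kind tested here is commonly summarized by the percentage displacement of points at the edge of the field under a polynomial model of the form r' = r(1 + k·r²). The sketch below is a generic illustration of that model under assumed values; the coefficient k and the helper functions are hypothetical, not the headset's actual calibration or the authors' stimulus-generation code.

```python
import numpy as np

def apply_radial_distortion(x, y, k):
    """Map undistorted normalized image coordinates to distorted ones using a
    one-coefficient polynomial model: r' = r * (1 + k * r**2).
    Positive k stretches the periphery (pincushion); negative k compresses it (barrel)."""
    r2 = x**2 + y**2
    scale = 1.0 + k * r2
    return x * scale, y * scale

def percent_distortion_at_edge(k, r_edge=1.0):
    """Percentage displacement at the edge of the field, one common way to
    express distortion levels such as the 1% and 5% conditions above."""
    return 100.0 * abs(k) * r_edge**2

k = 0.05  # assumed coefficient, chosen so the edge displacement is 5%
print(f"{percent_distortion_at_edge(k):.1f}% displacement at the field edge")
x_d, y_d = apply_radial_distortion(0.5, 0.5, k)  # a point halfway to the corner
```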


2019 · Vol. 19 (10) · pp. 177d
Author(s): Pin Yang, Zhongting Chen, Jeffrey Allen Saunders

2019 · Vol. 19 (10) · pp. 222a
Author(s): Jonathan Tong, Robert S Allison, Laurie M Wilcox

2018 · Vol. 18 (10) · pp. 129
Author(s): Xiaoye Wang, Mats Lind, Geoffrey Bingham

2017 · Vol. 17 (14) · pp. 4
Author(s): Baptiste Caziot, Benjamin T. Backus, Esther Lin

Author(s): Samantha Horvath, Kori Macdonald, John Galeotti, Roberta L. Klatzky

Objective: These studies used threshold and slant-matching tasks to quantitatively measure human perception of 3-D planar surfaces viewed through a stereomicroscope. The results are intended for use in developing augmented-reality surgical aids.
Background: Substantial research demonstrates that slant perception is performed with high accuracy from monocular and binocular cues, but less research concerns the effects of magnification. Viewing through a microscope affects the utility of monocular and stereo slant cues, but its impact is as yet unknown.
Method: Participants performed a threshold slant-detection task and matched the slant of a tool to a surface. Different stimuli and monocular versus binocular viewing conditions were implemented to isolate stereo cues alone, stereo with perspective cues, the accommodation cue only, and cues intrinsic to optical-coherence-tomography images.
Results: At a magnification of 5x, slant thresholds with stimuli providing stereo cues approximated those reported for direct viewing, about 12°. Most participants (75%) who passed a stereoacuity pretest could match a tool to the slant of a surface viewed with stereo at 5x magnification, with a mean compressive error of about 20% for optimized surfaces. Slant matching to optical-coherence-tomography images of the cornea viewed under the microscope was also demonstrated.
Conclusion: Despite the distortions and cue loss introduced by viewing under the stereomicroscope, most participants were able to detect and interact with slanted surfaces.
Application: The experiments demonstrated sensitivity to surface slant that supports the development of augmented-reality systems to support microscope-aided surgery.
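
The "compressive error of about 20%" can be read as matched slants growing only about 0.8° for each 1° of physical slant; one common way to quantify this is the slope of a linear fit of matched slant against true slant. The sketch below shows that calculation on made-up illustrative numbers, not the study's data.

```python
import numpy as np

# Made-up example data (degrees): physical slants and the slants a participant
# set the tool to when matching each surface.
true_slant = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
matched_slant = np.array([8.5, 16.0, 24.5, 31.0, 40.0])

# Least-squares slope of matched vs. true slant; a slope below 1 indicates
# compression (slants systematically underestimated).
slope, intercept = np.polyfit(true_slant, matched_slant, 1)
compression_percent = 100.0 * (1.0 - slope)
print(f"gain = {slope:.2f}, compressive error ≈ {compression_percent:.0f}%")
```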

