frontoparallel plane
Recently Published Documents


TOTAL DOCUMENTS: 47 (five years: 1)
H-INDEX: 12 (five years: 0)

2019 ◽  
Vol 154 ◽  
pp. 97-104
Author(s):  
Ekaterina Koshmanova ◽  
Tadamasa Sawada

2015 ◽  
Vol 113 (9) ◽  
pp. 3197-3208 ◽  
Author(s):  
Bernhard J. M. Hess ◽  
H. Misslisch

We have analyzed the three-dimensional spatiotemporal characteristics of saccadic refixations between far and near targets in three behaviorally trained rhesus monkeys. The kinematics underlying these rapid eye movements can be accurately described by rotations of the eyes in four different planes: first, disconjugate rotations in the horizontal plane of regard converging the eyes toward the near target; then rotations in each eye's vertical direction plane; and finally, disconjugate rotations in a common frontoparallel plane. This compounded rotation of the eye underlay an initially fast-rising, variable torsion that typically overshot the final torsion the eyes attained at the time of target acquisition. The torsion consisted of a coarse, widely varying component of opposite polarity in the two eyes, which contained a more robust, much smaller modulation that increased sharply toward the end of the saccades. The reorientation of the eyes in torsion depended on each eye's azimuth, elevation, and target distance. We conclude that refixation saccades are generated by motor commands that control ocular torsion in concert with the saccade generator, which operates in the Donders-Listing kinematics underlying Listing's law.
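The staged decomposition described above can be sketched as quaternion composition. This is a minimal illustration only: the angles are arbitrary and the axis convention (z for horizontal rotations, y for vertical, x for torsion about the line of sight) is an assumption, not the study's coordinate frame.

```python
import numpy as np

def quat(axis, angle_deg):
    """Unit quaternion (w, x, y, z) for a rotation about a unit axis."""
    half = np.radians(angle_deg) / 2.0
    return np.concatenate(([np.cos(half)], np.sin(half) * np.asarray(axis, float)))

def qmul(a, b):
    """Quaternion product a*b (apply rotation b first, then a)."""
    aw, av = a[0], a[1:]
    bw, bv = b[0], b[1:]
    return np.concatenate(([aw * bw - av @ bv],
                           aw * bv + bw * av + np.cross(av, bv)))

horizontal = quat([0, 0, 1], 8.0)   # convergence in the horizontal plane of regard
vertical   = quat([0, 1, 0], -5.0)  # rotation in the eye's vertical direction plane
torsion    = quat([1, 0, 0], 1.5)   # cyclotorsion in a frontoparallel plane

# Compounded eye rotation: horizontal stage first, torsional stage last.
compound = qmul(torsion, qmul(vertical, horizontal))
print(np.round(compound, 4))  # the compound rotation carries a nonzero torsional component
```

Note that even modest horizontal and vertical components interact in the product, so the final torsion of the compound rotation is not simply the 1.5 degrees of the last stage.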


2013 ◽  
Vol 26 (3) ◽  
pp. 205-239 ◽  
Author(s):  
Adam Y. Shavit ◽  
Wenxun Li ◽  
Leonard Matin

The frontoparallel orientation of a long peripheral line influences two visual norms: elevation, also called the visual perception of eye level (VPEL), and orientation in the frontoparallel plane, called the visually perceived vertical (VPV). However, VPEL and VPV are distinct in that different integration rules describe the combinatorial effects of two lines symmetrically located on opposite sides of the median plane. Nevertheless, we propose that the same orientation-sensitive process underlies the two discriminations. We measured the two norms while manipulating visual orientation with 1-line and 2-line stimuli (on opposite sides of the median plane), then modeled the large and significant effect of line orientation on VPEL and VPV settings as linear averages of signals from vision and from non-visual, body-referenced, vestibular and proprioceptive mechanisms. Significant correlations are evident between observers in the effect of visual orientation on both VPEL and VPV, and in the baseline measures (dark value, intercept) on both norms. The latter egocentric bias is further discussed in the context of the operation of the body-referenced mechanism across different egocentric discriminations for an individual subject. Given the evidence for different integration rules, the pattern of individual co-variation implies the existence of a single, shared visual-orientation process that feeds into separate integration processes.
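The linear-averaging scheme described above can be sketched as a weighted sum of a visual signal and a body-referenced signal. The weight, the zero body-referenced estimate, and the dark-value bias below are illustrative assumptions, not the fitted values from the study.

```python
def perceived_norm(line_orientation_deg, w_visual=0.4, dark_value_deg=-1.0):
    """Linear average of a visual signal (inducing-line orientation) and a
    body-referenced signal, plus a baseline bias (the 'dark value')."""
    body_signal = 0.0                     # assumed body-referenced estimate of the norm
    visual_signal = line_orientation_deg  # orientation of the peripheral line
    return w_visual * visual_signal + (1 - w_visual) * body_signal + dark_value_deg

# A 20 deg tilted line shifts the setting by w_visual * 20 = 8 deg plus the bias.
print(perceived_norm(20.0))  # -> 7.0
```

With no line present (orientation contribution zero), the setting reduces to the dark value, matching the role of the intercept in the abstract's baseline measures.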


2010 ◽  
Vol 20 (04) ◽  
pp. 267-278 ◽  
Author(s):  
NIKOLAY CHUMERIN ◽  
AGOSTINO GIBALDI ◽  
SILVIO P. SABATINI ◽  
MARC M. VAN HULLE

We present two neural models for vergence angle control of a robotic head: a simplified one and a more complex one. Both models work in a closed-loop manner and do not rely on explicitly computed disparity; instead, they extract the desired vergence angle from the post-processed response of a population of disparity-tuned complex cells, the actual gaze direction, and the actual vergence angle. The first model assumes that the gaze direction of the robotic head is orthogonal to its baseline and that the stimulus is a frontoparallel plane orthogonal to the gaze direction. The second model goes beyond these assumptions and operates reliably in the general case, where all restrictions on the orientation of the gaze, as well as on the stimulus position, type, and orientation, are dropped.
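Under the first model's simplifying assumptions (gaze orthogonal to the baseline, frontoparallel stimulus), the target vergence angle reduces to simple trigonometry. A sketch with an illustrative baseline and viewing distance:

```python
import math

def desired_vergence_deg(baseline_m, distance_m):
    """Vergence angle for symmetric fixation of a point at the given distance,
    with gaze orthogonal to the baseline: 2 * atan(baseline / (2 * distance))."""
    return math.degrees(2.0 * math.atan(baseline_m / (2.0 * distance_m)))

# e.g. a 6.5 cm baseline fixating a frontoparallel plane 0.5 m away:
print(round(desired_vergence_deg(0.065, 0.5), 2))  # -> 7.44
```

The closed-loop controllers in the paper converge on this angle from population responses rather than evaluating the formula directly; the geometry only defines the equilibrium they should reach.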


2010 ◽  
Vol 1 (3) ◽  
pp. 324-324
Author(s):  
V. Cornilleau-Peres ◽  
L.C. Tai ◽  
L. -F. Cheong

Perception ◽  
10.1068/p5641 ◽  
2007 ◽  
Vol 36 (7) ◽  
pp. 980-989 ◽  
Author(s):  
J Farley Norman ◽  
Elizabeth Y Wiesemann ◽  
Hideko F Norman ◽  
M Jett Taylor ◽  
Warren D Craft

The sensitivity of observers to nonrigid bending was evaluated in two experiments. In both, observers were required to discriminate on any given trial which of two bending rods was more elastic. In experiment 1, both rods bent within the same oriented plane, either a frontoparallel plane or a plane oriented in depth. In experiment 2, the two rods within any given trial bent in different, randomly chosen orientations in depth. The results of both experiments revealed that human observers are sensitive to, and can reliably detect, relatively small differences in bending (the average Weber fraction across experiments 1 and 2 was 9.0%). The performance of the human observers was compared to that of models that based their elasticity judgments upon either static projected curvature or mean and maximal projected speed. Although all of the observers reported compelling 3-D perceptions of bending in depth, their judgments were both qualitatively and quantitatively consistent with the performance of the models. This similarity suggests that relatively straightforward information about the elasticity of simple bending objects is available in projected retinal images.
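The Weber fraction reported above is the just-discriminable increment divided by the standard magnitude. A minimal sketch with illustrative values (not the study's stimuli):

```python
def weber_fraction(threshold_increment, standard):
    """Ratio of the just-discriminable difference to the standard magnitude."""
    return threshold_increment / standard

# A just-discriminable elasticity difference of 0.09 units against a
# standard of 1.0 gives the 9% average fraction cited in the abstract.
print(weber_fraction(0.09, 1.0))  # -> 0.09
```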


2007 ◽  
Vol 69 (2) ◽  
pp. 276-286 ◽  
Author(s):  
Robert Volcic ◽  
Astrid M. L. Kappers ◽  
Jan J. Koenderink

2006 ◽  
Vol 9 (2) ◽  
pp. 273-284 ◽  
Author(s):  
J. Antonio Aznar-Casanova ◽  
Elton H. Matsushima ◽  
Nilton P. Ribeiro-Filho ◽  
José A. Da Silva

The aim of this study is twofold: first, to determine how visual space, as assessed by exocentric distance estimates, is related to physical space; and second, to determine the structure of visual space so assessed. Visual space was measured in three environments: (a) points located in a 2-D frontoparallel plane, covering a range of distances of 20 cm; (b) stakes placed in a 3-D virtual space (range ≈ 330 mm); and (c) stakes in a 3-D outdoor open field (range = 45 m). Observers made matching judgments of the distances between all possible pairs of stimuli, drawn from 16 stimuli arranged in a regular 4 × 4 square matrix. Two quantities derived from fitting Stevens' power law informed us about the distortion of visual space: its exponent and its coefficient of determination (R²). The results showed a ranking of the magnitude of the distortions found in each experimental environment, and also provided information about the efficacy of the available visual cues to spatial layout. Furthermore, our data agree with previous findings of systematic perceptual errors: the farther the stimuli, the larger the distortion of the area subtended by the perceived distances between stimuli. Additionally, we measured the magnitude of the distortion of visual space relative to physical space by a parameter of multidimensional scaling analyses, the RMSE. From these results, the magnitude of such distortions can be ranked, and the utility or efficacy of the available visual cues informing about the spatial layout can be inferred.
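The power-law fit the abstract relies on is conventionally done by linear regression in log-log coordinates, where the slope is the Stevens exponent and the fit quality gives R². The data below are synthetic, not the study's measurements:

```python
import numpy as np

# Synthetic matched distances following P = k * D**n with compression (n < 1).
physical = np.array([1.0, 2.0, 4.0, 8.0, 16.0])  # physical exocentric distances
perceived = 1.2 * physical ** 0.85               # synthetic compressed judgments

# Stevens' power law is linear in log-log coordinates: log P = n*log D + log k.
logD, logP = np.log(physical), np.log(perceived)
n, log_k = np.polyfit(logD, logP, 1)             # slope = exponent, intercept = log k

residuals = logP - (n * logD + log_k)
r2 = 1 - residuals.var() / logP.var()            # coefficient of determination

print(round(n, 2), round(np.exp(log_k), 2), round(r2, 3))
```

An exponent below 1 corresponds to the compression of far distances the abstract describes, and a lower R² would indicate departures from a clean power-law structure.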

