Just-noticeable difference in the speed of cyclopean motion in depth and the speed of cyclopean motion within a frontoparallel plane.

Author(s):  
Christine V. Portfors ◽  
David Regan
Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 138-138


Author(s):  
G Ishimura

Transversal hand action in the frontoparallel plane biases the perception of bistable visual motion, an effect that has been called action capture. In daily behaviour, however, hand action in a ‘radial’ direction from the head may be more important, because we frequently reach for an object in front of us while guiding the action with vision. The purpose of this study was to measure the strength of action capture in the radial direction. Horizontal luminance gratings were placed above and below the fixation point. Binocular disparity, perspective contour, and spatial-frequency gradient cues were added to the gratings so that they simulated the ‘ceiling’ and the ‘floor’ of a long corridor. The display was reflected in a tilted mirror so that it faced upward. The subject looked into the display and moved his/her dominant hand toward, or away from, the face behind the mirror. Just after the action onset, detected by the computer, one of the two gratings (the ceiling or the floor) flickered briefly to simulate bistable visual motion in depth (approaching or departing). The subject indicated the perceived motion direction in the frontoparallel plane using a 2AFC (upward or downward) method. The results showed that perceived motion was significantly biased toward the ‘departing’ direction when the hand moved ‘away from’ the face, and toward the ‘approaching’ direction when the hand moved ‘toward’ it. It is concluded that action capture occurs not only for transversal but also for radial hand movements.
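The abstract does not describe the statistics behind "significantly biased"; for 2AFC data from a bistable display, a common approach is a binomial test of the proportion of ‘departing’ responses against the 50% chance level in each hand-movement condition. A minimal sketch with invented counts (not the study's data):

```python
from scipy.stats import binomtest

# Illustrative response counts, invented for this sketch:
# (number of 'departing' judgments, total trials) per condition.
conditions = {"hand away from face": (41, 50), "hand toward face": (12, 50)}

for name, (k, n) in conditions.items():
    # Test whether the 'departing' proportion deviates from the 50% bistable baseline.
    result = binomtest(k, n, p=0.5)
    print(f"{name}: {k}/{n} 'departing', p = {result.pvalue:.4f}")
```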


Perception ◽  
1982 ◽  
Vol 11 (2) ◽  
pp. 187-199 ◽  
Author(s):  
Walter C Gogel ◽  
Bernard W Griffin

Induced motion is not limited to continuous motions presented in a frontoparallel plane. Experiments were conducted to investigate several varieties of induced motion to which theories of induced motion must apply. The observer indicated the perceived path of motion of a vertically moving test point to which induced motion at right angles to the physical motion was added by the motion of two inducing points. In experiment 1 all motions (both apparent and physical) were in a frontoparallel plane. It was found that discrete displacement as well as continuous motion of the test and inducing points produced substantial amounts of induction. In experiment 2 the inducing points were moved continuously in stereoscopic distance rather than remaining in an apparent frontoparallel plane. A large amount of apparent motion in depth was found in the vertically moving test point and was interpreted as an induced motion in depth. In experiment 3 an alternative interpretation of the phenomenon of experiment 2, in terms of an apparent vergence of the two images of the test point, was investigated and found to be unlikely. In experiment 4, with all the points moving continuously in a frontoparallel plane, eye movements as well as induced motions were measured, with the observer fixating either the test point or an inducing point. Substantial amounts of induction were obtained under both conditions of fixation. The consequences of these findings for theories of induced motion are discussed.


2008 ◽  
Vol 128 (7) ◽  
pp. 1015-1022
Author(s):  
Sheng Ge ◽  
Makoto Ichikawa ◽  
Atsushi Osa ◽  
Keiji Iramina ◽  
Hidetoshi Miike

2019 ◽  
Vol 2019 (1) ◽  
pp. 80-85
Author(s):  
Pooshpanjan Roy Biswas ◽  
Alessandro Beltrami ◽  
Joan Saez Gomez

To reproduce colors on one system whose color gamut differs from that of another, a color gamut mapping process is necessary. Color gamut mapping translates colors from one medium (screen, digital camera, scanner, digital file, etc.) to another system with a different gamut volume. The International Color Consortium defines different rendering intent options [5] to serve the different reproduction goals of the user [19]. Any rendering intent used to reproduce colors embodies profile-engine decisions about how to do so, e.g., prioritizing color accuracy, vivid colors, or pleasing reproduction of images. With the same decisions applied in different profile engines, the final visual output can differ by more than one Just Noticeable Difference [16], depending on the engine used and the color algorithms it implements. Profile performance thus substantially depends on the profiler engine used to create the profiles. Different profilers give the user varying degrees of freedom to design a profile for their color-management needs and preferences. The motivation of this study is to rank the performance of various market-leading profiler engines on the basis of different metrics designed specifically to report the performance of particular aspects of these profiles. The study allowed us to make informed decisions about profile performance, and to choose the best profiler engine, without any visual assessment.
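The abstract does not specify which color-difference metric underlies its Just Noticeable Difference criterion. As an illustration only, the following sketch uses the simple CIE76 ΔE*ab formula (Euclidean distance in CIELAB), for which a difference of roughly 2.3 units is a commonly cited rule of thumb for one JND; the color values and the two "engine" outputs are invented for the example:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# The same input color as rendered by two hypothetical profile engines
# (illustrative (L*, a*, b*) values, not measured data).
engine_a = (52.0, 41.5, -10.2)
engine_b = (50.8, 43.9, -8.7)

de = delta_e_cie76(engine_a, engine_b)
jnd = 2.3  # common rule-of-thumb threshold for ~1 JND under CIE76
print(f"deltaE*ab = {de:.2f} ({'above' if de > jnd else 'within'} ~1 JND)")
```

Production profilers typically use newer metrics such as CIEDE2000, which weight lightness, chroma, and hue differences non-uniformly; CIE76 is shown here only because it makes the distance computation explicit.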


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Nadia Paraskevoudi ◽  
Iria SanMiguel

The ability to distinguish self-generated stimuli from those caused by external sources is critical for all behaving organisms. Although many studies point to a sensory attenuation of self-generated stimuli, recent evidence suggests that motor actions can result in either attenuated or enhanced perceptual processing depending on the environmental context (i.e., stimulus intensity). The present study employed 2-AFC sound detection and loudness discrimination tasks to test whether sound source (self- or externally-generated) and stimulus intensity (supra- or near-threshold) interactively modulate detection ability and loudness perception. Self-generation did not affect detection and discrimination sensitivity (i.e., detection thresholds and Just Noticeable Difference, respectively). However, in the discrimination task, we observed a significant interaction between self-generation and intensity on perceptual bias (i.e., Point of Subjective Equality). Supra-threshold self-generated sounds were perceived as softer than externally-generated ones, while at near-threshold intensities self-generated sounds were perceived as louder than externally-generated ones. Our findings provide empirical support for recent theories on how predictions and signal intensity modulate perceptual processing, pointing to interactive effects of intensity and self-generation that seem to be driven by a biased estimate of perceived loudness rather than by changes in detection and discrimination sensitivity.
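The PSE and JND reported here are standard readouts of a psychometric function. As a hedged illustration of how they are typically extracted from 2-AFC discrimination data, the sketch below fits a cumulative Gaussian and takes the 50% point as the PSE and the 75%-minus-50% distance as one common JND definition; the response data are invented, not the study's:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: P('comparison louder') vs. comparison level."""
    return norm.cdf(x, loc=mu, scale=sigma)

# Illustrative 2-AFC data: comparison intensity (dB re standard) and the
# proportion of 'comparison louder' responses at each level (invented).
levels = np.array([-6.0, -4.0, -2.0, 0.0, 2.0, 4.0, 6.0])
p_louder = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.98])

(mu, sigma), _ = curve_fit(psychometric, levels, p_louder, p0=[0.0, 2.0])

pse = mu  # 50% point: Point of Subjective Equality
jnd = norm.ppf(0.75, loc=mu, scale=sigma) - pse  # 75% - 50% distance
print(f"PSE = {pse:.2f} dB, JND = {jnd:.2f} dB")
```

A shift in the fitted mu (PSE) with unchanged sigma (JND) corresponds to the pattern the abstract describes: a bias in perceived loudness without a change in discrimination sensitivity.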


Perception ◽  
1995 ◽  
Vol 24 (9) ◽  
pp. 995-1010 ◽  
Author(s):  
Emiel Reith ◽  
Chang Hong Liu

Adult subjects drew the visual projection of two models. One model was a trapezoid placed in the frontoparallel plane. The other was a tilted rectangle which displayed the same projective shape on a frontoparallel plane as the trapezoid. The drawing conditions were varied in two ways: the model remained available for inspection during the drawing task or it was masked after initial inspection; the subjects drew on paper placed flat on the table or on a vertical glass pane placed in front of the model (ie on a da Vinci window). The results were that (i) the projective shape of the frontoparallel trapezoid was reproduced accurately whereas that of the tilted rectangle was systematically distorted in the direction of its actual physical dimensions; (ii) when subjects drew on paper, the presence or absence of a view of the model made no difference to the amount of distortion; (iii) drawing on a da Vinci window improved accuracy even when the model was hidden. These findings provide information about the relative roles of object-centred knowledge, perceptual abilities, and depiction skills in drawing performance.


1977 ◽  
Vol 23 (6) ◽  
pp. 606-611
Author(s):  
Shlomo Globerson

2015 ◽  
Vol 19 (5) ◽  
pp. 1044-1052 ◽  
Author(s):  
Lu-Lu Zhang ◽  
Lei Zhao ◽  
Hou-Yin Wang ◽  
Rui-Cong Zhi ◽  
Bo-Lin Shi ◽  
...  
