Subjective zeros, subjectively equal stimulus spacing, and contraction biases in very first judgments of lightness

1985 ◽  
Vol 37 (5) ◽  
pp. 420-428 ◽  
Author(s):  
E. C. Poulton ◽  
D. C. V. Simmonds

1965 ◽  
Vol 21 (1) ◽  
pp. 151-156 ◽  
Author(s):  
Jacob Beck ◽  
William A. Shaw

Experiments studied magnitude estimations of loudness as a function of the magnitude of the standard, the numerical value of the standard, and the stimulus spacing. The results indicate that (a) the magnitude and numerical value of the standard may have marked effects on the judgment of loudness and (b) the distribution of the stimuli does not produce notable effects on the over-all form of the loudness function. Several possible sources of bias were investigated.



2015 ◽  
Vol 40 (4) ◽  
pp. 585-594
Author(s):  
Sławomir Zieliński

The multi-stimulus test with hidden reference and anchors (MUSHRA) is commonly used for subjective quality assessment of audio systems. Despite its wide acceptance in scientific and industrial sectors, the method is not free from bias. One possible source of bias in the MUSHRA method may be attributed to the graphical design of its user interface. This paper examines the hypothesis that replacing the standard multi-slider layout with a single-slider version could reduce the stimulus-spacing bias observed in the MUSHRA test. Contrary to expectation, this modification did not reduce the bias. This outcome formally supports the validity of using multiple sliders in the MUSHRA graphical interface.



1967 ◽  
Vol 25 (3) ◽  
pp. 797-802
Author(s):  
Donald A. Shurtleff ◽  
David I. Mostofsky

Stimulus discrimination gradients were compared for conditions in which intervening (test) stimuli were spaced equally between the positive (S+) and negative (S−) stimuli and when test stimuli were spaced unequally between S+ and S−. When the gradients are plotted as a function of the physical units of the stimulus dimension, no differences attributable to stimulus spacing are evident. The results further suggested that the Ss responded to intervening stimuli according to their arrangement on an interval scale rather than to their arrangement on an ordinal scale.



1999 ◽  
Vol 27 (4) ◽  
pp. 472-480 ◽  
Author(s):  
John M. Hinson ◽  
Cari B. Cannon


1958 ◽  
Vol 56 (3) ◽  
pp. 246-250 ◽  
Author(s):  
Joseph C. Stevens


1996 ◽  
Vol 49 (1b) ◽  
pp. 24-44 ◽  
Author(s):  
J.H. Wearden ◽  
A. Ferrara

Two experiments with human subjects, using short-duration tones as stimuli to be judged, investigated the effect of the range of the stimulus set on temporal bisection performance. In Experiment 1, six groups of subjects were tested on a temporal bisection task, where each stimulus had to be classified as “short” or “long”. For three groups, the difference between the longest (L) and shortest (S) durations in the to-be-bisected stimulus set was kept constant at 400 msec, and the L/S ratio was varied over values of 5:1 and 2:1. For three other groups, the L/S ratio was kept constant at 4:1 but the L-S difference varied from 300 to 600 msec. The bisection point (the stimulus value resulting in 50% “long” responses) was located closer to the arithmetic mean of L and S than the geometric mean for all groups except that for which the L/S ratio was 2:1, in which case geometric mean bisection was found. In Experiment 2, stimuli were spaced between L and S either linearly or logarithmically, and the L/S ratio took values of either 2:1 or 19:1. Geometric mean bisection was found in both cases when the L/S ratio was 2:1, but effects of stimulus spacing were found only when the L/S ratio was 19:1. Overall, the results supported a previous conjecture that the L/S ratio used in a bisection task plays a critical role in determining the behaviour obtained. A theoretical model of bisection advanced by Wearden (1991) dealt appropriately with the bisection point shifts described above but encountered difficulties with stimulus spacing effects.
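The arithmetic-versus-geometric-mean distinction that this abstract turns on can be illustrated numerically. The sketch below uses hypothetical duration values, not data from the paper; only the definitions of the two candidate bisection landmarks are taken from the abstract:

```python
import math

def bisection_landmarks(short_ms, long_ms):
    """Return the arithmetic and geometric means of the shortest (S)
    and longest (L) durations in a temporal bisection stimulus set.
    These are the two candidate locations for the bisection point."""
    arithmetic = (short_ms + long_ms) / 2
    geometric = math.sqrt(short_ms * long_ms)
    return arithmetic, geometric

# Hypothetical 4:1 L/S ratio: S = 200 ms, L = 800 ms.
am, gm = bisection_landmarks(200, 800)
print(am)  # 500.0 (arithmetic mean)
print(gm)  # 400.0 (geometric mean)
```

For large L/S ratios the two landmarks diverge widely, which is why the location of the empirical bisection point can discriminate between them; at a 2:1 ratio (e.g. 400 ms vs 800 ms, giving 600.0 vs about 565.7) they lie close together.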



2020 ◽  
Author(s):  
Andra Mihali ◽  
Wei Ji Ma

Visual search is one of the most ecologically important perceptual task domains. One research tradition has studied visual search using simple, parametric stimuli and a signal detection theory or Bayesian modeling framework. However, this tradition has mostly focused on homogeneous distractors (identical to each other), which are not very realistic. In a different tradition, Duncan and Humphreys (1989) conducted a landmark study on visual search with heterogeneous distractors. However, they used complex stimuli, making modeling and dissociation of component processes difficult. Here, we attempt to unify these research traditions by systematically examining visual search with heterogeneous distractors using simple, parametric stimuli and Bayesian modeling. Our experiment varied multiple factors that could influence performance: set size, task (N-AFC localization vs detection), whether the target was revealed before or after the search array (perception versus memory), and stimulus spacing. We found that performance robustly decreased with increasing set size. When examining within-trial summary statistics, we found that the minimum target-to-distractor feature difference was a stronger predictor of behavior than the mean target-to-distractor difference and than distractor variance. To obtain process-level understanding, we formulated a Bayesian optimal-observer model. This model accounted for all summary statistics, including when fitted jointly to localization and detection. We replicated these results in a separate experiment with reduced stimulus spacing. Together, our results represent a critique of Duncan and Humphreys's descriptive approach, bring visual search with heterogeneous distractors firmly within the reach of quantitative process models, and affirm the “unreasonable effectiveness” of Bayesian models in explaining visual search.
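The within-trial summary statistics this abstract compares (minimum vs mean target-to-distractor feature difference, and distractor variance) can be sketched as follows. The feature values are hypothetical, and for simplicity a linear feature space is assumed (the circular wrapping appropriate for orientation stimuli is omitted):

```python
import statistics

def trial_summary_stats(target, distractors):
    """Summary statistics for one search trial: minimum and mean
    absolute target-to-distractor feature difference, and the
    (population) variance of the distractor features."""
    diffs = [abs(target - d) for d in distractors]
    return {
        "min_diff": min(diffs),
        "mean_diff": statistics.mean(diffs),
        "distractor_var": statistics.pvariance(distractors),
    }

# Hypothetical trial: target feature 30, four heterogeneous distractors.
stats = trial_summary_stats(target=30, distractors=[10, 25, 70, 45])
print(stats["min_diff"])   # 5 — the single closest distractor
print(stats["mean_diff"])  # 20 — the average separation
```

The finding reported above is that a statistic like `min_diff` predicts behavior better than `mean_diff` or `distractor_var`: a single distractor very similar to the target drives errors even when the distractor set is, on average, quite different from it.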



2008 ◽  
Vol 73 (3) ◽  
pp. 308-316 ◽  
Author(s):  
Christopher Donkin ◽  
Scott D. Brown ◽  
Andrew Heathcote ◽  
A. A. J. Marley

