An adaptive procedure for categorical loudness scaling

2002 ◽  
Vol 112 (4) ◽  
pp. 1597-1604 ◽  
Author(s):  
Thomas Brand ◽  
Volker Hohmann

2015 ◽  
Vol 104 (14) ◽  
pp. 1-8 ◽  
Author(s):  
Francesca Barbagallo ◽  
Melina Bosco ◽  
Aurelio Ghersi ◽  
Edoardo M. Marino

2017 ◽  
Vol 142 (6) ◽  
pp. 3660-3669 ◽  
Author(s):  
Marcin Wróblewski ◽  
Daniel M. Rasetshwane ◽  
Stephen T. Neely ◽  
Walt Jesteadt

1974 ◽  
Vol 56 (S1) ◽  
pp. S16-S16 ◽  
Author(s):  
J. B. Allen ◽  
D. A. Berkley ◽  
T. H. Curtis

Author(s):  
Michael Gineste ◽  
Jo Eidsvik

An ensemble-based method for seismic inversion to estimate elastic attributes is considered, namely the iterative ensemble Kalman smoother. The main focus of this work is the challenge associated with ensemble-based inversion of seismic waveform data. The amount of seismic data is large and, depending on ensemble size, it cannot be processed in a single batch. Instead, a strategy of partitioning the data recordings into time windows and processing these sequentially is suggested. This work demonstrates how the partitioning can be done adaptively, with a focus on reliable and efficient estimation. The adaptivity relies on an analysis of the update direction used in the iterative procedure, and on an interpretation of the contributions from prior and likelihood to this update. The idea is that these must balance: if the prior dominates, the estimation process is inefficient, while if the data dominate, the estimation is likely to overfit and diverge. Two approaches to achieving this balance are formulated and evaluated. One is based on an interpretation of eigenvalue distributions and how these enter and affect the weighting of prior and likelihood contributions. The other is based on balancing the norm magnitudes of the prior and likelihood vector components in the update. Only the latter is found to sufficiently regularize the data window. Although no guarantees for avoiding ensemble divergence are provided in the paper, the results of the adaptive procedure indicate that robust estimation performance can be achieved for ensemble-based inversion of seismic waveform data.
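The abstract does not give the norm-balancing rule in detail, so the following is only a hypothetical sketch of the idea it describes: grow the data window sample by sample until the accumulated likelihood component of the update would exceed the prior component in norm. The names `choose_window`, `prior_term`, and `lik_terms` are assumptions for illustration, not from the paper.

```python
import numpy as np

def choose_window(prior_term, lik_terms, ratio_max=1.0):
    """Sketch of a norm-balanced data window.

    prior_term : prior component of the smoother's update direction.
    lik_terms  : per-sample likelihood components of the candidate window.
    Admits samples until the accumulated likelihood norm would exceed
    ratio_max times the prior norm, and returns the window length.
    """
    prior_norm = np.linalg.norm(prior_term)
    acc = np.zeros_like(prior_term)
    n = 0
    for term in lik_terms:
        # Stop before the likelihood contribution dominates the prior.
        if n > 0 and np.linalg.norm(acc + term) > ratio_max * prior_norm:
            break
        acc += term
        n += 1
    return n
```

With a prior term of norm 2 and unit-norm, aligned likelihood terms, this rule admits two samples before the likelihood norm would overtake the prior.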


2005 ◽  
Vol 23 (2) ◽  
pp. 165-188 ◽  
Author(s):  
Bruno H. Repp

THE RELATIVE DIFFICULTY of on-beat and off-beat finger tapping with simple auditory rhythms was assessed in four experiments with musically trained participants. The rhythms consisted of cyclically repeated TT0 or TTT0 patterns, where T denotes the presence and 0 denotes the absence of a tone. The tasks were to tap in synchrony with one of the T ("on-beat") positions or with the 0 ("off-beat") position. Experiments 1-3 used an adaptive procedure that determined the fastest tempo at which each task could be accomplished. Experiment 1 demonstrated that it is easier to tap on tones that carry a rhythmic grouping accent (T2 in TT0, T1 and T3 in TTT0) than on other tones or in the 0 position. Off-beat tapping was more difficult in TT0 than in TTT0 sequences. Experiment 2 showed that a dynamic (pitch) accent on one of the tones facilitates synchronization with that tone and impedes synchronization with adjacent tones. Off-beat tapping was less affected by accent location. Experiment 3 required participants to "hear" different T positions as metrically accented (i.e., to construe them as the downbeat) while carrying out the various tapping tasks. Most participants found it difficult to maintain a cognitive downbeat at fast tempi when it did not coincide with their taps. However, when such a downbeat could be maintained, it did not seem to increase the difficulty of tapping (with one exception). This suggests a unidirectional dependence of metrical structure on action. In Experiment 4, the same tasks were presented at more moderate tempi, and the dependent measure was the variability of asynchronies. Metrical downbeat location still did not have any significant effect. Thus, synchronization difficulty seems to be affected only by a rhythm's physical structure, not by the cognitive interpretation that is given to that structure.
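The abstract does not specify the adaptive rule used to find the fastest achievable tempo. As a purely illustrative sketch, a generic transformed staircase on the inter-onset interval (IOI) could look like the following; the function name, step sizes, and stopping rule are all assumptions, not the paper's procedure.

```python
def staircase_fastest_tempo(can_synchronize, start_ioi=500.0, step=20.0,
                            min_step=5.0, reversals_needed=6):
    """Hypothetical staircase on inter-onset interval (ms).

    can_synchronize : callable taking an IOI and returning True if the
    participant managed the tapping task at that tempo.
    Shortens the IOI after a success, lengthens it after a failure,
    halves the step at each direction reversal, and returns the mean
    IOI at the reversals as the fastest-tempo estimate.
    """
    ioi = start_ioi
    last_dir = 0            # -1 = speeding up, +1 = slowing down
    reversal_iois = []
    while len(reversal_iois) < reversals_needed:
        direction = -1 if can_synchronize(ioi) else +1
        if last_dir and direction != last_dir:
            reversal_iois.append(ioi)       # record the turnaround point
            step = max(step / 2.0, min_step)
        ioi += direction * step
        last_dir = direction
    return sum(reversal_iois) / len(reversal_iois)
```

Run against a simulated participant who can synchronize only above a 300 ms IOI, the estimate converges near that limit.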


2000 ◽  
Vol 25 (1) ◽  
pp. 60-83 ◽  
Author(s):  
Yoav Benjamini ◽  
Yosef Hochberg

A new approach to problems of multiple significance testing was presented in Benjamini and Hochberg (1995), which calls for controlling the expected ratio of the number of erroneous rejections to the number of rejections, the False Discovery Rate (FDR). The procedure given there was shown to control the FDR for independent test statistics. When some of the hypotheses are in fact false, that procedure is too conservative. We present here an adaptive procedure, where the number of true null hypotheses is estimated first as in Hochberg and Benjamini (1990), and this estimate is used in the procedure of Benjamini and Hochberg (1995). The result is still a simple stepwise procedure, to which we also give a graphical companion. The new procedure is used in several examples drawn from educational and behavioral studies, addressing problems in multi-center studies, subset analysis, and meta-analysis. The examples vary in the number of hypotheses tested and in the implications of the new procedure for the conclusions. In a large simulation study of independent test statistics, the adaptive procedure is shown to control the FDR and to have substantially better power than the previously suggested FDR-controlling method, which by itself is more powerful than the traditional familywise error-rate controlling methods. In cases where most of the tested hypotheses are far from being true, there is hardly any penalty due to the simultaneous testing of many hypotheses.
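The two-stage logic described above (estimate the number of true nulls, then run the Benjamini-Hochberg step-up procedure with that estimate in place of the total count) can be sketched as follows. This is a sketch only: `lsl_m0` follows one common statement of the lowest-slope estimator attributed to Hochberg and Benjamini (1990), and the function names are ours, not the paper's.

```python
import numpy as np

def lsl_m0(p):
    """Lowest-slope estimate of the number of true null hypotheses.

    Computes slopes S_i = (1 - p_(i)) / (m + 1 - i) over the sorted
    p-values and stops at the first decrease; a sketch of one common
    formulation of the estimator.
    """
    p = np.sort(np.asarray(p))
    m = len(p)
    slopes = (1.0 - p) / (m - np.arange(m))   # denominators m, m-1, ..., 1
    for i in range(1, m):
        if slopes[i] < slopes[i - 1]:
            return min(int(np.floor(1.0 / slopes[i])) + 1, m)
    return m

def adaptive_bh(p, q=0.05):
    """BH step-up procedure run with the estimated m0 in place of m."""
    p = np.asarray(p)
    m = len(p)
    m0 = lsl_m0(p)
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m0     # adaptive BH thresholds
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True                  # reject the k smallest p-values
    return reject
```

On a toy set with three clearly small p-values among eight, the procedure rejects exactly those three.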

