On the Convergence of the Benjamini–Hochberg Procedure

Mathematics, 2021, Vol 9 (17), pp. 2154
Author(s): Dean Palejev, Mladen Savov

The Benjamini–Hochberg procedure is one of the most widely used statistical methods to date. It is a standard tool in genetics and other areas where the problem of multiple comparisons arises frequently. In this paper we show that, under fairly general assumptions on the distribution of the test statistic under the alternative hypothesis, the power of the Benjamini–Hochberg procedure converges at an exponential rate to a previously established limit as the number of tests increases. We give a theoretical lower bound on the probability that, for a fixed number of tests, the power lies within a given interval around its limit, together with a software routine that calculates these values. This result is important when planning costly experiments and when estimating the achieved power after performing them.
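
The procedure itself is compact. Below is a minimal sketch of the standard BH step-up rule together with the kind of power simulation the convergence result concerns; the Gaussian alternative N(mu, 1), the fraction of true alternatives, and all parameter values are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.stats import norm

def benjamini_hochberg(pvals, alpha=0.05):
    """Standard BH step-up procedure: boolean mask of rejected hypotheses."""
    pvals = np.asarray(pvals)
    m = pvals.size
    order = np.argsort(pvals)                     # p-values in ascending order
    thresholds = alpha * np.arange(1, m + 1) / m  # i/m * alpha, i = 1..m
    below = pvals[order] <= thresholds
    k = below.nonzero()[0].max() + 1 if below.any() else 0  # largest passing rank
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True                    # reject the k smallest p-values
    return rejected

# Toy power estimate: a fraction pi1 of the m tests are true alternatives
# drawn as N(mu, 1); power = fraction of true alternatives rejected.
rng = np.random.default_rng(0)
m, pi1, mu = 100_000, 0.2, 3.0
is_alt = rng.random(m) < pi1
p = norm.sf(rng.standard_normal(m) + mu * is_alt)  # one-sided p-values
power = benjamini_hochberg(p)[is_alt].mean()
print(f"estimated power for m={m}: {power:.3f}")
```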

Symmetry, 2021, Vol 13 (6), pp. 936
Author(s): Dan Wang

In this paper, a ratio test based on bootstrap approximation is proposed to detect a change in persistence in heavy-tailed observations. The focus is on the two symmetric testing problems of a change from I(1) to I(0) and from I(0) to I(1). The test statistic is constructed in ratio form from residual CUSUMs. I derive the null distribution of the test statistic and establish its consistency under the alternative hypothesis. However, the null distribution contains an unknown tail index, so I present a bootstrap approximation method for determining the rejection region of the test. Simulation studies on artificial data assess the finite-sample performance and show that the proposed method outperforms the kernel method in all listed cases. An analysis of real data also demonstrates the strong performance of the method.
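
For orientation, here is a stylized version of a CUSUM-based ratio statistic and a naive bootstrap of its rejection region; the paper's exact residual construction and its heavy-tail-aware resampling scheme may differ from this sketch.

```python
import numpy as np

def ratio_statistic(x, trim=0.2):
    """Sup over candidate break points of (backward / forward) rescaled
    squared CUSUMs of demeaned observations. A common form of the
    persistence-change ratio statistic; the paper's residual-based
    version may differ in detail."""
    x = np.asarray(x, float)
    T = len(x)
    stats = []
    for k in range(int(trim * T), int((1 - trim) * T) + 1):
        e1 = x[:k] - x[:k].mean()   # residuals before the candidate break
        e2 = x[k:] - x[k:].mean()   # residuals after it
        num = np.sum(np.cumsum(e2) ** 2) / (T - k) ** 2
        den = np.sum(np.cumsum(e1) ** 2) / k ** 2
        stats.append(num / den)
    return max(stats)

def bootstrap_pvalue(x, B=199, trim=0.2, seed=0):
    """Naive i.i.d. residual bootstrap under the I(0) null; purely
    illustrative, since with heavy tails the resampling scheme must
    cope with the unknown tail index."""
    rng = np.random.default_rng(seed)
    s0 = ratio_statistic(x, trim)
    resid = x - np.mean(x)
    count = sum(ratio_statistic(rng.choice(resid, size=len(x)), trim) >= s0
                for _ in range(B))
    return (1 + count) / (1 + B)
```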


1992, Vol 02 (04), pp. 363-372
Author(s): E. BAMPIS, J.-C. KÖNIG, D. TRYSTRAM

We propose a new scheduling algorithm for a three-dimensional grid precedence graph of size n, and we prove that its communication overhead is in Θ(log log n). We also show that the overhead of any schedule with a fixed number of processors (p ≥ 3) is bounded below by Ω(log log n).


2011, Vol 22 (02), pp. 395-409
Author(s): HOLGER PETERSEN

We investigate the efficiency of simulating storage devices by several counters. We describe a simulation of a pushdown store that is optimal in the sense that reducing the number of counters of a simulator leads to an increase in time complexity. The lower bound also establishes a tight counter hierarchy in exponential time. We then turn to simulations of a set of counters by a different number of counters, improving and generalizing a known polynomial-time simulation. Greibach has shown that adding s + 1 counters increases the power of machines working in time n^s. Using a new family of languages, we show here a tight hierarchy result for machines with the same polynomial time bound. We also prove hierarchies for machines with a fixed number of counters and growing polynomial time bounds. For machines with one counter and an additional "store zero" instruction, we establish the equivalence of real-time and linear time. If at least two counters are available, the classes of languages accepted in real-time and in linear time can be separated.


1997, Vol 1 (1), pp. 13-25
Author(s): J. C. W. Rayner, D. J. Best

The data for the tests considered here may be presented in two-way contingency tables with all marginal totals fixed. We show that Pearson's test statistic X_P^2 (P for Pearson) may be partitioned into useful and informative components. The first detects location differences between the treatments, and the subsequent components detect dispersion and higher-order moment differences. For Kruskal-Wallis-type data with no ties, the location component is the Kruskal-Wallis test, and the subsequent components extend it. Our approach enables us to generalise to the case of ties, and to the case of a fixed number of categories and a large number of observations. We also propose a generalisation of the well-known median test, in which the location-detecting first component of X_P^2 reduces to the usual median test statistic when there are only two categories. Subsequent components detect higher-moment departures from the null hypothesis of equal treatment effects.
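
As a concrete anchor for the location component, the following sketch computes the tie-corrected Kruskal-Wallis statistic directly from a treatments-by-categories count table using midrank scores. The table values are invented for illustration, and the higher-order components of the partition (not shown) would use higher-degree orthogonal score functions on the same ranks.

```python
import numpy as np

def kw_from_table(counts):
    """Kruskal-Wallis statistic with midrank scores, computed straight from
    a treatments-by-categories count table (ties handled via midranks)."""
    counts = np.asarray(counts, float)
    N = counts.sum()
    col = counts.sum(axis=0)              # observations per category
    mid = np.cumsum(col) - (col - 1) / 2  # midrank score of each category
    rbar = (N + 1) / 2                    # grand mean rank
    ni = counts.sum(axis=1)               # observations per treatment
    row_mean = counts @ mid / ni          # mean rank within each treatment
    num = (ni * (row_mean - rbar) ** 2).sum()
    den = (counts * (mid - rbar) ** 2).sum()
    return (N - 1) * num / den

table = [[2, 4, 3, 1],   # counts for 3 treatments over 4 ordered categories
         [5, 2, 2, 1],
         [1, 3, 4, 2]]
print(f"location component (Kruskal-Wallis H) = {kw_from_table(table):.3f}")
```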


NeuroImage, 2000, Vol 12 (2), pp. 219-229
Author(s): Federico Turkheimer, Karen Pettigrew, Louis Sokoloff, Carolyn Beebe Smith, Kathleen Schmidt

1994, Vol 10 (1), pp. 70-90
Author(s): R.M. de Jong, H.J. Bierens

In this paper, a consistent model specification test is proposed. Several consistent model specification tests have been discussed in the econometrics literature. Those tests are consistent by randomization, display a discontinuity in the sample size, or have an asymptotic distribution that depends on the data-generating process and on the model, whereas our test suffers from none of these disadvantages. Our test can be viewed as a conditional moment test, as proposed by Newey, but instead of a fixed number of conditional moments, an asymptotically infinite number of moment conditions is employed. The use of an asymptotically infinite number of conditional moments makes it possible to obtain a consistent test. Computation of the test statistic is particularly simple, since in finite samples our statistic is equivalent to a chi-square conditional moment test of a finite number of conditional moments.
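
A fixed-moment version of the idea is easy to write down. The sketch below implements a Newey-style conditional moment test with a handful of trigonometric weight functions; all names and parameter choices are illustrative, the moment covariance ignores the parameter-estimation effect for brevity, and the paper's consistency argument rests on letting the number of moments grow with the sample size rather than fixing it as here.

```python
import numpy as np
from scipy.stats import chi2

def cm_test(x, y, beta_hat, weights):
    """Conditional moment test in the style of Newey: if E[e|x] = 0 under
    the model, sample means of e * w_j(x) should be near zero for every
    weight w_j. Simplified: the covariance below ignores the effect of
    estimating beta_hat."""
    e = y - x * beta_hat                       # residuals of the fitted model
    M = np.column_stack([w(x) * e for w in weights])
    mbar = M.mean(axis=0)
    V = np.cov(M, rowvar=False)                # covariance of the moments
    stat = len(y) * mbar @ np.linalg.solve(V, mbar)
    return stat, chi2.sf(stat, df=len(weights))

rng = np.random.default_rng(1)
x = rng.standard_normal(500)
y = x + 0.5 * x ** 2 + rng.standard_normal(500)  # truth is quadratic
beta_hat = (x @ y) / (x @ x)                     # OLS fit of a linear model
weights = [np.cos, np.sin, lambda t: np.cos(2 * t), lambda t: np.sin(2 * t)]
stat, p = cm_test(x, y, beta_hat, weights)
print(f"CM statistic = {stat:.2f}, p-value = {p:.4f}")
```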


Author(s): Adesola Adebayo AKANDE, Johnson Kolawole OLOWOOKERE

This paper examined forensic accounting capability and useful strategies to minimize the effects of financial crimes and other related frauds in the Nigerian economy. Various points at which preventive controls can be established were identified. Data were collected from two hundred and fifty-five (255) respondents in the south-west geopolitical zone of Nigeria, and responses on the capability and suitability of using forensic accounting principles and measures to prevent undue practices that often arise via financial crimes were analyzed with the analysis of variance method. The results showed an estimated Z-statistic of 115.3736, against a critical value of 1.645 at the 5% level of significance (95% confidence level). Since the estimated test statistic exceeds the critical value at the 95% confidence level, the null hypothesis is rejected in favour of the alternative hypothesis. The study therefore concludes that forensic accounting principles are capable of eliminating all types of financial fraud in any economy. It was also found empirically that forensic accounting can be used to locate diverted funds or assets, identify misappropriated assets and identify reversible insider transactions; forensic accounting is thus useful as an effective fraud-detection tool that flags suspicious fraudulent transactions and supports risk-assessment processes. Based on the above, the study recommends the adoption of forensic accounting principles, effective appropriation of financial resources and the employment of external auditors.
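
The decision rule applied in the abstract is the usual one-sided comparison of a test statistic with its critical value; restated in a few lines using the reported numbers:

```python
from scipy.stats import norm

z_stat = 115.3736            # estimated Z-statistic reported in the study
z_crit = norm.ppf(0.95)      # one-sided 5% critical value, about 1.645
print(f"critical value = {z_crit:.3f}, reject H0: {z_stat > z_crit}")
```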


2009, Vol 25 (1), pp. 63-116
Author(s): Alex Maynard, Katsumi Shimotsu

This paper develops a new test of orthogonality based on a zero restriction on the covariance between the dependent variable and the predictor. The test provides a useful alternative to regression-based tests when conditioning variables have roots close to or equal to unity. In this case, standard predictive regression tests can suffer from well-documented size distortion. Moreover, under the alternative hypothesis, they force the dependent variable to share the same order of integration as the predictor, whereas in practice the dependent variable often appears stationary and the predictor may be near-nonstationary. By contrast, the new test does not enforce the same orders of integration and is therefore capable of detecting a rich set of alternatives to orthogonality that are excluded by the standard predictive regression model. Furthermore, the test statistic has a standard normal limit distribution for both unit-root and local-to-unity conditioning variables, without prior knowledge of the local-to-unity parameter. If the conditioning variable is stationary, the test remains conservative and consistent. Simulations suggest good small-sample performance. As an empirical application, we test for the predictability of stock returns using two persistent predictors, the dividend-price ratio and the short-term interest rate.
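
A stylized rendering of the covariance-based idea follows; the simple studentization and the simulated local-to-unity example are illustrative assumptions, as the paper's statistic uses a more careful rescaling to obtain its standard normal limit.

```python
import numpy as np

def orthogonality_stat(y, x):
    """Studentized sample covariance between y_t and the lagged predictor
    x_{t-1}. Sketch only: the paper's rescaling, which delivers a N(0,1)
    limit even for unit-root and local-to-unity predictors, is replaced
    here by a naive studentization."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    u = (y[1:] - y[1:].mean()) * (x[:-1] - x[:-1].mean())  # covariance terms
    return np.sqrt(len(u)) * u.mean() / u.std(ddof=1)      # compare to N(0,1)

# Example: returns y_t against a persistent (near unit-root) predictor x_t.
rng = np.random.default_rng(3)
T = 500
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.98 * x[t - 1] + rng.standard_normal()  # local-to-unity AR(1)
y = np.empty(T)
y[0] = rng.standard_normal()
y[1:] = 0.05 * x[:-1] + rng.standard_normal(T - 1)  # weak predictability
print(f"test statistic = {orthogonality_stat(y, x):.2f}")
```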


2011, Vol 480-481, pp. 775-780
Author(s): Ting Jun Li

Robust detection in the presence of a partly unknown useful signal or interference is a widespread task in many signal processing applications. In this paper, we consider the robustness of a matched subspace detector in additive white Gaussian noise, under the condition that the noise power is known under the null hypothesis but unknown under the alternative hypothesis, where the useful signal triggers a variation in the noise power; we also consider the mismatch between the signal subspace and the receiver matched filter. The test statistic for this detection problem is derived from the generalized likelihood ratio test, and its distribution is analysed. Computer simulations validate the performance analysis and demonstrate the robustness of the algorithm at low SNR in comparison with other matched subspace detectors.
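
The core quantity behind any matched subspace detector is the energy of the observation in the signal subspace. A simplified sketch under known unit noise power follows; the paper's GLRT additionally estimates the changed noise power under the alternative and models the subspace mismatch, and all data in the example are synthetic.

```python
import numpy as np

def subspace_energy(y, H):
    """Energy of the observation y in the signal subspace spanned by the
    columns of H. Simplified sketch with unit noise power under H0; under
    the null this statistic is chi-square with rank(H) degrees of freedom."""
    Q, _ = np.linalg.qr(H)        # orthonormal basis of the signal subspace
    c = Q.T @ y                   # coordinates of y in that subspace
    return float(c @ c)           # compare against a chi-square threshold

rng = np.random.default_rng(2)
n, p = 64, 3
H = rng.standard_normal((n, p))             # assumed signal subspace
theta = np.array([1.0, -0.5, 2.0])          # illustrative signal coordinates
noise_only = rng.standard_normal(n)
with_signal = H @ theta + rng.standard_normal(n)
print(subspace_energy(noise_only, H), subspace_energy(with_signal, H))
```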

