Power and Type I errors for pairwise comparisons of means in the unequal variances case

2009 ◽  
Vol 62 (2) ◽  
pp. 263-281 ◽  
Author(s):  
Philip H. Ramsey ◽  
Patricia P. Ramsey

Methodology ◽
2015 ◽  
Vol 11 (3) ◽  
pp. 110-115 ◽  
Author(s):  
Rand R. Wilcox ◽  
Jinxia Ma

Abstract. The paper compares methods that allow both within-group and between-group heteroscedasticity when performing all pairwise comparisons of the least squares lines associated with J independent groups. The methods are based on a simple extension of results derived by Johansen (1980) and Welch (1938) in conjunction with the HC3 and HC4 estimators. The probability of one or more Type I errors is controlled using the improvement on the Bonferroni method derived by Hochberg (1988). Results are illustrated using data from the Well Elderly 2 study, which motivated this paper.
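Hochberg's improvement on the Bonferroni method mentioned above is a step-up procedure: sort the m p-values in ascending order and reject all hypotheses up to the largest rank k whose p-value satisfies p_(k) ≤ α/(m − k + 1). A minimal sketch (function name and return convention are ours, not from the paper):

```python
def hochberg(pvals, alpha=0.05):
    """Hochberg (1988) step-up procedure for familywise error control.

    Rejects H_(i) for all i <= k, where k is the largest rank
    (1-based, p-values sorted ascending) with p_(k) <= alpha/(m-k+1).
    Returns a list of booleans aligned with the input order.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices, ascending p
    reject = [False] * m
    k = 0
    for rank in range(m, 0, -1):  # step up from the largest p-value
        idx = order[rank - 1]
        if pvals[idx] <= alpha / (m - rank + 1):
            k = rank
            break
    for rank in range(k):  # reject everything at or below rank k
        reject[order[rank]] = True
    return reject
```

Note that Hochberg is uniformly at least as powerful as Bonferroni: the smallest p-value is compared against the same α/m threshold, but larger p-values face less stringent cutoffs.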


2020 ◽  
Vol 18 (2) ◽  
pp. 2-9
Author(s):  
Rand Wilcox

Let p1,…, pJ denote the probability of a success for J independent random variables having a binomial distribution and let p(1) ≤ … ≤ p(J) denote these probabilities written in ascending order. The goal is to make a decision about which group has the largest probability of a success, p(J). Let p̂1,…, p̂J denote estimates of p1,…, pJ, respectively. The strategy is to test J − 1 hypotheses comparing the group with the largest estimate to each of the J − 1 remaining groups. For each of these J − 1 hypotheses that is rejected, decide that the group corresponding to the largest estimate has the larger probability of success. This approach has a power advantage over simply performing all pairwise comparisons. However, the more obvious methods for controlling the probability of one or more Type I errors perform poorly for the situation at hand. A method for dealing with this is described and illustrated.
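The selection strategy itself (compare the group with the largest estimate to each of the others) can be sketched naively as below. The per-comparison test here is an ordinary two-proportion z-test; as the abstract notes, the subtle part is controlling the familywise Type I error rate after selecting the largest estimate, which this sketch does not address:

```python
from math import sqrt, erf

def two_prop_pvalue(x1, n1, x2, n2):
    """One-sided two-proportion z-test (normal approximation),
    testing H0: p1 <= p2 against the alternative p1 > p2."""
    p1, p2 = x1 / n1, x2 / n2
    pool = (x1 + x2) / (n1 + n2)                      # pooled estimate under H0
    se = sqrt(pool * (1 - pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return 1 - 0.5 * (1 + erf(z / sqrt(2)))           # upper-tail p-value

def largest_vs_rest(successes, trials):
    """Identify the group with the largest estimated success probability
    and return its index plus p-values for the J - 1 comparisons."""
    phat = [x / n for x, n in zip(successes, trials)]
    lead = max(range(len(phat)), key=lambda j: phat[j])
    pvals = {j: two_prop_pvalue(successes[lead], trials[lead],
                                successes[j], trials[j])
             for j in range(len(phat)) if j != lead}
    return lead, pvals
```

Because the lead group is chosen after looking at the data, these J − 1 p-values are dependent in a way that defeats the obvious multiplicity corrections; that dependence is exactly what the paper's method is designed to handle.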


1979 ◽  
Vol 86 (4) ◽  
pp. 884-888 ◽  
Author(s):  
Harvey J. Keselman ◽  
Paul A. Games ◽  
Joanne C. Rogan

1987 ◽  
Vol 12 (3) ◽  
pp. 271-281 ◽  
Author(s):  
Rand R. Wilcox

When testing the equality of the means of J independent normal distributions, two-stage procedures have the advantage of providing exact control over both Type I errors and power even when the variances are unequal. Single-stage procedures have been proposed for handling unequal variances, but recent results (which are briefly reviewed in this paper) have shown that in certain practical situations these approximate solutions give unsatisfactory control over both Type I errors and power. A practical problem with two-stage procedures is the assumption that an equal number of observations is sampled from each treatment group in the first stage. Of course, for various reasons, a researcher might conduct a study resulting in unequal sample sizes. This paper describes a simple yet accurate method for dealing with this problem.
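The single-stage approximate solutions referred to above are Welch-type procedures that replace a pooled variance with separate per-group variances and an adjusted degrees of freedom. For the two-group case, a minimal sketch of Welch's approximate t (statistic and Satterthwaite degrees of freedom; the abstract's two-stage procedure is a different, exact method not shown here):

```python
from math import sqrt
from statistics import mean, variance

def welch_t(x, y):
    """Welch's approximate t statistic and Satterthwaite df for two
    independent samples with possibly unequal variances."""
    nx, ny = len(x), len(y)
    vx, vy = variance(x), variance(y)      # sample variances
    se2 = vx / nx + vy / ny                # unpooled squared standard error
    t = (mean(x) - mean(y)) / sqrt(se2)
    # Satterthwaite approximation to the degrees of freedom
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df
```

The df shrinks toward the smaller group's contribution when variances are very unequal, which is what gives Welch-type tests their approximate (rather than exact) Type I error control.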


2020 ◽  
Vol 39 (3) ◽  
pp. 185-208
Author(s):  
Qiao Xu ◽  
Rachana Kalelkar

SUMMARY This paper examines whether inaccurate going-concern opinions negatively affect the audit office's reputation. Assuming that clients perceive the incidence of going-concern opinion errors as a systematic audit quality concern within the entire audit office, we expect these inaccuracies to impact the audit office market share and dismissal rate. We find that going-concern opinion inaccuracy is negatively associated with the audit office market share and is positively associated with the audit office dismissal rate. Furthermore, we find that the decline in market share and the increase in dismissal rate are primarily associated with Type I errors. Additional analyses reveal that the negative consequence of going-concern opinion inaccuracy is lower for Big 4 audit offices. Finally, we find that the decrease in the audit office market share is explained by the distressed clients' reactions to Type I errors and audit offices' lack of ability to attract new clients.


2011 ◽  
Vol 81 (2) ◽  
pp. 125-135 ◽  
Author(s):  
Philip H. Ramsey ◽  
Kyrstle Barrera ◽  
Pri Hachimine-Semprebom ◽  
Chang-Chia Liu

2018 ◽  
Vol 7 (10) ◽  
pp. 409 ◽  
Author(s):  
Youqiang Dong ◽  
Ximin Cui ◽  
Li Zhang ◽  
Haibin Ai

The progressive TIN (triangular irregular network) densification (PTD) filter algorithm is widely used for filtering point clouds. In the PTD algorithm, the iterative densification parameters become smaller over the entire filtering process. As a result, the performance of the PTD algorithm, especially its type I errors, is poor for point clouds with high density and high standard variance. Hence, an improved PTD filtering algorithm for such point clouds is proposed in this paper. The improved method divides the iterative densification process into two stages. In the first stage, the iterative densification process of the classical PTD algorithm is used, and the two densification parameters become smaller. When the density of points belonging to the TIN exceeds a certain value (defined in this paper as the standard variance intervention density), the iterative densification process moves into the second stage. In the second stage, a new multi-scale iterative densification strategy is applied, and the angle threshold becomes larger. The experimental results show that the improved PTD algorithm reduces the type I errors and total errors of the DIM point clouds by 7.53% and 4.09%, respectively, compared with the classical PTD algorithm. Although the type II errors increase slightly under the improved method, the wrongly added object points have little effect on the accuracy of the generated DSM. In short, the improved PTD method refines the classical PTD method and offers a better solution for filtering point clouds with high density and standard variance.
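The two-stage parameter schedule described above can be illustrated with a toy sketch. The function name, the 0.9 shrink factor, and the scale factors are all illustrative assumptions, and the density-crossing condition is represented by a fixed switch iteration rather than an actual TIN density computation:

```python
def densification_schedule(angle0, dist0, n_iters, switch_iter,
                           scales=(1.5, 2.0, 3.0)):
    """Toy sketch of the two-stage densification parameter schedule.

    Stage 1 (it < switch_iter): both thresholds shrink each iteration,
    as in the classical PTD algorithm.
    Stage 2 (once the TIN density crosses the intervention density,
    stood in for here by `switch_iter`): the angle threshold grows
    across multiple scales instead of shrinking further.
    Returns the (angle, distance) threshold pair per iteration.
    """
    schedule = []
    angle, dist = angle0, dist0
    for it in range(n_iters):
        if it < switch_iter:          # stage 1: classical PTD shrinkage
            angle *= 0.9
            dist *= 0.9
        else:                         # stage 2: multi-scale angle growth
            scale = scales[min(it - switch_iter, len(scales) - 1)]
            angle = angle0 * scale
        schedule.append((round(angle, 4), round(dist, 4)))
    return schedule
```

The point of the sketch is only the shape of the schedule: monotone shrinkage in stage 1, then a growing angle threshold in stage 2 so that densely sampled, high-variance ground points are not rejected as objects.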


2019 ◽  
Vol 8 (4) ◽  
pp. 1849-1853

Nowadays many people seek bank loans for their needs, but banks cannot extend loans to everyone, so they use various measures to identify eligible customers. Sensitivity and specificity are widely used to measure the performance of categorical classifiers in medicine, and to a lesser extent in econometrics. Even after applying such measures, lending to customers who cannot repay, or declining customers who could repay, produces Type I and Type II errors. To minimize these errors, this study explains, first, how to judge whether sensitivity is large or small and, second, how to set benchmarks for the forecasting model using fuzzy analysis based on fuzzy weights, which is then compared with the sensitivity analysis.
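Sensitivity and specificity as used above are simple functions of the confusion-matrix counts. A minimal sketch (the mapping of lending outcomes to error types in the comment is our illustrative reading, and depends on which outcome is coded as "positive"):

```python
def sensitivity_specificity(tp, fn, fp, tn):
    """Sensitivity (true positive rate) and specificity (true negative
    rate) from confusion-matrix counts.

    In the lending setting, if 'positive' means 'will repay', a false
    negative (rejecting a creditworthy applicant) plays the role of a
    Type I error and a false positive (approving an applicant who
    later defaults) the role of a Type II error, or vice versa under
    the opposite coding.
    """
    sens = tp / (tp + fn)   # fraction of actual positives detected
    spec = tn / (tn + fp)   # fraction of actual negatives detected
    return sens, spec
```

Minimizing one error type typically inflates the other at a fixed decision threshold, which is why the study looks for benchmarks rather than a single cutoff.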

