Multiple Approaches to Analyzing Count Data in Studies of Individual Differences: The Propensity for Type I Errors, Illustrated with the Case of Absenteeism Prediction

1999, Vol 59 (3), pp. 414-430
Author(s): Michael C. Sturman

Methodology, 2015, Vol 11 (3), pp. 110-115
Author(s): Rand R. Wilcox, Jinxia Ma

Abstract. The paper compares methods that allow both within-group and between-group heteroscedasticity when performing all pairwise comparisons of the least squares lines associated with J independent groups. The methods are based on a simple extension of results derived by Johansen (1980) and Welch (1938) in conjunction with the HC3 and HC4 estimators. The probability of one or more Type I errors is controlled using the improvement on the Bonferroni method derived by Hochberg (1988). Results are illustrated using data from the Well Elderly 2 study, which motivated this paper.
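For readers unfamiliar with Hochberg's (1988) step-up procedure mentioned above, the following minimal sketch (Python, with hypothetical p-values rather than results from the Well Elderly 2 data) shows how the probability of one or more Type I errors is controlled across a set of pairwise comparisons.

```python
import numpy as np

def hochberg(pvalues, alpha=0.05):
    """Hochberg's (1988) step-up procedure.

    Returns a boolean array marking which hypotheses are rejected
    while controlling the familywise error rate at `alpha`.
    """
    p = np.asarray(pvalues, dtype=float)
    m = p.size
    order = np.argsort(p)          # indices of p-values, ascending
    sorted_p = p[order]
    reject = np.zeros(m, dtype=bool)

    # Find the largest k with p_(k) <= alpha / (m - k + 1);
    # reject the hypotheses with the k smallest p-values.
    for k in range(m, 0, -1):      # k = m, m-1, ..., 1
        if sorted_p[k - 1] <= alpha / (m - k + 1):
            reject[order[:k]] = True
            break
    return reject

# Hypothetical p-values from J*(J-1)/2 pairwise slope comparisons
pvals = [0.001, 0.012, 0.030, 0.045, 0.20, 0.51]
print(hochberg(pvals, alpha=0.05))
```

Because the procedure steps up from the largest p-value, it is at least as powerful as the plain Bonferroni correction while still controlling the probability of one or more Type I errors.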


2020, Vol 39 (3), pp. 185-208
Author(s): Qiao Xu, Rachana Kalelkar

SUMMARY: This paper examines whether inaccurate going-concern opinions negatively affect the audit office's reputation. Assuming that clients perceive the incidence of going-concern opinion errors as a systematic audit quality concern within the entire audit office, we expect these inaccuracies to impact the audit office market share and dismissal rate. We find that going-concern opinion inaccuracy is negatively associated with the audit office market share and is positively associated with the audit office dismissal rate. Furthermore, we find that the decline in market share and the increase in dismissal rate are primarily associated with Type I errors. Additional analyses reveal that the negative consequence of going-concern opinion inaccuracy is lower for Big 4 audit offices. Finally, we find that the decrease in the audit office market share is explained by the distressed clients' reactions to Type I errors and audit offices' lack of ability to attract new clients.


2018, Vol 7 (10), pp. 409
Author(s): Youqiang Dong, Ximin Cui, Li Zhang, Haibin Ai

The progressive TIN (triangular irregular network) densification (PTD) filter algorithm is widely used for filtering point clouds. In the PTD algorithm, the iterative densification parameters become smaller over the course of filtering, which makes its performance, especially its type I errors, poor for point clouds with high density and standard variance. Hence, an improved PTD filtering algorithm for point clouds with high density and standard variance is proposed in this paper. The improved method divides the iterative densification process into two stages. In the first stage, the iterative densification process of the standard PTD algorithm is used, and the two densification parameters become smaller. When the density of points belonging to the TIN exceeds a certain value (defined in this paper as the standard variance intervention density), the iterative densification process moves into the second stage. In the second stage, a new multi-scale iterative densification strategy is applied, and the angle threshold becomes larger. The experimental results show that the improved PTD algorithm reduces the type I errors and total errors of the DIM point clouds by 7.53% and 4.09%, respectively, compared with the standard PTD algorithm. Although the type II errors increase slightly in the improved method, the wrongly added object points have little effect on the accuracy of the generated DSM. In short, the improved method refines the classical PTD method and offers a better solution for filtering point clouds with high density and standard variance.
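To make the densification step concrete, the sketch below illustrates one common formulation of the per-point PTD test: a candidate point is added to the ground TIN when its perpendicular distance to the facet below it and the angles it forms with the facet's vertices stay below the current iteration's thresholds. The function and threshold values are illustrative assumptions, not code or parameters from this paper.

```python
import numpy as np

def facet_accepts(point, v0, v1, v2, max_angle_deg=6.0, max_dist=1.4):
    """PTD-style densification test of one candidate point against one
    TIN facet (triangle v0, v1, v2). Thresholds are illustrative only.

    The point is accepted as ground if (a) its perpendicular distance to
    the facet plane and (b) the largest angle between the facet plane and
    the lines joining the point to the three facet vertices are both
    below their thresholds. The candidate is assumed not to coincide
    with a facet vertex.
    """
    p = np.asarray(point, dtype=float)
    tri = [np.asarray(v, dtype=float) for v in (v0, v1, v2)]

    # Unit normal of the facet plane.
    n = np.cross(tri[1] - tri[0], tri[2] - tri[0])
    n /= np.linalg.norm(n)

    # Perpendicular distance of the candidate point to the plane.
    dist = abs(np.dot(p - tri[0], n))

    # Angle between the plane and each point-to-vertex line.
    angles = []
    for v in tri:
        d = p - v
        angles.append(np.degrees(np.arcsin(abs(np.dot(d, n)) / np.linalg.norm(d))))

    return dist <= max_dist and max(angles) <= max_angle_deg

# Hypothetical facet and candidate point (units: metres).
print(facet_accepts([1.0, 1.0, 0.1],
                    [0.0, 0.0, 0.0], [3.0, 0.0, 0.1], [0.0, 3.0, 0.0]))
```

In the two-stage scheme described above, the second stage would relax the angle threshold passed to such a test as the TIN densifies.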


2019, Vol 8 (4), pp. 1849-1853

Nowadays, people often seek bank loans to meet their needs, but banks cannot extend loans to everyone, so they use screening measures to identify eligible customers. Sensitivity and specificity are widely used to measure the performance of categorical classifiers in medicine and, to a lesser extent, in econometrics. Even with such measures, granting loans to customers who cannot repay them, or denying loans to customers who could, leads to type I and type II errors. To minimize these errors, this study explains, first, how to judge whether sensitivity is large or small and, second, how to benchmark the forecasting model using fuzzy analysis based on fuzzy weights, which is then compared with the sensitivity analysis.
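As a concrete illustration of these measures, the sketch below (Python, with hypothetical loan counts rather than data from the study) derives sensitivity, specificity, and the corresponding type I and type II error rates from a 2x2 confusion matrix. The mapping of errors to classes is an assumed convention, since the abstract does not fix one.

```python
# Hypothetical confusion matrix for a loan-approval classifier.
# Convention assumed here (not fixed by the abstract): the positive
# class is "customer repays the loan".
TP = 420   # predicted repay, actually repaid
FN = 80    # predicted default, actually repaid  -> type II errors
FP = 60    # predicted repay, actually defaulted -> type I errors
TN = 440   # predicted default, actually defaulted

sensitivity = TP / (TP + FN)      # true positive rate
specificity = TN / (TN + FP)      # true negative rate
type_i_rate = FP / (FP + TN)      # = 1 - specificity
type_ii_rate = FN / (FN + TP)     # = 1 - sensitivity

print(f"sensitivity = {sensitivity:.3f}, specificity = {specificity:.3f}")
print(f"type I error rate = {type_i_rate:.3f}, type II error rate = {type_ii_rate:.3f}")
```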


1998, Vol 21 (2), pp. 207-208
Author(s): Lester E. Krueger

Chow pays lip service (but not much more!) to Type I errors and thus opts for a hard (all-or-none) .05 level of significance (Superego of Neyman/Pearson theory; Gigerenzer 1993). Most working scientists disregard Type I errors and thus utilize a soft .05 level (Ego of Fisher; Gigerenzer 1993), which lets them report gradations of significance (e.g., p < .001).


2019, Vol 100 (10), pp. 1987-2007
Author(s): Thomas Knutson, Suzana J. Camargo, Johnny C. L. Chan, Kerry Emanuel, Chang-Hoi Ho, ...

Abstract. An assessment was made of whether detectable changes in tropical cyclone (TC) activity are identifiable in observations and whether any changes can be attributed to anthropogenic climate change. Overall, historical data suggest detectable TC activity changes in some regions associated with TC track changes, while data quality and quantity issues create greater challenges for analyses based on TC intensity and frequency. A number of specific published conclusions (case studies) about possible detectable anthropogenic influence on TCs were assessed using the conventional approach of preferentially avoiding type I errors (i.e., overstating anthropogenic influence or detection). We conclude there is at least low to medium confidence that the observed poleward migration of the latitude of maximum intensity in the western North Pacific is detectable, or highly unusual compared to expected natural variability. Opinion among the author team was divided on whether any observed TC changes demonstrate discernible anthropogenic influence, or whether any other observed changes represent detectable changes. The issue was then reframed by assessing evidence for detectable anthropogenic influence while seeking to reduce the chance of type II errors (i.e., missing or understating anthropogenic influence or detection). For this purpose, we used a much weaker “balance of evidence” criterion for assessment. This leads to a number of more speculative TC detection and/or attribution statements, which we recognize have substantial potential for being false alarms (i.e., overstating anthropogenic influence or detection) but which may be useful for risk assessment. Several examples of these alternative statements, derived using this approach, are presented in the report.


Author(s): Kathrin Möllenhoff, Florence Loingeville, Julie Bertrand, Thu Thuy Nguyen, Satish Sharan, ...

Summary. The classical approach to analyzing pharmacokinetic (PK) data in bioequivalence studies aiming to compare two different formulations is to perform noncompartmental analysis (NCA) followed by two one-sided tests (TOST). In this regard, the PK parameters area under the curve (AUC) and $C_{\max}$ are obtained for both treatment groups and their geometric mean ratios are considered. According to current guidelines by the U.S. Food and Drug Administration and the European Medicines Agency, the formulations are declared to be sufficiently similar if the $90\%$ confidence interval for these ratios falls between $0.8$ and $1.25$. As NCA is not a reliable approach in the case of sparse designs, a model-based alternative has already been proposed for the estimation of AUC and $C_{\max}$ using nonlinear mixed effects models. Here we propose another, more powerful test than the TOST and demonstrate its superiority through a simulation study for both NCA and model-based approaches. For products with high variability in PK parameters, this method appears to yield type I errors closer to the conventionally accepted significance level of $0.05$, suggesting its potential use in situations where conventional bioequivalence analysis is not applicable.
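For reference, the sketch below (Python, using simulated parallel-group data rather than the authors' model-based test) reproduces the classical decision rule described above: the $90\%$ confidence interval for the geometric mean ratio of AUC is computed on the log scale and compared with the $0.8$ to $1.25$ limits.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated AUC values for a parallel-group design (log-normal PK data).
auc_test = rng.lognormal(mean=4.00, sigma=0.25, size=24)
auc_ref  = rng.lognormal(mean=4.05, sigma=0.25, size=24)

# Work on the log scale: the difference of means of log(AUC) estimates
# the log of the geometric mean ratio (GMR).
x, y = np.log(auc_test), np.log(auc_ref)
diff = x.mean() - y.mean()
se = np.sqrt(x.var(ddof=1) / x.size + y.var(ddof=1) / y.size)

# Welch-Satterthwaite degrees of freedom.
df = se**4 / ((x.var(ddof=1) / x.size)**2 / (x.size - 1)
              + (y.var(ddof=1) / y.size)**2 / (y.size - 1))

# 90% CI for the GMR (equivalent to TOST at the 0.05 level).
tcrit = stats.t.ppf(0.95, df)
ci = np.exp([diff - tcrit * se, diff + tcrit * se])

bioequivalent = 0.8 <= ci[0] and ci[1] <= 1.25
print(f"90% CI for GMR: [{ci[0]:.3f}, {ci[1]:.3f}]  bioequivalent: {bioequivalent}")
```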


Author(s): Lars Forsberg, Bo Jonsson, Ulf Kristiansson

2008, Vol 13 (2), pp. 199-220
Author(s): John R. Skalski, Richard L. Townsend, Lyman L. McDonald, John W. Kern, Joshua J. Millspaugh
