Comments on ‘Sample size for equivalence trials: a case study from a vaccine lot consistency trial’ by J. Ganju, A. Izu and A. Anemona

2012 ◽  
Vol 31 (15) ◽  
pp. 1652-1653 ◽  
Author(s):  
Xiao Sun ◽  
Xiaoming Li ◽  
Joshua Chen
2008 ◽  
Vol 27 (19) ◽  
pp. 3743-3754 ◽  
Author(s):  
Jitendra Ganju ◽  
Allen Izu ◽  
Alessandra Anemona

2021 ◽  
pp. 096228022098857
Author(s):  
Yongqiang Tang

Log-rank tests have been widely used to compare two survival curves in biomedical research. We describe a unified approach to power and sample size calculation for the unweighted and weighted log-rank tests in superiority, noninferiority and equivalence trials. It is suitable for both time-driven and event-driven trials. A numerical algorithm is suggested. It allows flexible specification of the patient accrual distribution, baseline hazards, and proportional or nonproportional hazards patterns, and enables efficient sample size calculation when there are a range of choices for the patient accrual pattern and trial duration. A confidence interval method is proposed for the trial duration of an event-driven trial. We point out potential issues with several popular sample size formulae. Under proportional hazards, the power of a survival trial is commonly believed to be determined by the number of observed events. The belief is roughly valid for noninferiority and equivalence trials with similar survival and censoring distributions between two groups, and for superiority trials with balanced group sizes. In unbalanced superiority trials, the power depends also on other factors such as data maturity. Surprisingly, the log-rank test usually yields slightly higher power than the Wald test from the Cox model under proportional hazards in simulations. We consider various nonproportional hazards patterns induced by delayed effects, cure fractions, and/or treatment switching. Explicit power formulae are derived for the combination test that takes the maximum of two or more weighted log-rank tests to handle uncertain nonproportional hazards patterns. Numerical examples are presented for illustration.
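The abstract does not reproduce the paper's unified algorithm, but the classical Schoenfeld approximation it generalizes can be sketched briefly. The snippet below is a minimal illustration of that standard approximation for a two-sided superiority test under proportional hazards, not the authors' method; the function names are ours.

```python
from statistics import NormalDist
from math import log, sqrt

def schoenfeld_events(hr, alpha=0.05, power=0.8, p1=0.5):
    """Required number of events for a two-sided log-rank test under
    proportional hazards (Schoenfeld approximation).
    hr: hazard ratio; p1: allocation fraction in group 1."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    z_b = nd.inv_cdf(power)
    p2 = 1 - p1
    return (z_a + z_b) ** 2 / (p1 * p2 * log(hr) ** 2)

def logrank_power(events, hr, alpha=0.05, p1=0.5):
    """Approximate power of the log-rank test given a fixed event count,
    illustrating why power is driven by events, not sample size."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    p2 = 1 - p1
    return nd.cdf(sqrt(events * p1 * p2) * abs(log(hr)) - z_a)
```

For a hazard ratio of 0.7 with balanced allocation, roughly 247 events give 80% power; as the abstract notes, this event-driven reasoning breaks down for unbalanced superiority trials, where data maturity also matters.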


2006 ◽  
Vol 32 (3) ◽  
pp. 212-222 ◽  
Author(s):  
Isola Ajiferuke ◽  
Dietmar Wolfram ◽  
Felix Famoye

2018 ◽  
Vol 181 ◽  
pp. 36-44
Author(s):  
Fernando Lucas dos Santos Peixoto de Villanova ◽  
Ana Carolina Chieregati

Author(s):  
Steven Turek ◽  
Sam Anand

When a cylindrical datum feature is specified at maximum material condition (MMC) or least material condition (LMC), a unique circumstance arises: a virtual condition (VC) cylindrical boundary must be defined [1]. The geometric relationship between a cylindrical point cloud obtained from inspection equipment and a VC cylinder has not been specifically addressed in previous research. In this research, novel approaches to this geometric analysis are presented, analyzed, and validated. Two of the proposed methods are new interpretations of established methods applied to this unique geometric circumstance: least squares and the maximum inscribing cylinder (MIC) or minimum circumscribing cylinder (MCC). The third method, the Hull Normal method, is a novel approach specifically developed to address the VC cylinder problem. Each of the proposed methods utilizes a different amount of sampled data, leading to various levels of sensitivity to sample size and error. The three methods were applied to different cylindrical forms, utilizing various sampling techniques and sample sizes. Trends across sample size were analyzed to assess the variation in axial orientation when compared to the true geometric form, and a relevant case study explores the applicability of these methods in real-world applications.
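The paper's least-squares, MIC/MCC, and Hull Normal formulations are not given in the abstract. As a rough stand-in for a least-squares axis fit, the axis direction of a cylindrical point cloud can be approximated by the dominant eigenvector of its covariance matrix; the sketch below (our illustration, not the authors' algorithm) computes it with power iteration in pure Python.

```python
def principal_axis(points, iters=200):
    """Approximate the axis direction of a cylindrical point cloud as the
    dominant eigenvector of its 3x3 covariance matrix (power iteration).
    points: iterable of (x, y, z) tuples. Returns a unit vector."""
    pts = list(points)
    n = len(pts)
    c = [sum(p[i] for p in pts) / n for i in range(3)]
    # 3x3 covariance matrix of the centered points
    cov = [[sum((p[i] - c[i]) * (p[j] - c[j]) for p in pts) / n
            for j in range(3)] for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

This captures only axial orientation; relating the fitted axis to a VC boundary, and the sample-size sensitivity the paper studies, require the full methods described in the article.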


2014 ◽  
Vol 26 (5) ◽  
pp. 499-509 ◽  
Author(s):  
Uche Nwabueze

Purpose – The purpose of this paper is to delineate the factors responsible for the decline of total quality management (TQM) in the National Health Service (NHS). It is suggested that if these factors had been identified and eliminated prior to implementation, the decline of TQM as a strategy for improving the provision and delivery of quality patient care could have been prevented. Design/methodology/approach – The case study approach was chosen because it is the preferred method when "how" or "what" questions are being posed. It is applicable here because the researcher has little control over events and the focus is on a contemporary phenomenon within a real-life context. The case study enables the researcher to give an accurate rendition of actual events; it contributes uniquely to the knowledge of individual, organisational, social, and political phenomena. The semi-structured face-to-face interview constituted the main data collection technique of the research. Interviews were held with 23 quality management managers in the British NHS. The central focus of the interview was on "what" factors contributed to the rapid decline of TQM in the NHS. The respondents were chosen because they were directly involved with the implementation of TQM. They were in the vantage position to offer full insight into the TQM initiative. The analysis of the case is based on Yin's analytic technique of explanation building. Findings – The decline of TQM in the NHS could have been prevented if top executives in hospitals had adopted the sequential steps to quality improvement. In the author's opinion, landing a man on the moon required a belief in the possibility and a breakthrough in attitudes that viewed space travel as pure science fiction rather than a practical reality, and so it should have been with TQM in the NHS. However, the attitude of many NHS managers was that TQM was all right for "other institutions" because "they need it" whereas in the NHS, "we don't". This negative attitude had to be overcome if TQM was to be accepted as a corporate, all-encompassing philosophy. Research limitations/implications – The limitation of the research may be the sample size, which was limited to 23 quality managers who had hands-on experience and the leadership role to lead and implement TQM in the NHS. Future research may consider a broader sample, and may use surveys to identify a broader set of reasons why TQM declined in the NHS. Practical implications – This paper is the first constructive insight into the reasons for the decline of TQM in the NHS drawn from the individuals who had sole responsibility for implementation. Any other group would have amounted to hearsay. Therefore, to constructively delineate the reasons for failure, it was pertinent to learn from the quality managers directly and to ensure that the reasons were representative of their experiences with TQM. The practical implication is to prepare future managers to avoid failure. Originality/value – The paper clearly suggests the systematic process required for effective implementation of TQM in a healthcare setting by identifying factors that must be avoided to ensure the successful and sustainable implementation of TQM.


Author(s):  
Zheng Li ◽  
Robert Kluger ◽  
Xianbiao Hu ◽  
Yao-Jan Wu ◽  
Xiaoyu Zhu

The primary objective of this study was to increase the sample size of public probe vehicle-based arterial travel time estimation. The complete methodology for increasing sample size using incomplete trajectories was built on a k-Nearest Neighbors (k-NN) regression algorithm. The virtual travel time of an incomplete trajectory was represented by similar complete trajectories. Because incomplete trajectories were not used to calculate travel time in previous studies, the sample size of travel time estimation can be increased without collecting extra data. A case study was conducted on a major arterial in the city of Tucson, Arizona, covering 13 links. In the case study, probe vehicle data were collected from a smartphone application used for navigation and guidance. The case study showed that the method could significantly increase link travel time samples, although limitations remain. In addition, sensitivity analysis was conducted using leave-one-out cross-validation to verify the performance of the k-NN model under different parameters and input data. The data analysis showed that the algorithm performed differently under different parameters and input data. Our study suggests that optimal parameters should be selected using a historical dataset before real-world application.
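The core idea, imputing a link travel time for an incomplete trajectory from its nearest complete neighbors, can be sketched compactly. The feature representation below (a simple numeric vector per trip) and the unweighted average are our assumptions for illustration; the paper's actual features and distance metric are not given in the abstract.

```python
def knn_travel_time(partial_feat, complete_trips, k=3):
    """Estimate a link travel time for an incomplete trajectory as the
    mean travel time of its k most similar complete trajectories.
    partial_feat: feature vector of the incomplete trajectory.
    complete_trips: list of (feature_vector, travel_time) pairs."""
    def dist(a, b):
        # Euclidean distance between feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(complete_trips, key=lambda t: dist(partial_feat, t[0]))[:k]
    return sum(t[1] for t in nearest) / len(nearest)
```

In this framing, the leave-one-out sensitivity analysis the abstract describes amounts to dropping each complete trip in turn, re-estimating it from the remainder, and comparing the estimate to its known travel time for various k.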

