Testing the Fit of Regression Models Estimated with Extremely Small Samples: Application in Pharmaceutical Stability Studies

Author(s):  
Máté Mihalovits ◽  
Sándor Kemény

Pharmaceutical stability studies are conducted to estimate the shelf life, i.e. the period during which the drug product maintains its identity and stability. In the evaluation process, a regression curve is fitted to the data obtained during the study and the shelf life is determined from the fitted curve. The evaluation procedure suggested by the ICH considers only the case in which the true relationship between the measured attribute and time is linear. However, no method is suggested to help the practitioner decide whether the linear model is appropriate for their dataset. This is a major problem, as a falsely selected model may distort the estimated shelf life to a great extent, resulting in unreliable quality control. The difficulty of detecting model misspecification in stability studies is that very few observations are available, and the conventional methods applied for model verification may not be appropriate or efficient at such small sample sizes. In this paper, this problem is addressed and newly developed methods are proposed to detect model misspecification. The methods can be applied to any process where regression estimation is performed on independent small samples. Besides stability studies, the frequently performed construction of single calibration curves for an analytical measurement is another case where the methods may be applied. It is shown that our methods are statistically appropriate, and some of them detect model misspecification with high efficiency in simulated situations resembling pre-approval and post-approval stability studies.
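To make the conventional baseline concrete, the sketch below applies a classical lack-of-fit F-test to hypothetical stability data with replicate assays at each time point. It is not the authors' proposed method (the abstract's point is that such conventional checks can be inefficient with so few observations), and the data are invented for illustration.

```python
# Minimal sketch: classical lack-of-fit F-test for a straight-line
# stability model, assuming replicate assay values at each time point.
# Data below are hypothetical, not from the study.
import numpy as np
from scipy import stats

# Assay (% label claim) at months 0-12, two replicates per time point.
time = np.array([0, 0, 3, 3, 6, 6, 9, 9, 12, 12], dtype=float)
assay = np.array([100.1, 99.8, 99.2, 99.5, 98.1, 98.4, 96.2, 96.6, 93.5, 93.9])

# Fit the linear model assay = b0 + b1 * time by least squares.
b1, b0 = np.polyfit(time, assay, 1)
fitted = b0 + b1 * time
sse = np.sum((assay - fitted) ** 2)           # residual sum of squares

# Pure-error SS: variation of replicates around their time-point means.
levels = np.unique(time)
sspe = sum(np.sum((assay[time == t] - assay[time == t].mean()) ** 2)
           for t in levels)
sslof = sse - sspe                            # lack-of-fit SS

df_lof = len(levels) - 2                      # m time points minus 2 line parameters
df_pe = len(time) - len(levels)               # n minus m
F = (sslof / df_lof) / (sspe / df_pe)
p = stats.f.sf(F, df_lof, df_pe)
print(f"lack-of-fit F = {F:.2f}, p = {p:.3f}")  # small p suggests the line misfits
```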

PEDIATRICS ◽  
1989 ◽  
Vol 83 (3) ◽  
pp. A72-A72
Author(s):  
Student

The believer in the law of small numbers practices science as follows:

1. He gambles his research hypotheses on small samples without realizing that the odds against him are unreasonably high. He overestimates power.
2. He has undue confidence in early trends (e.g., the data of the first few subjects) and in the stability of observed patterns (e.g., the number and identity of significant results). He overestimates significance.
3. In evaluating replications, his or others', he has unreasonably high expectations about the replicability of significant results. He underestimates the breadth of confidence intervals.
4. He rarely attributes a deviation of results from expectations to sampling variability, because he finds a causal "explanation" for any discrepancy. Thus, he has little opportunity to recognize sampling variation in action.

His belief in the law of small numbers, therefore, will forever remain intact.
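Point 1 can be made concrete with a small simulation: the actual power of a two-sample t-test with 10 subjects per group and a medium true effect. The numbers are illustrative, not from the article.

```python
# Simulated power of a two-sample t-test, n = 10 per group, true effect
# d = 0.5 (figures are illustrative placeholders).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
d, n, reps = 0.5, 10, 20_000
hits = 0
for _ in range(reps):
    a = rng.normal(0.0, 1.0, n)
    b = rng.normal(d, 1.0, n)
    if stats.ttest_ind(a, b).pvalue < 0.05:
        hits += 1
# Prints roughly 0.18: the odds really are unreasonably high against him.
print(f"simulated power approx. {hits / reps:.2f}")
```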


2021 ◽  
Vol 2 (1) ◽  
pp. 148-179
Author(s):  
Mohammad Jahangir Hossain Mojumder

Nowadays, demand is growing for outcome-based and transferable learning, particularly in higher education. As the terminal stage of formal schooling, higher education requires teachers to facilitate pupils' achievement of real-life problem-solving skills. To this end, this qualitative research employs a case study approach, which is suitable for examining an event with small samples, and a phenomenological method to analyze respondents' perceptions and activities thematically and descriptively to assess changes. In-depth interviews, focus group discussions, and class observations were used to collect data from two selected colleges to examine the extent of professional development and of the methodological shift in teaching produced by training intended to promote active learning strategies for better learning outcomes. The data reveal that the selected flagship training program offers a bundle of pedagogical methods (not need-based) to imbibe, and reject the idea that the nationally arranged training is a successful effort to increase trainees' knowledge and skills and to polish attitudes, beyond disseminating a few concepts superficially. Moreover, trainees lack the motivation to shift their teaching habits and are unconvinced that applying the newly learned strategies will transform anything. Likewise, they are discontented with the training contents and unenthusiastic, holding somewhat unfavorable opinions about the training procedures and trainers. Therefore, the results suggest limited or no significant professional development and modification of teaching practice; rather, teachers continue with the conventional teacher-centered method, and the effort remains insufficient, extraneous, 'fragmented', and 'intellectually superficial'. Additionally, at the colleges, large class sizes, inappropriate seating arrangements, pervasive traditionality, absenteeism, and other analogous challenges limit teachers' ability to change their practice. Considering all this, the study suggests that alterations should be initiated at the micro level (teachers and colleges) and the macro level (training providers and policymakers) to offer tailor-made, autonomous, and need-based training. Last but not least, this endeavor is limited by being entirely qualitative, by its small sample size, and by not eliciting the views of any trainers or policymakers, which indicates points of departure for future study.


Foods ◽  
2020 ◽  
Vol 9 (7) ◽  
pp. 919
Author(s):  
Rita Maio ◽  
Juan García-Díez ◽  
Cristina Saraiva

Currently, food waste represents an important issue due to its negative economic, social and environmental impact. To reduce food waste levels, some retail brands discount products approaching their expiry date. Since this practice may involve a potential risk of food poisoning, a total of 94 food products of animal origin, purchased on their expiry date in two supermarkets in northern Portugal, were analyzed for selected foodborne and spoilage microorganisms. Moreover, the samples were classified as satisfactory or not satisfactory according to their microbiological quality. The results showed that no sample presented counts of Salmonella spp., S. aureus or B. cereus. L. monocytogenes was detected in one sample above the limit of 2 log cfu/g defined by Regulation 2073/2005. The evaluation of food hygiene and spoilage indicators showed that processed foods displayed lower counts than raw products (beef, pork, chicken and fish). Regarding Enterobacteriaceae, raw products presented counts on average more than 2 log cfu/g higher than processed foods, with the exception of beef samples, which were more than 3 log cfu/g higher than processed foods. In addition, E. coli was mainly detected in fresh meat, of which chicken and pork displayed the highest counts. Regarding the qualitative classification, 51.06% of the samples were not satisfactory for total mesophilic counts, while 62.76% and 58.51% were not satisfactory for the Enterobacteriaceae and the molds and yeasts (M&Y) criteria, respectively. In all, 70.21% of the samples analyzed at the expiry date failed at least one microbiological criterion. The results indicate that foods available at the end of their shelf life in supermarkets do not represent a risk of food poisoning, given the absence of foodborne pathogens. However, since the microbiological indicators of storage/handling of raw products were mainly unsatisfactory, the sale of these perishable foods at the end of their shelf life may not be advisable. On the other hand, processed products subjected to preservation procedures (i.e., thermal processing) could be sold at the end of their shelf life, or donated beyond the best-before date, owing to their physical, chemical and microbiological stability. However, evidence of foodborne outbreaks associated with this kind of foodstuff indicates the need for a proper risk assessment. Moreover, it is important to remark that other factors, such as the small sample size, the lack of evaluation of handling and storage conditions along the food chain, and organoleptic alterations, must be assessed in further studies.
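The satisfactory/not-satisfactory classification can be sketched as a simple per-criterion check of log10 cfu/g counts against limits. Apart from the 2 log cfu/g L. monocytogenes limit cited above from Regulation 2073/2005, the limits and counts below are hypothetical placeholders.

```python
# Illustrative per-criterion check; only the L. monocytogenes limit is
# taken from the abstract, the other limits are hypothetical.
import numpy as np

limits_log_cfu_g = {
    "L. monocytogenes": 2.0,    # Regulation (EC) 2073/2005
    "mesophilic_count": 6.0,    # hypothetical spoilage-indicator limit
    "Enterobacteriaceae": 4.0,  # hypothetical
}

def failed_criteria(sample_counts):
    """Return the criteria a sample fails, given log10 cfu/g counts."""
    return [m for m, limit in limits_log_cfu_g.items()
            if sample_counts.get(m, -np.inf) > limit]

failed = failed_criteria({"L. monocytogenes": 1.2, "mesophilic_count": 6.8})
print("not satisfactory:" if failed else "satisfactory", failed)
```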


Author(s):  
J. Mullaert ◽  
M. Bouaziz ◽  
Y. Seeleuthner ◽  
B. Bigio ◽  
J-L. Casanova ◽  
...  

Many methods for rare variant association studies require permutations to assess the significance of tests. Standard permutations assume that all individuals are exchangeable and do not take population stratification (PS), a known confounding factor in genetic studies, into account. We propose a novel strategy, LocPerm, in which individuals are permuted only with their closest ancestry-based neighbors. We performed a simulation study, focusing on small samples, to evaluate and compare LocPerm with standard permutations and with classical adjustment on the first principal components. Under the null hypothesis, LocPerm was the only method providing an acceptable type I error, regardless of sample size and level of stratification. The power of LocPerm was similar to that of standard permutations in the absence of PS, and remained stable under different PS scenarios. We conclude that LocPerm is a method of choice for taking PS and/or small sample size into account in rare variant association studies.
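The core idea, permuting phenotype labels only among ancestry-based nearest neighbors in principal-component space, can be sketched as follows. The neighborhood size and swapping scheme here are assumptions for illustration, not the authors' exact algorithm.

```python
# Minimal sketch of a local permutation in the spirit of LocPerm:
# labels are shuffled only within k-nearest-neighbor sets in PC space.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_permutation(phenotype, pcs, k=10, rng=None):
    """Return one locally permuted copy of `phenotype` (np.ndarray)."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(phenotype)
    nn = NearestNeighbors(n_neighbors=k).fit(pcs)
    _, idx = nn.kneighbors(pcs)          # idx[i] = i's k nearest neighbors
    permuted = phenotype.copy()
    for i in rng.permutation(n):
        j = rng.choice(idx[i])           # swap i with a random close neighbor
        permuted[i], permuted[j] = permuted[j], permuted[i]
    return permuted

# Usage sketch: build a permutation null for an association statistic
# (`assoc_stat` is a placeholder for your own test statistic):
# null = [assoc_stat(genotypes, local_permutation(y, pcs))
#         for _ in range(1000)]
```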


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Vahid Ebrahimi ◽  
Zahra Bagheri ◽  
Zahra Shayan ◽  
Peyman Jafari

Assessing differential item functioning (DIF) using the ordinal logistic regression (OLR) model depends strongly on the asymptotic sampling distribution of the maximum likelihood (ML) estimators. The ML estimation method, which is often used to estimate the parameters of the OLR model for DIF detection, may be substantially biased with small samples. This study proposes a new application of the elastic net regularized OLR model, a special type of machine learning method, for assessing DIF between two groups with small samples. Accordingly, a simulation study was conducted to compare the power and type I error rates of the regularized and nonregularized OLR models in detecting DIF under various conditions, including moderate and severe magnitudes of DIF (DIF = 0.4 and 0.8), sample size (N), sample size ratio (R), scale length (I), and weighting parameter (w). The simulation results revealed that for I = 5, and regardless of R, the elastic net regularized OLR model with w = 0.1, compared with the nonregularized OLR model, increased the power of detecting moderate uniform DIF (DIF = 0.4) by approximately 35% and 21% for N = 100 and 150, respectively. Moreover, for I = 10 and severe uniform DIF (DIF = 0.8), the average power of the elastic net regularized OLR model with 0.03 ≤ w ≤ 0.06, compared with the nonregularized OLR model, increased by approximately 29.3% and 11.2% for N = 100 and 150, respectively. In these cases, the type I error rates of the regularized and nonregularized OLR models were below or close to the nominal level of 0.05. In general, this simulation study showed that the elastic net regularized OLR model outperformed the nonregularized OLR model, especially in extremely small sample size groups. Furthermore, the present research provides a guideline and some recommendations for researchers who conduct DIF studies with small sample sizes.
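As a rough illustration of the approach, the sketch below fits a proportional-odds (OLR) model with an elastic net penalty on the group (DIF) coefficient by direct numerical optimization. The parameterization, the placement of the penalty, and the toy data are assumptions for illustration, not the authors' implementation.

```python
# Elastic-net-penalized proportional-odds model for uniform DIF screening,
# assuming logit P(Y <= k) = alpha_k - (b_theta*theta + b_group*group) and
# penalty lam * (w*|b_group| + (1-w)/2 * b_group**2).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_penalized_loglik(params, y, X, n_cat, lam, w):
    # First n_cat-1 entries encode increasing cutpoints; the rest are slopes.
    raw, betas = params[:n_cat - 1], params[n_cat - 1:]
    alphas = np.concatenate([[raw[0]], raw[0] + np.cumsum(np.exp(raw[1:]))])
    eta = X @ betas
    cum = expit(alphas[None, :] - eta[:, None])          # P(Y <= k)
    cum = np.hstack([np.zeros((len(y), 1)), cum, np.ones((len(y), 1))])
    p = np.clip(cum[np.arange(len(y)), y + 1] - cum[np.arange(len(y)), y],
                1e-12, 1.0)
    pen = lam * (w * abs(betas[1]) + 0.5 * (1 - w) * betas[1] ** 2)
    return -np.log(p).sum() + pen

# Toy data: trait theta, group indicator, 5-category item response.
rng = np.random.default_rng(1)
n = 150
theta = rng.normal(size=n)
group = rng.integers(0, 2, size=n)
X = np.column_stack([theta, group])
y = np.digitize(theta + 0.4 * group + rng.logistic(size=n),
                [-1.5, -0.5, 0.5, 1.5])                  # values 0..4

n_cat = 5
x0 = np.zeros(n_cat - 1 + X.shape[1])
res = minimize(neg_penalized_loglik, x0, args=(y, X, n_cat, 2.0, 0.1),
               method="Nelder-Mead", options={"maxiter": 5000})
print(f"penalized group (DIF) effect: {res.x[-1]:.3f}")
```

Nelder-Mead is used here because the L1 term makes the objective non-smooth; a proximal or coordinate-descent solver would be the more standard choice at scale.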


1967 ◽  
Vol 11 ◽  
pp. 185-190
Author(s):  
George H. Glade

Manufacture of reed switches, critical components in present-day data-processing support devices, requires a means of accurate, rapid analysis of the elements used in plating the levers of the switch. Because of greatly reduced feedback time, X-ray spectroscopy has replaced metallographic sectioning and optical measurement as a plating-thickness control method. While 6 hr were required to obtain thickness data for a given sample size by sectioning, X-ray spectroscopy requires only 2 hr, which permits better control of the plating operation. X-ray spectroscopy is now used routinely to control both gold and rhodium plating thicknesses in the 20- to 100-μin. (1 μin. = 1 × 10−6 in.) thickness range. The large number of samples prevents long count durations, while the small sample size (0.110 by 0.033 in.) reduces the precision of the analysis. However, the precision of the X-ray and optical methods is approximately the same, 8% variance. X-ray accuracy is comparable to that of sectioning, since the standards are obtained by sectioning. Simplicity of operation is required, since relatively untrained operators are used. An aperture system is used to reduce background. The rhodium thickness measurement is obtained from the gross rhodium intensity. Attenuation of the gross nickel intensity from the base material was found to be a better measure of gold thickness. Calibration for both gold and rhodium is performed using the same wide detector conditions. The choice of analysis is made by changing only the 2θ angle, thus avoiding the time required for recalibration when changing analyses.
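The attenuation-based gold measurement lends itself to a short worked example. The sketch below assumes a Beer-Lambert relation I = I0·exp(−μt) between gold thickness and the gross nickel intensity, with μ calibrated from sectioned standards; all counts and thicknesses are hypothetical.

```python
# Gold thickness inferred from attenuation of gross Ni intensity by the
# gold layer, I = I0 * exp(-mu * t). Calibration numbers are hypothetical.
import numpy as np

# Standards: gold thickness (micro-inches, by sectioning) vs gross Ni counts.
t_std = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
ni_counts = np.array([9050.0, 8190.0, 7420.0, 6700.0, 6080.0])

# Linearize: ln(I) = ln(I0) - mu * t, then fit by least squares.
slope, ln_i0 = np.polyfit(t_std, np.log(ni_counts), 1)
mu = -slope

def gold_thickness(ni_measured):
    """Estimate gold thickness (micro-inches) from a gross Ni count."""
    return (ln_i0 - np.log(ni_measured)) / mu

print(f"estimated thickness: {gold_thickness(7800.0):.1f} micro-inches")
```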


2020 ◽  
pp. 001316442095806
Author(s):  
Shiyang Su ◽  
Chun Wang ◽  
David J. Weiss

S-X² is a popular item fit index that is available in commercial software packages such as flexMIRT. However, no research has systematically examined the performance of S-X² for detecting item misfit within the context of the multidimensional graded response model (MGRM). The primary goal of this study was to evaluate the performance of S-X² under two practical misfit scenarios: first, all items are misfitting due to model misspecification, and second, a small subset of items violates the underlying assumptions of the MGRM. Simulation studies showed that caution should be exercised when reporting the item fit of polytomous items using S-X² within the context of the MGRM, because of its inflated false positive rates (FPRs), especially with a small sample size and a long test. S-X² performed well when detecting overall model misfit, as well as item misfit for a small subset of items, when the ordinality assumption was violated. However, under a number of conditions of model misspecification, or of items violating the homogeneous discrimination assumption, even though the true positive rates (TPRs) of S-X² were high when a small sample size was coupled with a long test, the inflated FPRs were generally directly related to the increasing TPRs. There was also a suggestion that the performance of S-X² is affected by the magnitude of misfit within an item. There was no evidence that FPRs for fitting items were exacerbated by the presence of a small percentage of misfitting items among them.
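For readers unfamiliar with how FPRs and TPRs are tabulated in such simulations, the sketch below flags an item whenever its fit p-value falls below α = 0.05 and averages the flags over replications. The p-values here are placeholders, not output of the study.

```python
# FPR/TPR tabulation over simulation replications (placeholder p-values).
import numpy as np

alpha = 0.05
rng = np.random.default_rng(2)
# Rows are replications; columns are items.
p_fit = rng.uniform(size=(500, 18))            # well-fitting items: uniform p
p_misfit = rng.beta(0.3, 1.0, size=(500, 2))   # misfitting items: small p

fpr = (p_fit < alpha).mean()     # share of fitting items wrongly flagged
tpr = (p_misfit < alpha).mean()  # share of misfitting items correctly flagged
print(f"FPR = {fpr:.3f}, TPR = {tpr:.3f}")
```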


2019 ◽  
Author(s):  
Mary Ann Vega ◽  
Jesse Holzman ◽  
Ningning Zhao ◽  
Barbara Risman

In a response to Spencer Garrison's 2018 article "On the Limits of 'Trans Enough': Authenticating Trans Identity Narratives," we argue that, despite the author's contributions to the theoretical understanding of non-binary identities, there are serious issues with the paper. We challenge Garrison's assertion that non-binary individuals reproduce the gender binary in that they are more likely than binary trans individuals to use essentialist narratives to construct their gender identity. Owing to Garrison's criteria for participation in his study, all participants identified as trans and were in the process of transitioning. However, not all non-binary individuals identify as trans, and consequently Garrison's theorizing from a small sample of trans-identified non-binary folks is problematic. We argue that the small sample size and Garrison's recruitment strategy do not capture the wide range of individuals who identify as non-binary, and that this small sample is not enough for Garrison to generalize about non-binary-identified folks. Gender identity is a complex issue, and while we commend Garrison for his contribution to the field, we argue that such work should be done with diligence to ensure that research does not render identities invisible in early attempts to theorize from small samples.


2022 ◽  
Author(s):  
Mia S. Tackney ◽  
Tim Morris ◽  
Ian White ◽  
Clemence Leyrat ◽  
Karla Diaz-Ordaz ◽  
...  

Adjustment for baseline covariates in randomized trials has been shown to lead to gains in power and can protect against chance imbalances in covariates. For continuous covariates, there is a risk that the form of the relationship between the covariate and the outcome is misspecified when taking an adjusted approach. Using a simulation study focusing on small to medium-sized individually randomized trials, we explore whether a range of adjustment methods are robust to misspecification, either of the covariate-outcome relationship or through an omitted covariate-treatment interaction. Specifically, we aim to identify potential settings where G-computation, Inverse Probability of Treatment Weighting (IPTW), Augmented Inverse Probability of Treatment Weighting (AIPTW) and Targeted Maximum Likelihood Estimation (TMLE) offer improvement over the commonly used Analysis of Covariance (ANCOVA). Our simulations show that all adjustment methods are generally robust to model misspecification if adjusting for a few covariates, the sample size is 100 or larger, and there are no covariate-treatment interactions. When there is a non-linear interaction of treatment with a skewed covariate and the sample size is small, all adjustment methods can suffer from bias; however, methods that allow for interactions (such as G-computation with interaction and IPTW) show improved results compared to ANCOVA. When there is a large number of covariates to adjust for, ANCOVA retains good properties while other methods suffer from under- or over-coverage. An outstanding issue for G-computation, IPTW and AIPTW in small samples is that standard errors are underestimated; the development of small-sample corrections is needed.
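As a concrete, hedged illustration of two of the estimators compared, the sketch below implements G-computation with a treatment-covariate interaction and a simple IPTW estimator for a continuous outcome in a simulated small trial. It illustrates the estimators themselves, not the authors' simulation code, and the data-generating values are placeholders.

```python
# G-computation (with interaction) and IPTW for a continuous outcome in
# a small randomized trial; simulated placeholder data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 100
x = rng.exponential(size=n)                  # a skewed baseline covariate
a = rng.integers(0, 2, size=n)               # randomized treatment
y = 1.0 + 0.5 * a + 0.8 * x + 0.6 * a * x + rng.normal(size=n)

# G-computation: fit an outcome model with interaction, then average the
# predicted difference Y(1) - Y(0) over the sample.
X = sm.add_constant(np.column_stack([a, x, a * x]))
fit = sm.OLS(y, X).fit()
X1 = sm.add_constant(np.column_stack([np.ones(n), x, x]))           # set a = 1
X0 = sm.add_constant(np.column_stack([np.zeros(n), x, np.zeros(n)]))  # set a = 0
gcomp = (fit.predict(X1) - fit.predict(X0)).mean()

# IPTW: weight by the inverse of the estimated propensity score; in a
# randomized trial the true score is known, but estimating it can still
# improve precision.
ps = sm.Logit(a, sm.add_constant(x)).fit(disp=0).predict()
w = a / ps + (1 - a) / (1 - ps)
iptw = np.average(y, weights=a * w) - np.average(y, weights=(1 - a) * w)

print(f"G-computation ATE: {gcomp:.3f}, IPTW ATE: {iptw:.3f}")
```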


Water ◽  
2020 ◽  
Vol 12 (9) ◽  
pp. 2422
Author(s):  
Ayodeji O. Adegun ◽  
Thompson A. Akinnifesi ◽  
Isaac A. Ololade ◽  
Rosa Busquets ◽  
Peter S. Hooda ◽  
...  

The Owena River Basin in Nigeria is an area of agricultural importance for the production of cocoa. To optimise crop yield, the cocoa trees require spraying with neonicotinoid insecticides (Imidacloprid, Thiacloprid, Acetamiprid and Thiamethoxam). It is proposed that rainwater runoff from the treated area may pollute the Owena River and that these pesticides may thereby enter the human food chain via six species of fish (Clarias gariepinus, Clarias anguillaris, Sarotherodon galilaeus, Parachanna obscura, Oreochromis niloticus and Gymnarchus niloticus) which are cultured in the river, mostly for local consumption. This work aims to establish a working method to quantify the likely levels of the insecticides in the six species of fish: firstly, by undertaking a laboratory-based study employing the QuEChERS method to extract the four neonicotinoids from fish purchased in a marketplace in the UK and spiked with known quantities of the pesticides, using liquid chromatography coupled with tandem mass spectrometry (LC-MS-MS) as the detection method; secondly, by using these samples to optimise the detection method for very low levels of pesticides, then applying the optimised techniques to the analysis of three individuals of each of the six species of fish taken from the Owena River. A significant benefit of this combined technique is that only small samples of fish are required. Success with this part of the study showed that very low concentrations of the insecticides could be detected in fish muscle. The third aim is to apply a simple quantitative risk assessment model using the data sets obtained, together with information about daily diet, human body weight and recommended safety limits of pesticides in food, to illustrate how human health may be affected by the consumption of these fish. These multiple determinations of neonicotinoids in edible fish in Nigeria are pioneering research and fill a gap in addressing the relationship between waterborne pesticides and food quality in the country. Fundamentally, this work is an exercise to demonstrate the applicability of the aforementioned instrumental method of analysis to fish muscle, which requires only a small sample of fish; a large number of fish is not required for a proof of concept in this case. Although not a monitoring programme for the whole Owena River Basin ecosystem per se, this work successfully demonstrates the technical feasibility of a system of chemical analysis and establishes the foundation for ecological surveys in the immediate future. Parameters involving exposures to xenobiotics in ecotoxicological modelling can now be expressed in terms of both mass and molar concentrations of a chemical in animal tissues, if so desired.
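The risk-assessment arithmetic mentioned as the third aim can be sketched briefly: an estimated daily intake (EDI) is computed from residue concentration, daily fish consumption and body weight, then compared with an acceptable daily intake (ADI). Every number below is a hypothetical placeholder, not a result from the study.

```python
# EDI and hazard quotient for a pesticide residue in fish muscle.
# All values are hypothetical placeholders.
conc_mg_per_kg = 0.004       # residue in fish muscle (mg/kg)
intake_kg_per_day = 0.06     # daily fish consumption (kg/day)
body_weight_kg = 60.0
adi_mg_per_kg_bw = 0.06      # assumed ADI (mg/kg bw/day)

edi = conc_mg_per_kg * intake_kg_per_day / body_weight_kg   # mg/kg bw/day
hazard_quotient = edi / adi_mg_per_kg_bw                    # < 1 suggests low risk
print(f"EDI = {edi:.2e} mg/kg bw/day, HQ = {hazard_quotient:.2e}")
```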

