A Note on Sample Size Determination for Akaike Information Criterion (AIC) Approach to Clinical Data Analysis

2005 ◽  
Vol 34 (12) ◽  
pp. 2331-2343 ◽  
Author(s):  
Akifumi Yafune ◽  
Mamoru Narukawa ◽  
Makio Ishiguro

2009 ◽  
Vol 28 (4) ◽  
pp. 679-699 ◽  
Author(s):  
Kaifeng Lu ◽  
Devan V. Mehrotra ◽  
Guanghan Liu

2017 ◽  
Author(s):  
John Kitchener Sakaluk ◽  
Stephen David Short

Sexuality researchers frequently use exploratory factor analysis (EFA) to illuminate the distinguishable theoretical constructs assessed by a set of variables. EFA entails a substantial number of analytic decisions concerning sample size determination and how factors are extracted, rotated, and retained. The available analytic options, however, are not all equally empirically rigorous. In the present paper, we discuss the commonly available options for conducting EFA and identify which of them constitute best practices. We also present the results of a methodological review of the analytic options used by sexuality researchers in over 200 EFAs, published in more than 160 articles and chapters from 1974 to 2014. Our review reveals that the best practices for EFA are those least frequently used by sexuality researchers. We introduce freely available analytic resources to make it easier for sexuality researchers to adhere to best practices when conducting EFAs in their own research.
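One factor-retention best practice commonly recommended in this literature is parallel analysis, in which the eigenvalues of the observed correlation matrix are compared with eigenvalues obtained from random data of the same dimensions. The following is a minimal sketch of that idea in Python using only NumPy; it is not taken from the article, and the simulation settings and variable names are illustrative assumptions.

import numpy as np

def parallel_analysis(data, n_iter=100, percentile=95, seed=0):
    """Suggest how many factors to retain by comparing observed
    correlation-matrix eigenvalues with those from random data of
    the same shape (an illustrative sketch of one EFA best practice)."""
    rng = np.random.default_rng(seed)
    n_obs, n_vars = data.shape
    # Eigenvalues of the observed correlation matrix, largest first.
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    # Eigenvalues from repeated random (uncorrelated) data sets of the same size.
    rand_eig = np.empty((n_iter, n_vars))
    for i in range(n_iter):
        random_data = rng.standard_normal((n_obs, n_vars))
        rand_eig[i] = np.sort(
            np.linalg.eigvalsh(np.corrcoef(random_data, rowvar=False)))[::-1]
    threshold = np.percentile(rand_eig, percentile, axis=0)
    # Retain leading factors whose observed eigenvalue exceeds the random threshold.
    retained = 0
    for observed, cutoff in zip(obs_eig, threshold):
        if observed > cutoff:
            retained += 1
        else:
            break
    return retained

# Illustrative usage with simulated data: 300 respondents, 10 items, 2 latent factors.
rng = np.random.default_rng(1)
latent = rng.standard_normal((300, 2))
loadings = rng.uniform(0.5, 0.9, size=(2, 10))
items = latent @ loadings + rng.standard_normal((300, 10))
print("Suggested number of factors:", parallel_analysis(items))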


2017 ◽  
Author(s):  
Matthew McBee ◽  
Matthew Makel ◽  
Scott J. Peters ◽  
Michael S. Matthews

The ruinous consequences of currently accepted practices in study design and data analysis have revealed themselves in the low reproducibility of findings in fields such as psychology, medicine, biology, and economics. Because giftedness research relies on the same underlying statistical and sociological paradigms, it is likely that our field also suffers from poor reproducibility and an unreliable literature. This paper describes open science practices that will increase the rigor and trustworthiness of gifted education’s scientific processes and their associated findings: open data; open materials; and preregistration of hypotheses, design, sample size determination, and statistical analysis plans. Readers are directed to internet resources that facilitate open science.
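Preregistered sample size determination is usually justified by an a priori power analysis. The snippet below is a minimal sketch of that calculation for a two-group comparison of means, using the standard normal approximation; the effect size, alpha, and power values are illustrative assumptions rather than figures from the paper.

from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means (normal approximation to the t-test)."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for the two-sided test
    z_beta = norm.ppf(power)           # quantile corresponding to the desired power
    return 2 * ((z_alpha + z_beta) / effect_size) ** 2

# Example preregistration target: detect a medium effect (d = 0.5)
# with 80% power at alpha = .05 -> roughly 63 participants per group.
print(round(n_per_group(0.5)))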


1993 ◽  
Vol 32 (05) ◽  
pp. 365-372 ◽  
Author(s):  
T. Timmeis ◽  
J. H. van Bemmel ◽  
E. M. van Mulligen

Results are presented of the user evaluation of an integrated medical workstation for the support of clinical research. Twenty-seven users were recruited from the medical and scientific staff of the University Hospital Dijkzigt, the Faculty of Medicine of the Erasmus University Rotterdam, and other Dutch medical institutions; all were given a written, self-contained tutorial. Subsequently, an experiment was conducted in which six clinical data analysis problems had to be solved and an evaluation form was filled out. The aim of this user evaluation was to obtain insight into the benefits of integration for the support of clinical data analysis by clinicians and biomedical researchers. The problems were divided into two sets of progressively more complex problems. In the first set, users were guided in a stepwise fashion to solve the problems; in the second set, each stepwise problem had an open counterpart. During the evaluation, the workstation continuously recorded the user’s actions. From these results, significant differences became apparent between clinicians and non-clinicians in correctness (means 54% and 81%, respectively; p = 0.04), completeness (means 64% and 88%, respectively; p = 0.01), and number of problems solved (means 67% and 90%, respectively; p = 0.02). These differences were absent for the stepwise problems. Physicians tended to skip more problems than biomedical researchers did. No statistically significant differences were found between users with and without clinical data analysis experience in correctness (means 74% and 72%, respectively; p = 0.95) or completeness (means 82% and 79%, respectively; p = 0.40). It appeared that various clinical research problems can be solved easily with the support of the workstation; the results of this experiment can guide the development of the successor to this prototype workstation and serve as a reference for the assessment of future versions.
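The abstract does not state which statistical test produced the reported p-values; one plausible reading is a two-group comparison of per-user scores, sketched below with Welch's t-test. The group labels and scores are hypothetical stand-ins, not data from the study.

from scipy.stats import ttest_ind

# Hypothetical per-user correctness fractions; the study's per-user data are not given.
clinicians     = [0.40, 0.50, 0.60, 0.50, 0.70, 0.55, 0.60]
non_clinicians = [0.80, 0.90, 0.70, 0.85, 0.80, 0.90, 0.75]

# Welch's t-test (unequal variances), analogous to the clinician vs.
# non-clinician comparisons of correctness reported above.
t_stat, p_value = ttest_ind(non_clinicians, clinicians, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")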

