THE FILE-DRAWER PROBLEM REVISITED: A GENERAL WEIGHTED METHOD FOR CALCULATING FAIL-SAFE NUMBERS IN META-ANALYSIS

Evolution ◽  
2005 ◽  
Vol 59 (2) ◽  
pp. 464-468 ◽  
Author(s):  
Michael S. Rosenberg
2017 ◽  
Author(s):  
Freya Acar ◽  
Ruth Seurinck ◽  
Simon B. Eickhoff ◽  
Beatrijs Moerkerke

Abstract
The importance of integrating research findings is incontrovertible, and coordinate-based meta-analyses have become a popular approach for combining the results of fMRI studies when only peaks of activation are reported. Like classical meta-analyses, coordinate-based meta-analyses may be subject to different forms of publication bias, which distort results and can invalidate findings. We develop a tool that assesses robustness to potential publication bias at the cluster level. We investigate the possible influence of the file-drawer effect, in which studies that do not report certain results fail to get published, by determining the number of noise studies that can be added to an existing fMRI meta-analysis before the results are no longer statistically significant. We illustrate this tool through an example and test the effect of several parameters through extensive simulations. We provide an algorithm, with freely available code, that generates noise studies and enables users to determine the robustness of meta-analytical results.
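The noise-study approach described above generalises the classical fail-safe N to imaging data. For orientation, a minimal sketch of the classical Rosenthal-style computation, which counts how many zero-effect studies would push a Stouffer combined z below the significance threshold; the z-scores are illustrative, not taken from any study listed here:

```python
import math

def rosenthal_fail_safe_n(z_scores, z_crit=1.6449):
    """Fail-safe N: how many zero-effect (z = 0) studies must be added
    before the Stouffer combined z, sum(z) / sqrt(k + n), drops below
    z_crit (default: one-tailed alpha = 0.05)."""
    k = len(z_scores)
    # Solve sum(z) / sqrt(k + n) = z_crit for n.
    n = (sum(z_scores) / z_crit) ** 2 - k
    return max(0, math.floor(n))

# Three moderately significant studies tolerate 17 hidden null studies.
print(rosenthal_fail_safe_n([2.0, 2.5, 3.0]))
```

A large fail-safe N relative to the number of observed studies is conventionally read as robustness to the file drawer; the cluster-level tool above asks the analogous question for spatial activation results.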


2012 ◽  
Vol 65 (2) ◽  
pp. 221-249 ◽  
Author(s):  
DAN R. DALTON ◽  
HERMAN AGUINIS ◽  
CATHERINE M. DALTON ◽  
FRANK A. BOSCO ◽  
CHARLES A. PIERCE

1997 ◽  
Vol 85 (2) ◽  
pp. 719-722 ◽  
Author(s):  
M. T. Bradley ◽  
R. D. Gupta

Although meta-analysis appears to be a useful technique for verifying the existence of an effect and summarizing large bodies of literature, there are problems associated with its use and interpretation. Among these difficulties is the "file-drawer problem": a certain percentage of studies are assumed to be unpublished or otherwise unavailable for inclusion in any given meta-analysis. We present a cautionary table quantifying the magnitude of this problem. The table shows that distortions exaggerating the effect size are substantial and that the exaggeration is strongest when the true effect size approaches zero. A meta-analysis could therefore be very misleading if the true effect size were close to zero.
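The pattern in that table can be reproduced with a short Monte Carlo sketch, assuming a crude one-sided significance filter as the publication mechanism; all parameter values below are illustrative, not from the paper:

```python
import random
import statistics

def published_mean_effect(true_d, n_per_group=20, n_studies=2000,
                          z_crit=2.0, seed=1):
    """Simulate standardized effect estimates and keep only those that
    clear a one-sided significance filter, i.e. the studies that
    escape the file drawer; return their mean."""
    rng = random.Random(seed)
    se = (2.0 / n_per_group) ** 0.5  # rough SE of Cohen's d, equal groups
    published = [d
                 for d in (rng.gauss(true_d, se) for _ in range(n_studies))
                 if d / se > z_crit]  # only "significant" studies publish
    return statistics.mean(published)

# The inflation (published mean minus true effect) is largest near zero.
print(published_mean_effect(0.0), published_mean_effect(0.5))
```

With a true effect of zero, every published estimate lies in the upper tail, so the published mean is far from zero; with a sizeable true effect, most studies pass the filter and the bias shrinks, matching the cautionary table's message.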


2021 ◽  
Author(s):  
Joshua L. Howard

This article introduces the concept of an open-science meta-analysis that functions through crowdsourced imputation of data and is thereby perpetually updated. It is proposed as a supplement to the current journal-article-based system of knowledge storage and synthesis and will (a) increase the consumptive capabilities of researchers (i.e., the amount of research one is exposed to), (b) minimize cognitive biases that influence scientific knowledge, (c) reduce the file-drawer problem, and (d) create new knowledge through mass synthesis of existing research. The proposed infrastructure, much like the recent norm of publicly available data, may come to be viewed as an industry standard in the near future.


2017 ◽  
Author(s):  
Pantelis Samartsidis ◽  
Silvia Montagna ◽  
Angela R. Laird ◽  
Peter T. Fox ◽  
Timothy D. Johnson ◽  
...  

Abstract
Coordinate-based meta-analyses (CBMA) allow researchers to combine the results of multiple fMRI experiments with the goal of obtaining results that are more likely to generalise. However, the interpretation of CBMA findings can be impaired by the file-drawer problem, a form of publication bias referring to experiments that are carried out but never published. Using foci-per-contrast count data from the BrainMap database, we propose a zero-truncated modelling approach that allows us to estimate the prevalence of non-significant experiments. We validate our method with simulations and with real coordinate data generated from the Human Connectome Project. Application of our method to the BrainMap data provides evidence for the existence of a file-drawer effect, with the rate of missing experiments estimated at no fewer than 6 per 100 reported.
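A zero-truncated count model of the kind described can be sketched with a zero-truncated Poisson, whose maximum-likelihood estimate reduces to matching the truncated mean; the counts below are illustrative, not BrainMap data:

```python
import math

def zt_poisson_lambda(counts, iters=100):
    """MLE of lambda for a zero-truncated Poisson. The likelihood
    equation is mean(counts) = lambda / (1 - exp(-lambda)); solve it
    by fixed-point iteration lambda <- mean * (1 - exp(-lambda))."""
    m = sum(counts) / len(counts)
    lam = m  # start at the naive mean; iteration is a contraction
    for _ in range(iters):
        lam = m * (1 - math.exp(-lam))
    return lam

def missing_rate(counts):
    """Estimated number of unobserved zero-count experiments per
    observed experiment: P(0) / (1 - P(0)) under the fitted Poisson."""
    lam = zt_poisson_lambda(counts)
    p0 = math.exp(-lam)
    return p0 / (1 - p0)
```

Only experiments with at least one reported focus are ever observed, so the model is fit to the positive counts and the fitted `P(0)` gives the estimated share of file-drawer experiments, expressed here per reported experiment to match the "per 100 reported" phrasing.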


2011 ◽  
Vol 2011 (1) ◽  
pp. 1-6 ◽  
Author(s):  
Dan R. Dalton ◽  
Herman Aguinis ◽  
Catherine M. Dalton ◽  
Frank A. Bosco ◽  
Charles A. Pierce

2021 ◽  
pp. 097226292198987
Author(s):  
Sakshi Vashisht ◽  
Poonam Kaushal ◽  
Ravi Vashisht

This study conducted a systematic review and meta-analysis examining the relationships between emotional intelligence, personality variables (Big Five personality traits, self-esteem, self-efficacy, optimism and proactive personality) and the career adaptability of students. Data were coded in CMA software, version 3.0. The product-moment correlation coefficient (r) served as the effect-size measure. Publication bias was assessed using Egger's regression test along with Orwin's fail-safe N, and no significant publication bias was detected. Across the 54 included studies, all study variables showed meta-analytic correlations with students' career adaptability. To address heterogeneity, subgroup analyses were conducted, and significant differences were found.
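Orwin's fail-safe N mentioned above has a closed form: the number of zero-effect file-drawer studies needed to drag the mean effect down to a chosen criterion value. A minimal sketch with illustrative numbers, not the study's actual values:

```python
import math

def orwin_fail_safe_n(mean_effect, k, criterion):
    """Orwin's fail-safe N: solving k * mean / (k + n) = criterion for
    n gives the count of zero-effect studies that would reduce the
    mean effect across k studies to the criterion level."""
    n = k * (mean_effect - criterion) / criterion
    return math.ceil(n - 1e-9)  # tolerance guards float noise at integers

# 54 studies with mean r = 0.4 and a trivial-effect criterion of 0.1:
print(orwin_fail_safe_n(0.4, 54, 0.1))
```

A fail-safe N far exceeding the number of observed studies, as reported via Orwin's test here, is the usual basis for concluding that the file drawer is unlikely to overturn the result.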

