REVISITING THE FILE DRAWER PROBLEM IN META-ANALYSIS: AN ASSESSMENT OF PUBLISHED AND NONPUBLISHED CORRELATION MATRICES

2012 ◽  
Vol 65 (2) ◽  
pp. 221-249 ◽  
Author(s):  
DAN R. DALTON ◽  
HERMAN AGUINIS ◽  
CATHERINE M. DALTON ◽  
FRANK A. BOSCO ◽  
CHARLES A. PIERCE


1997 ◽  
Vol 85 (2) ◽  
pp. 719-722 ◽  
Author(s):  
M. T. Bradley ◽  
R. D. Gupta

Although meta-analysis is a useful technique for verifying the existence of an effect and for summarizing large bodies of literature, its use and interpretation present problems. Among these is the “file drawer problem”: the assumption that some percentage of studies is never published, or is otherwise unavailable for inclusion in any given meta-analysis. We present a cautionary table that quantifies the magnitude of this problem. The table shows that the resulting distortions substantially exaggerate effect sizes, and that the exaggeration is strongest when the true effect size approaches zero. A meta-analysis could therefore be very misleading when the true effect size is close to zero.
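The distortion described here can be illustrated with a small simulation (a hypothetical sketch, not the authors' table): studies estimating a true correlation are "published" only when they reach significance in the predicted direction, and the mean of the published estimates is compared with the truth. The normal approximation to the sampling error of r and the one-tailed cutoff are simplifying assumptions.

```python
import random
import statistics

def published_mean(true_r, n_per_study=30, n_studies=5000, seed=0):
    """Simulate the file drawer: each study observes the true correlation
    plus normal sampling error (SD roughly 1/sqrt(n - 3)), but only results
    significant in the predicted direction (one-tailed p < .05) are
    'published'. Returns the mean of the published estimates."""
    rng = random.Random(seed)
    se = 1 / (n_per_study - 3) ** 0.5
    cutoff = 1.645 * se  # one-tailed .05 critical value for r
    published = [r for r in (rng.gauss(true_r, se) for _ in range(n_studies))
                 if r > cutoff]
    return statistics.mean(published)
```

With n = 30 per study, the published mean hovers around .4 even when the true correlation is exactly zero, whereas for a true correlation of .5 the bias is comparatively small, matching the pattern the cautionary table describes.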


2021 ◽  
Author(s):  
Joshua L. Howard

This article introduces the concept of an open science meta-analysis that functions through crowdsourced imputation of data and is thereby perpetually updated. It is proposed as a supplement to the current journal article-based system of knowledge storage and synthesis and would (a) increase researchers' consumptive capacity (i.e., the amount of research one is exposed to), (b) minimize the cognitive biases that shape scientific knowledge, (c) reduce the file drawer problem, and (d) create new knowledge through mass synthesis of existing research. The proposed infrastructure, much like the now-common practice of making data publicly available, may come to be viewed as an industry standard in the near future.


2017 ◽  
Author(s):  
Pantelis Samartsidis ◽  
Silvia Montagna ◽  
Angela R. Laird ◽  
Peter T. Fox ◽  
Timothy D. Johnson ◽  
...  

Coordinate-based meta-analyses (CBMA) allow researchers to combine the results of multiple fMRI experiments with the goal of obtaining results that are more likely to generalise. However, the interpretation of CBMA findings can be impaired by the file drawer problem, a type of publication bias referring to experiments that are carried out but never published. Using foci-per-contrast count data from the BrainMap database, we propose a zero-truncated modelling approach that allows us to estimate the prevalence of non-significant experiments. We validate our method with simulations and with real coordinate data generated from the Human Connectome Project. Applying the method to the BrainMap data provides evidence for the existence of a file drawer effect, with the rate of missing experiments estimated at no fewer than 6 per 100 reported.
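The logic of the zero-truncated approach can be sketched with the simplest possible count model, a single zero-truncated Poisson (the paper fits richer models to the BrainMap foci counts, so the names and details below are illustrative): fit the rate to the observed nonzero counts, then read off the implied probability of a zero count, i.e., of an experiment that was run but never reported.

```python
import math

def ztp_rate(sample_mean, tol=1e-12):
    """Solve m = lam / (1 - exp(-lam)) for lam by bisection. The left side
    is the mean of a zero-truncated Poisson, which always exceeds lam,
    so lam is bracketed by (0, sample_mean); sample_mean must be > 1."""
    lo, hi = 1e-12, sample_mean
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid / (1 - math.exp(-mid)) < sample_mean:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def missing_per_100(counts):
    """Estimated unreported (zero-count) experiments per 100 reported,
    assuming the observed counts are a zero-truncated Poisson sample."""
    lam = ztp_rate(sum(counts) / len(counts))
    p0 = math.exp(-lam)  # implied probability of an unobserved zero count
    return 100 * p0 / (1 - p0)
```

Because the truncated mean is an increasing function of the rate and always exceeds it, bisection on (0, sample_mean) is guaranteed to converge; the final ratio converts the implied zero probability into a "missing per 100 reported" figure comparable to the one quoted above.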


2011 ◽  
Vol 2011 (1) ◽  
pp. 1-6 ◽  
Author(s):  
Dan R. Dalton ◽  
Herman Aguinis ◽  
Catherine M. Dalton ◽  
Frank A. Bosco ◽  
Charles A. Pierce

2019 ◽  
Vol 40 (4) ◽  
pp. 416-430 ◽  
Author(s):  
Jessica S. Iwachiw ◽  
Amy Lynn Button ◽  
Jana Atlas

Researchers appear to assume that published research is limited to significant findings. If that is the case, it may be related to perceived or actual publication bias (i.e., journals publishing only significant findings) and/or the file-drawer problem (i.e., researchers not pursuing publication of null results). The lack of published null results can lead to faulty decision-making based on incomplete evidence. It is therefore important to know the prevalence of, and the factors contributing to, researchers' failure to submit null results. Few studies have addressed this issue in psychology, and none have targeted school psychology. Consequently, this study examined the file drawer problem and perceptions of publication bias among school psychologists. Survey data from 95 school psychology faculty indicated that participants published about half of the studies they had conducted, suggesting that the file drawer problem is present in this population. Although lack of time appeared to affect the pursuit of publication, participants' responses also suggested that they perceived publication bias. Obtaining null results substantially affected the decision to write up studies for publication. It therefore appears that a sizeable percentage of school psychology research is not available for review by researchers or practitioners.

