file drawer effect
Recently Published Documents

TOTAL DOCUMENTS: 11 (FIVE YEARS: 4)
H-INDEX: 2 (FIVE YEARS: 0)

2021
Author(s): Bastien Lemaire, Raphaël Lenoble, Mirko Zanon, Thibaud Jacquel, ...

Most of the scientific output produced by researchers is inaccessible because it is never published in scientific journals: it remains in researchers' drawers, forming what we call the Dark Science. This long-standing issue in research creates a misleading view of the scientific record. In contrast to the current literature, overfed with positive findings, the Dark Science is nurtured with null findings, replications, flawed experimental designs and other research outputs. Publishers, researchers, institutions and funders all play an important role in the accumulation of these unpublished works, but only once we understand the reasons behind, and the benefits of, publishing all scientific findings can we collectively act to solve the Dark Science problem. In this article, we discuss the causes and consequences of the Dark Science's expansion, arguing that science and scientists would benefit from bringing all of their findings into the light of publication.


2021
Author(s): Matt Tincani, Jason C. Travers

Questionable research practices (QRPs) are a variety of research choices that introduce bias into the body of scientific literature. Researchers have documented the widespread presence of QRPs across disciplines and promoted practices aimed at preventing them. More recently, Single-Case Experimental Design (SCED) researchers have explored how QRPs could manifest in SCED research. In this chapter, we describe QRPs in participant selection, independent variable selection, procedural fidelity documentation, graphical depictions of behavior, and effect size measures and statistics. We also discuss QRPs in relation to the file drawer effect, publication bias, and meta-analyses of SCED research. We provide recommendations for researchers and the research community to promote practices for preventing QRPs in SCED.


2021
Author(s): Eli Talbert

Using COVID Pulse Data collected by the U.S. Census Bureau, I establish that there are weak to no correlational relationships between a household reporting a child attending virtual or in-person school and various outcomes, including expectations of loss of employment, child hunger, and anxiety. Due to the coarseness of the data, it is unclear whether this is an artifact of the data or a reflection of a lack of underlying causal relationships between mode of schooling and the outcomes. Therefore, these results should not be used to make policy decisions or to draw substantive conclusions about the decision to reopen schools; they are reported only to avoid the file-drawer effect.


Author(s): Gary Smith, Jay Cordes

Researchers seeking fame and funding may be tempted to go on fishing expeditions (p-hacking) or to torture the data to find novel, provocative results that will be picked up by the popular media. Provocative findings are provocative because they are novel and unexpected, and they are often novel and unexpected because they are simply not true. The publication effect (or the file drawer effect) keeps the failures hidden and has helped create a replication crisis. Research that gets reported in the popular media is often wrong, which fools people and undermines the credibility of scientific research.


2017
Author(s): Pantelis Samartsidis, Silvia Montagna, Angela R. Laird, Peter T. Fox, Timothy D. Johnson, ...

Abstract: Coordinate-based meta-analyses (CBMA) allow researchers to combine the results from multiple fMRI experiments with the goal of obtaining results that are more likely to generalise. However, the interpretation of CBMA findings can be impaired by the file drawer problem, a type of publication bias that refers to experiments that are carried out but never published. Using foci-per-contrast count data from the BrainMap database, we propose a zero-truncated modelling approach that allows us to estimate the prevalence of non-significant experiments. We validate our method with simulations and real coordinate data generated from the Human Connectome Project. Application of our method to the BrainMap data provides evidence for the existence of a file drawer effect, with the rate of missing experiments estimated at no fewer than 6 per 100 reported.
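The zero-truncated idea behind this estimate can be illustrated with a minimal sketch. This is a hypothetical toy, not the authors' actual model (which is fit to BrainMap foci counts with study-level structure): fit a zero-truncated Poisson to the observed counts, then use the implied zero-class probability to estimate how many unreported (zero-count) experiments there may be per reported one.

```python
import math

def fit_ztp_lambda(counts, iters=100):
    """Fit lambda of a zero-truncated Poisson by solving
    mean = lambda / (1 - exp(-lambda)) with fixed-point iteration."""
    m = sum(counts) / len(counts)  # observed (truncated) mean
    lam = m                       # initial guess
    for _ in range(iters):
        lam = m * (1 - math.exp(-lam))
    return lam

def missing_rate(counts):
    """Estimated number of missing (zero-count) experiments
    per reported experiment, from the fitted zero probability."""
    lam = fit_ztp_lambda(counts)
    p0 = math.exp(-lam)           # probability of zero under Poisson(lam)
    return p0 / (1 - p0)
```

Under this toy model, a fitted missing rate of 0.06 would correspond to the paper's "at least 6 missing per 100 reported" reading; the real analysis is considerably richer.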


2017
Author(s): Freya Acar, Ruth Seurinck, Simon B. Eickhoff, Beatrijs Moerkerke

Abstract: The importance of integrating research findings is incontrovertible, and coordinate-based meta-analyses have become a popular approach to combining results of fMRI studies when only peaks of activation are reported. Like classical meta-analyses, coordinate-based meta-analyses may be subject to different forms of publication bias, which impacts results and may invalidate findings. We develop a tool that assesses robustness to potential publication bias at the cluster level. We investigate the possible influence of the file-drawer effect, where studies that do not report certain results fail to get published, by determining the number of noise studies that can be added to an existing fMRI meta-analysis before the results are no longer statistically significant. In this paper we illustrate this tool through an example and test the effect of several parameters through extensive simulations. We provide an algorithm, with freely available code, that generates noise studies and enables users to determine the robustness of meta-analytical results.
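The noise-study logic is in the spirit of the classic Rosenthal fail-safe N. A hypothetical sketch under simple Stouffer combination (not the authors' cluster-level fMRI algorithm, which generates coordinate-based noise studies) counts how many null studies can be absorbed before significance is lost:

```python
import math
from statistics import NormalDist

def failsafe_n(z_scores, alpha=0.05):
    """How many null (z = 0) studies can be added before the
    combined Stouffer z-score drops below the one-sided critical
    value -- a simple robustness measure against the file drawer."""
    z_crit = NormalDist().inv_cdf(1 - alpha)
    k, s = len(z_scores), sum(z_scores)
    n = 0
    # Combined z with n added nulls is s / sqrt(k + n).
    while s / math.sqrt(k + n) >= z_crit:
        n += 1
    return n
```

A larger fail-safe N means more hidden null studies would be needed to overturn the result, i.e. greater robustness to the file-drawer effect.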


2017
Author(s): Jeffrey Robert Spies

There currently exists a gap between scientific values and scientific practices. This gap is strongly tied to the current incentive structure, which rewards publication over accurate science. Other problems associated with this gap include reconstructing exploratory narratives as confirmatory, the file drawer effect, an overall lack of archiving and sharing, and a single contribution model (publication) through which credit is obtained. A solution to these problems is increased disclosure, transparency, and openness. The Open Science Framework (http://openscienceframework.org) is an infrastructure for managing the scientific workflow across the entirety of the scientific process, thus allowing the facilitation and incentivization of openness in a comprehensive manner. The current version of the OSF includes tools for documentation, collaboration, sharing, archiving, registration, and exploration.


2017
Vol 61 (6), pp. 516
Author(s): Priscilla Joys Nagarajan, Bharath Kumar Garla, M. Taranath, I. Nagarajan

PEDIATRICS
1996
Vol 97 (1), pp. 70-70

Statistics can tell us when published numbers truly point to the probability of a negative result, even though we, in our hopes, have mistakenly conferred a positive interpretation. But statistics cannot rescue us . . . when we publish positive results and consign our probable negativities to nonscrutiny in our file drawers.

