Preregistration specificity & adherence: A review of preregistered gambling studies & cross-disciplinary comparison

2021 ◽  
Author(s):  
Robert Heirene ◽  
Debi LaPlante ◽  
Eric R. Louderback ◽  
Brittany Keen ◽  
Marjan Bakker ◽  
...  

Study preregistration is one of several “open science” practices (e.g., open data, preprints) that researchers use to improve the transparency and rigour of their research. As more researchers adopt preregistration as a regular research practice, examining the nature and content of preregistrations can help identify strengths and weaknesses of current practices. The value of preregistration, in part, relates to the specificity of the study plan and the extent to which investigators adhere to this plan. We identified 53 preregistrations from the gambling studies field meeting our predefined eligibility criteria and scored their level of specificity using a 23-item protocol developed to measure the extent to which a clear and exhaustive preregistration plan restricts various researcher degrees of freedom (RDoF; i.e., the many methodological choices available to researchers when collecting and analysing data, and when reporting their findings). We also scored studies on a 32-item protocol that measured adherence to the preregistered plan in the study manuscript. We found that gambling preregistrations had low specificity levels on most RDoF. However, a comparison with a sample of cross-disciplinary preregistrations (N = 52; Bakker et al., 2020) indicated that gambling preregistrations scored higher on 12 (of 29) items. Thirteen (65%) of the 20 associated published articles or preprints deviated from the protocol without declaring as much (the mean number of undeclared deviations per article was 2.25, SD = 2.34). Overall, while we found improvements in specificity and adherence over time (2017-2020), our findings suggest the purported benefits of preregistration—including increasing transparency and reducing RDoF—are not fully achieved by current practices. Using our findings, we provide 10 practical recommendations that can be used to support and refine preregistration practices.

2019 ◽  
Vol 3 (Supplement 1) ◽  
pp. S399-S399
Author(s):  
Derek M Isaacowitz ◽  
Jonathan W King

Scientists from many disciplines have recently suggested changes in research practices, with the goal of ensuring greater scientific integrity. Some suggestions have focused on reducing researcher degrees of freedom to extract significant findings from exploratory analyses, whereas others concern how best to power studies and analyze results. Yet others involve ensuring that other interested researchers can easily access study materials, code, and data, to help with re-analysis and/or replication. These changes are moving targets, with discussions and suggested practices ongoing. However, aging researchers have not yet been major participants in these discussions, and aging journals are just starting to consider open science policies. This symposium, sponsored by the GSA Publications Committee, will highlight transparency and open science practices that seem most relevant to aging researchers, discuss potential challenges to implementing them as well as reasons for doing so, and will consider how aging journals may implement these practices. Open science practices to be considered include: preregistration, open data, open materials and code, sample size justification, and analytic tools for considering null effects. Presenters from a range of areas of aging research (lab, secondary data, qualitative) will show examples of open science practices in their work and will discuss concerns about, and challenges of, implementing them. Then, editorial team members will discuss the implications of these changes for aging journals. Finally, discussant Jon King will give NIA’s perspective on the importance of encouraging open science practices in the aging field.


2020 ◽  
Author(s):  
David Moreau ◽  
Beau Gamble

Psychology researchers are rapidly adopting open science practices, yet clear guidelines on how to apply these practices to meta-analysis remain lacking. In this tutorial, we describe why open science is important in the context of meta-analysis in psychology, and suggest how to adopt the three main components of open science: preregistration, open materials, and open data. We first describe how to make the preregistration as thorough as possible—and how to handle deviations from the plan. We then focus on creating easy-to-read materials (e.g., search syntax, R scripts) to facilitate reproducibility and bolster the impact of a meta-analysis. Finally, we suggest how to organize data (e.g., literature search results, data extracted from studies) that are easy to share, interpret, and update as new studies emerge. For each step of the meta-analysis, we provide example templates, accompanied by brief video tutorials, and show how to integrate these practices into the Open Science Framework (https://osf.io/q8stz/).


2020 ◽  
Vol 36 (3) ◽  
pp. 263-279
Author(s):  
Isabel Steinhardt

Openness in science and education is gaining importance within the digital knowledge society. So far, little attention has been paid to teaching Open Science in bachelor’s degree programmes or in qualitative methods courses. The aim of this article is therefore to use a seminar example to explore which Open Science practices can be taught in qualitative research and how digital tools can be involved. The seminar focused on the following practices: open data practices; the use of the free and open-source tool “Collaborative Online Interpretation”; and the practices of participating, cooperating, collaborating and contributing through participatory technologies and social networks. To learn Open Science practices, the students took part in a qualitative research project on the “use of digital technologies for the study and habitus of students”. The study shows that open data practices are easy to teach, whereas the use of free and open-source tools and participatory technologies for collaboration, participation, cooperation and contribution is more difficult. In addition, a cultural shift within German universities would be needed to promote Open Science practices in general.


2021 ◽  
Author(s):  
Tamara Kalandadze ◽  
Sara Ann Hart

The increasing adoption of open science practices in the last decade has been changing the scientific landscape across fields. However, developmental science has been relatively slow in adopting open science practices. To address this issue, we followed the format of Crüwell et al. (2019) and created summaries and an annotated list of informative and actionable resources discussing ten topics in developmental science: Open science; Reproducibility and replication; Open data, materials and code; Open access; Preregistration; Registered reports; Replication; Incentives; Collaborative developmental science. This article offers researchers and students in developmental science a starting point for understanding how open science intersects with developmental science. After reading this article, the developmental scientist should understand the core tenets of open and reproducible developmental science, and feel motivated to start applying open science practices in their workflow.


2019 ◽  
Author(s):  
Olivia J Kirtley ◽  
Ginette Lafit ◽  
Robin Achterhof ◽  
Anu Pauliina Hiekkaranta ◽  
Inez Myin-Germeys

A growing interest in understanding complex and dynamic psychological processes as they occur in everyday life has led to an increase in studies using Ambulatory Assessment techniques, including the Experience Sampling Method (ESM) and Ecological Momentary Assessment (EMA). There are, however, numerous “forking paths” and researcher degrees of freedom, even beyond those typically encountered with other research methodologies. Whilst a number of researchers working with ESM techniques are actively engaged in efforts to increase the methodological rigor and transparency of such research, currently, there is little routine implementation of open science practices in ESM research. In the current paper, we discuss the ways in which ESM research is especially vulnerable to threats to transparency, reproducibility and replicability. We propose that greater use of (pre-)registration, a cornerstone of open science, may address some of these threats to the transparency of ESM research. (Pre-)registration of ESM research is not without challenges, including model selection, accounting for potential model convergence issues and the use of pre-existing datasets. As these may prove to be significant barriers to (pre-)registration for ESM researchers, we also discuss ways of overcoming these challenges and of documenting them in a (pre-)registration. A further challenge is that current general templates do not adequately capture the unique features of ESM. Here we present a (pre-)registration template for ESM research, adapted from the original Pre-Registration Challenge (Mellor et al., 2019) and pre-registration of pre-existing data (van den Akker et al., 2020) templates, and provide examples of how to complete this.


2018 ◽  
Author(s):  
Sarah Jane Charles ◽  
James Edward Bartlett ◽  
Kyle J. Messick ◽  
Thomas Joseph Coleman ◽  
Alex Uzdavines

There is a push in psychology toward more transparent practices, stemming partially as a response to the replication crisis. We argue that the psychology of religion should help lead the way toward these new, more transparent practices to ensure a robust and dynamic subfield. One of the major issues that proponents of Open Science practices hope to address is researcher degrees of freedom (RDF). We pre-registered and conducted a systematic review of the 2017 issues from three psychology of religion journals. We aimed to identify the extent to which the psychology of religion has embraced Open Science practices and the role of RDF within the subfield. We found that many of the methodologies that help to increase transparency, such as pre-registration, have yet to be adopted by those in the subfield. In light of these findings, we present recommendations for addressing the issue of transparency in the psychology of religion and outline how to move toward these new Open Science practices.


2020 ◽  
Author(s):  
Denis Cousineau

Born-Open Data experiments are encouraged as part of better open science practices. To be adopted, Born-Open Data practices must be easy to implement. Herein, I introduce a package for E-Prime that automatically saves data files to a GitHub repository. The BornOpenData package for E-Prime works seamlessly, performing the upload as soon as the experiment finishes, so that no additional steps are needed beyond placing a package call within E-Prime. Because E-Prime files are not standard tab-separated files, I also provide an R function that retrieves the data directly from GitHub into a data frame ready to be analyzed. At this time, there are no standards as to what constitutes an adequate open-access data repository, so I propose a few suggestions that any future Born-Open Data system could follow for easier use by the research community.
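The abstract describes the born-open workflow only in outline. As a rough illustration of the underlying idea (data files pushed to a public GitHub repository at the end of each session, then pulled straight into an analysis-ready table), here is a minimal Python sketch. The repository name, file path, and helper functions are invented for illustration; this is not the actual BornOpenData/E-Prime API, whose companion retrieval function is written in R.

```python
# Minimal sketch of the born-open retrieval idea: build the raw-content URL
# for a data file in a public GitHub repository, and parse a tab-separated
# export into a list of row dicts. All names here are hypothetical.
import csv
import io


def raw_github_url(user, repo, path, branch="main"):
    """Build the raw-content URL for a file in a public GitHub repository."""
    return f"https://raw.githubusercontent.com/{user}/{repo}/{branch}/{path}"


def parse_tsv(text):
    """Parse tab-separated text (first row is the header) into row dicts."""
    return list(csv.DictReader(io.StringIO(text), delimiter="\t"))


# In practice the text would be fetched over the network, e.g. with
# urllib.request.urlopen(raw_github_url(...)).read().decode(); here we
# parse an inline sample instead of hitting the network.
sample = "Subject\tRT\n1\t523\n2\t481\n"
rows = parse_tsv(sample)
```

A real born-open pipeline would add authentication for the upload step and some convention for per-session file naming; the point of the sketch is only that retrieval reduces to a predictable URL plus a parser for the experiment software's export format.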


2022 ◽  
Author(s):  
Bermond Scoggins ◽  
Matthew Peter Robertson

The scientific method is predicated on transparency -- yet the pace at which transparent research practices are being adopted by the scientific community is slow. The replication crisis in psychology showed that published findings employing statistical inference are threatened by undetected errors, data manipulation, and data falsification. To mitigate these problems and bolster research credibility, open data and preregistration have increasingly been adopted in the natural and social sciences. While many political science and international relations journals have committed to implementing these reforms, the extent of open science practices is unknown. We bring large-scale text analysis and machine learning classifiers to bear on the question. Using population-level data -- 93,931 articles across the top 160 political science and IR journals between 2010 and 2021 -- we find that approximately 21% of all statistical inference papers have open data, and 5% of all experiments are preregistered. Despite this shortfall, the example of leading journals in the field shows that change is feasible and can be effected quickly.


2018 ◽  
Author(s):  
Gerit Pfuhl ◽  
Jon Grahe

Recent years have seen a revolution in publishing and broad support for open-access publishing. There has been slower acceptance of, and transition to, other open science principles such as open data, open materials, and preregistration. To accelerate the transition and make open science the new standard, the Collaborative Replications and Education Project (CREP; http://osf.io/wfc6u/) was launched in 2013, hosted on the Open Science Framework (osf.io), with partial data collected within each individual contributor's project. CREP introduces open science at the start of academic research, facilitating student research training in open science and solidifying behavioral science results. The CREP team pursues this by inviting contributors to replicate one of several studies selected for scientific impact and for suitability for undergraduates to complete during one academic term. Contributors follow clear protocols, with students interacting with a CREP team that reviews the materials and video of the procedure to ensure quality data collection while students learn scientific practices and methods. By combining multiple replications from undergraduates across the globe, the findings can be pooled for meta-analysis and so contribute to generalizable and replicable research findings. CREP is careful not to interpret any single result. CREP has recently joined forces with the Psychological Science Accelerator (PsySciAcc), a globally distributed network of psychological laboratories accelerating the accumulation of reliable and generalizable results in the behavioral sciences. The Department of Psychology at UiT is part of this network and has two ongoing CREP studies, maintaining open science practices from early on. In this talk, we will present our experiences of conducting transparent, replicable research, and our experience with preprints from a supervisor and researcher perspective.


2019 ◽  
Author(s):  
Nate Breznau

Reliability, transparency and ethical crises pushed psychology to reorganize as a discipline over the last decade. Political science also shows signs of reworking itself in response to these crises. Sociology sits on the sidelines. There have not been the same reliability or ethical scandals, at least not in the limelight, nor has there been strong disciplinary moves toward open science. This paper therefore investigates sociology as a discipline looking at current practices, definitions of sociology, positions of sociological associations and a brief consideration of the arguments of three highly influential sociologists: Weber, Merton and Habermas. Based on this disciplinary review, I suggest that sociology is no different from its neighboring disciplines in terms of reliability or ethical dilemmas. Therefore, sociology should adopt open science practices immediately. Weber, Merton and Habermas – three very different social thinkers epistemologically – offer strong arguments that favor what we know as “open science” today. Open science promotes ethics and reliability, reduces fraud and ultimately increases the value of sociology for policymakers and the public. The paper concludes with some basic steps individual researchers can take to move sociology toward open science.

