Preregistration and Registered Reports

2021 ◽  
Author(s):  
Justin Reich

Preregistration and registered reports are two promising open science practices for increasing transparency in the scientific process. In particular, they create transparency around one of the most consequential distinctions in research design: the analytic decisions made before data collection and the post-hoc decisions made afterwards. Preregistration involves publishing a time-stamped record of a study design before data collection or analysis. Registered reports are a publishing approach that facilitates the evaluation of research without regard for the direction or magnitude of findings. In this paper, I evaluate opportunities and challenges for these open science methods, offer initial guidelines for their use, explore relevant tensions around new practices, and illustrate examples from educational psychology and social science.

This paper was accepted for publication in Educational Psychologist, volume 56, issue 2, scheduled for April 2021, as part of a special issue titled “Educational psychology in the open science era.” This preprint has been peer reviewed but not copy edited by the journal and may differ from the final published version. The DOI of the final published version is: [insert preprint DOI number]. Once the article is published online, it will be available at the following permanent link: [insert doi link]

2020 ◽  
Author(s):  
Carly D Robinson

Pre-registration and registered reports are two of the most promising open science practices for increasing transparency in the scientific process. Pre-registration involves publishing a time-stamped record of a study design, ideally before data collection and analysis, so that research consumers can discern which analytic decisions were set a priori and which were changed after seeing the data. Registered reports take the idea of pre-registration one step further and provide peer review at the pre-registration stage. Researchers submit a Phase I manuscript that contains the introduction, background and context, and methods section of a study, and these Phase I manuscripts are peer reviewed. If reviewed positively, manuscripts are given in-principle acceptance, in which the editors agree that if the researchers conduct the study as pre-registered (or document the deviations from their plan), the study will be published without regard for the direction or magnitude of findings. In this manner, studies are judged by whether they address important questions and use well-designed methods, not on the basis of reaching specific benchmarks for significance or effect size. This article illustrates the emerging range of approaches to pre-registration and registered reports with examples from a variety of studies and from the first special issue in educational research devoted to Registered Reports.

PLEASE DO NOT CITE YET: This article is part of a forthcoming journal Special Issue on Open Science in Education and is currently under review. Carly Robinson is NOT the correct author, so please do not cite this article until it is updated with the correct authors' names. If you are interested in citing this work, please either (a) check back at this URL later -- we anticipate that the correct authors' names will be included no later than February 2021 -- or (b) contact Carly Robinson ([email protected]) directly to see whether the paper might be cited on an earlier time frame.


2021 ◽  
Author(s):  
Hunter Gehlbach ◽  
Carly D Robinson

Recently, scholars have noted how several “old school” practices (a host of well-regarded, long-standing scientific norms) can, in combination, compromise the credibility of research. In response, other scholarly fields have developed several “open science” norms and practices to address these credibility issues. Against this backdrop, this special issue explores the extent to which, and how, these norms should be adopted and adapted for educational psychology and education more broadly. Our introductory article contextualizes the special issue’s goals by overviewing the historical context that led to open science norms (particularly in medicine and psychology); providing a conceptual map to illustrate the interrelationships between various old school as well as open science practices; and describing educational psychologists’ opportunity to benefit from and contribute to the translation of these norms to novel research contexts. We conclude by previewing the articles in the special issue.


2020 ◽  
Author(s):  
Carly D Robinson

In education, scientific research should play an important role in improving learner outcomes by informing and enhancing policy and practice. However, a substantial gap exists between research and practice in the field. To close this gap, researchers in educational psychology can apply open-science practices to increase the credibility, impact, and equity of their research. In this article, we examine three open-science practices -- open data and code, open access and preprints, and crowdsourcing -- that are well suited to foster the credibility, impact, and equity of research. For each open-science practice, we briefly discuss what the practice is and how it works, its primary benefits, some important limitations and challenges, and a thorny issue related to the practice.

PLEASE DO NOT CITE YET: This article is part of a forthcoming journal Special Issue on Open Science in Education and is currently under review. Carly Robinson is NOT the correct author, so please do not cite this article until it is updated with the correct authors' names. If you are interested in citing this work, please either (a) check back at this URL later -- we anticipate that the correct authors' names will be included no later than February 2021 -- or (b) contact Carly Robinson ([email protected]) directly to see whether the paper might be cited on an earlier time frame.


2020 ◽  
Vol 44 (1-2) ◽  
pp. 1-2
Author(s):  
Karsten Boye Rasmussen

Welcome to volume 44 of the IASSIST Quarterly. Here in 2020 we start with a double issue on reproducibility (IQ 44(1-2)).

The start of 2020 was marked by Corona. Though we are now only in the middle of the year, we can say with confidence that 2020 will be known for the closing down of nearly all public life. In our own world, this included moving the IASSIST 2020 conference to 2021. The closing down of societies took different forms, and this will and should be long debated and investigated, because many civil rights in open society were put on instant standby by governments through various precautionary measures. Fortunately, many countries are now in the process of opening up. Hopefully, we are now more careful: keeping socially distant, practicing better sanitation, and so on. We also eagerly await scientific breakthroughs: the vaccine, the better treatment, the cure. But Corona science extends beyond health and biology. Social science in particular has an obligation to make us better prepared to take necessary measures and to uphold democracy.

Social science has always faced the reliability issue that you cannot step into the same river twice: survey data collected at one time will not yield the same results in a subsequent data collection, even with the same panel of respondents. Reproducibility has many more forms than exact data collection, though, and is foundational for open science and an open society. Science needs to be transparent in order to be challenged and improved. Fellow scientists as well as laymen should have the possibility of performing analyses to find out whether results can be reproduced. I am therefore very happy to send my thanks to Harrison Dekker and Amy Riegelman for taking the initiative to create this special issue of the IASSIST Quarterly on reproducibility. Harrison Dekker is a data librarian at the University of Rhode Island, and Amy Riegelman is a librarian in the social sciences at the University of Minnesota. Together, Amy and Harrison reviewed the papers submitted for their special issue and wrote the introduction in the following pages. In addition to expressing my great appreciation to them, I also want to thank all the authors who submitted papers for this issue. Thanks! Let's keep science open again!

Submissions of papers for the IASSIST Quarterly are always very welcome. We welcome input from IASSIST conferences or other conferences and workshops, from local presentations, or papers especially written for the IQ. When you are preparing such a presentation, give a thought to turning your one-time presentation into a lasting contribution. Doing that after the event also gives you the opportunity to improve your work based on feedback. We encourage you to log in or create an author login at https://www.iassistquarterly.com (our Open Journal Systems application). We permit authors 'deep links' into the IQ as well as deposition of the paper in their local repository. Chairing a conference session with the purpose of aggregating and integrating papers for a special issue of the IQ is also much appreciated, as the information reaches many more people than the limited number of session participants and will be readily available on the IASSIST Quarterly website at https://www.iassistquarterly.com. Authors are very welcome to take a look at the instructions and layout: https://www.iassistquarterly.com/index.php/iassist/about/submissions. Authors can also contact me directly via e-mail: [email protected]. Should you be interested in compiling a special issue for the IQ as guest editor(s), I will also be delighted to hear from you.

Karsten Boye Rasmussen - June 2020


2021 ◽  
Author(s):  
Kathryn R. Wentzel

In this article, I comment on the potential benefits and limitations of open science reforms for improving the transparency and accountability of research and enhancing the credibility of research findings within communities of policy and practice. First, I discuss the role of replication and reproducibility of research in promoting better-quality studies, the identification of generalizable principles, and relevance for practitioners and policymakers. Second, I suggest that greater attention to theory might contribute to the impact of open science practices, and I discuss ways in which theory has implications for sampling, measurement, and research design. Ambiguities concerning the aims of preregistration and registered reports are also highlighted. In conclusion, I discuss structural roadblocks to open science reform and reflect on the relevance of these reforms for educational psychology.


2020 ◽  
Vol 4 (1) ◽  
pp. 5-14
Author(s):  
Brian A. Eiler ◽  
Patrick C. Doyle ◽  
Rosemary L. Al-Kire ◽  
Heidi A. Wayment ◽  
...  

This article provides a case study of a student-focused research experience that introduced basic data science skills and their utility for psychological research, providing practical learning experiences for students interested in computational social science. Skills included programming; acquiring, visualizing, and managing data; performing specialized analyses; and building knowledge about open-science practices.
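As a rough illustration of the kinds of skills described above, a minimal Python sketch of such a workflow might look like the following (the data file, column names, and analysis are hypothetical and not drawn from the study itself):

import pandas as pd
import matplotlib.pyplot as plt

# Acquire and manage data: load a hypothetical survey file and drop incomplete rows.
survey = pd.read_csv("survey_responses.csv").dropna(subset=["condition", "wellbeing_score"])

# Perform a simple analysis: descriptive statistics by experimental condition.
summary = survey.groupby("condition")["wellbeing_score"].agg(["mean", "std", "count"])
print(summary)

# Visualize the data: distribution of scores within each condition.
survey.boxplot(column="wellbeing_score", by="condition")
plt.title("Wellbeing score by condition")
plt.suptitle("")  # remove the automatic group-by title added by pandas
plt.show()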


2021 ◽  
Vol 35 (3) ◽  
pp. 193-214
Author(s):  
Edward Miguel

A decade ago, the term “research transparency” was not on economists' radar screen, but in a few short years a scholarly movement has emerged to bring new open science practices, tools, and norms into the mainstream of our discipline. The goal of this article is to lay out the evidence on the adoption of these approaches in three specific areas (open data, pre-registration and pre-analysis plans, and journal policies) and, more tentatively, to begin to assess their impacts on the quality and credibility of economics research. The evidence to date indicates that economics (and related quantitative social science fields) is in a period of rapid transition toward new transparency-enhancing norms. While solid data on the benefits of these practices in economics is still limited, in part due to their relatively recent adoption, there is growing reason to believe that critics' worst fears regarding onerous adoption costs have not been realized. Finally, the article presents a set of frontier questions and potential innovations.


2019 ◽  
Vol 45 ◽  
Author(s):  
Llewellyn E. Van Zyl

Orientation: The purpose of this editorial was to provide an introduction to and a general overview of the special issue on Open Science Practices: A Vision for the Future of SAJIP, as hosted in the 45th volume of the South African Journal of Industrial Psychology (SAJIP). Specifically, the aim was to provide a viable, practical, and implementable strategy for enhancing the scientific credibility, transparency, and international stature of SAJIP.


2020 ◽  
Author(s):  
Carly D Robinson

Extensive debate and criticism of potentially common, yet questionable research practices that lead to biased findings within the social and health sciences have emerged over the last decade. These challenges likely apply to educational psychology, though the field has been slow to address them. This article discusses current research norms, strategic solutions proposed under the broad rubric of “Open Science”, and the implications of both for the way research syntheses in educational psychology are conducted and the quality of the information they produce. Strategies such as preregistration, open materials and data, and registered reports stand to address significant threats to the validity of research synthesis. These include challenges associated with publication, dissemination, and selective reporting biases; comprehensive information retrieval; and opportunities to execute unique analytic approaches. A final issue is the development of parallel solutions that address biases specific to the decision making of researchers conducting and evaluating research syntheses.

PLEASE DO NOT CITE YET: This article is part of a forthcoming journal Special Issue on Open Science in Education and is currently under review. Carly Robinson is NOT the correct author, so please do not cite this article until it is updated with the correct authors' names. If you are interested in citing this work, please either (a) check back at this URL later -- we anticipate that the correct authors' names will be included no later than February 2021 -- or (b) contact Carly Robinson ([email protected]) directly to see whether the paper might be cited on an earlier time frame.

