Conducting a Meta-Analysis in the Age of Open Science: Tools, Tips, and Practical Recommendations

2020 ◽  
Author(s):  
David Moreau ◽  
Beau Gamble

Psychology researchers are rapidly adopting open science practices, yet clear guidelines on how to apply these practices to meta-analysis remain lacking. In this tutorial, we describe why open science is important in the context of meta-analysis in psychology, and suggest how to adopt the three main components of open science: preregistration, open materials, and open data. We first describe how to make the preregistration as thorough as possible—and how to handle deviations from the plan. We then focus on creating easy-to-read materials (e.g., search syntax, R scripts) to facilitate reproducibility and bolster the impact of a meta-analysis. Finally, we suggest how to organize data (e.g., literature search results, data extracted from studies) that are easy to share, interpret, and update as new studies emerge. For each step of the meta-analysis, we provide example templates, accompanied by brief video tutorials, and show how to integrate these practices into the Open Science Framework (https://osf.io/q8stz/).
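
To make this concrete, here is a minimal sketch of the kind of shareable analysis script the tutorial recommends archiving as open material, using the metafor package in R; the file name and column labels are hypothetical, not taken from the tutorial's templates.

```r
# Minimal reproducible meta-analysis script of the kind recommended for
# sharing as open material. The file name and column labels are
# hypothetical, not taken from the tutorial's templates.
library(metafor)

dat <- read.csv("extracted_study_data.csv")  # one row per included study

# Standardized mean differences (Hedges' g) and their sampling variances.
dat <- escalc(measure = "SMD",
              m1i = m_treat, sd1i = sd_treat, n1i = n_treat,
              m2i = m_ctrl,  sd2i = sd_ctrl,  n2i = n_ctrl,
              data = dat)

res <- rma(yi, vi, data = dat)  # random-effects model
summary(res)
forest(res)  # forest plot, easy to regenerate as new studies emerge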

2018 ◽  
Author(s):  
Gerit Pfuhl ◽  
Jon Grahe

Recent years have seen a revolution in publishing and broad support for open access publishing. There has been a slower acceptance of, and transition to, other open science principles such as open data, open materials, and preregistration. To accelerate the transition and make open science the new standard, the Collaborative Replications and Education Project (CREP; http://osf.io/wfc6u/) was launched in 2013, hosted on the Open Science Framework (osf.io), where each individual contributor's project functions like a preprint, collecting partial data as it accrues. CREP introduces open science at the start of academic research, facilitating student research training in open science and solidifying behavioral science results. The CREP team attempts to achieve this by inviting contributors to replicate one of several studies selected for scientific impact and for suitability for undergraduates to complete during one academic term. Contributors follow clear protocols, and students interact with a CREP team that reviews the materials and a video of the procedure to ensure quality data collection while students learn open science practices and methods. By combining multiple replications from undergraduates across the globe, the findings can be pooled in a meta-analysis and so contribute to generalizable and replicable research findings; CREP is careful not to interpret any single result. CREP has recently joined forces with the Psychological Science Accelerator (PsySciAcc), a globally distributed network of psychological laboratories accelerating the accumulation of reliable and generalizable results in the behavioral sciences. The Department of Psychology at UiT is part of this network and has two ongoing CREP studies, embedding open science practices from the outset. In this talk, we will present our experiences of conducting transparent, replicable research, along with our experience with preprints from a supervisor and researcher perspective.


2021 ◽  
Author(s):  
Robert Heirene ◽  
Debi LaPlante ◽  
Eric R. Louderback ◽  
Brittany Keen ◽  
Marjan Bakker ◽  
...  

Study preregistration is one of several “open science” practices (e.g., open data, preprints) that researchers use to improve the transparency and rigour of their research. As more researchers adopt preregistration as a regular research practice, examining the nature and content of preregistrations can help identify strengths and weaknesses of current practices. The value of preregistration, in part, relates to the specificity of the study plan and the extent to which investigators adhere to this plan. We identified 53 preregistrations from the gambling studies field meeting our predefined eligibility criteria and scored their level of specificity using a 23-item protocol developed to measure the extent to which a clear and exhaustive preregistration plan restricts various researcher degrees of freedom (RDoF; i.e., the many methodological choices available to researchers when collecting and analysing data, and when reporting their findings). We also scored studies on a 32-item protocol that measured adherence to the preregistered plan in the study manuscript. We found that gambling preregistrations had low specificity levels on most RDoF. However, a comparison with a sample of cross-disciplinary preregistrations (N = 52; Bakker et al., 2020) indicated that gambling preregistrations scored higher on 12 (of 29) items. Thirteen (65%) of the 20 associated published articles or preprints deviated from the protocol without declaring as much (the mean number of undeclared deviations per article was 2.25, SD = 2.34). Overall, while we found improvements in specificity and adherence over time (2017-2020), our findings suggest the purported benefits of preregistration—including increasing transparency and reducing RDoF—are not fully achieved by current practices. Using our findings, we provide 10 practical recommendations that can be used to support and refine preregistration practices.
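
To illustrate the scoring procedure, here is a minimal R sketch of how item-level specificity scores and deviation counts might be summarized; the file name, column conventions, and scoring range are hypothetical rather than the paper's actual 23- and 32-item protocols.

```r
# Sketch: summarizing item-level scores per preregistration. The file
# name, "item_" column convention, 0-2 scoring range, and deviation
# column are hypothetical; the paper's protocols differ in detail.
scores <- read.csv("specificity_scores.csv")  # one row per preregistration

item_cols <- grep("^item_", names(scores), value = TRUE)
scores$total_specificity <- rowSums(scores[, item_cols])

summary(scores$total_specificity)    # specificity across the sample
mean(scores$undeclared_deviations)   # cf. the reported mean of 2.25
sd(scores$undeclared_deviations)     # cf. the reported SD of 2.34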


PLoS ONE ◽  
2021 ◽  
Vol 16 (5) ◽  
pp. e0251268
Author(s):  
Russell T. Warne ◽  
Sam Golightly ◽  
Makai Black

Psychologists have investigated creativity for 70 years, and it is now seen as an important construct, both scientifically and for its practical value to society. However, several fundamental problems remain unresolved, including a suitable definition of creativity and the ability of psychometric tests to measure divergent thinking—an important component of creativity—in a way that aligns with theory. It is this latter point that this registered report is designed to address. We propose to administer two divergent thinking tests (the verbal and figural versions of the Torrance Tests of Creative Thinking; TTCT) together with an intelligence test (the International Cognitive Ability Resource test; ICAR). We will then subject the subscores from these tests to confirmatory factor analysis to test which of nine theoretically plausible models best fits the data. When this study is completed, we hope to better understand the degree to which the TTCT and ICAR measure distinct constructs. This study will be conducted in accordance with all open science practices, including pre-registration, open data and syntax, and open materials (with the exception of copyrighted and confidential test stimuli).
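
A minimal sketch of the proposed model comparison using the lavaan package in R; the subscore variable names and the data file are hypothetical placeholders, and only two of the nine candidate models are shown.

```r
# Sketch of the proposed confirmatory factor analyses (lavaan).
# Subscore names and the data file are hypothetical placeholders.
library(lavaan)

dat <- read.csv("subscores.csv")  # TTCT and ICAR subscores per participant

# Candidate model 1: a single factor underlying all subscores.
m_one <- 'g =~ ttct_fluency + ttct_originality + ttct_figural +
               icar_verbal + icar_matrix'

# Candidate model 2: distinct but correlated creativity and
# intelligence factors (cfa() correlates latent factors by default).
m_two <- '
  creativity   =~ ttct_fluency + ttct_originality + ttct_figural
  intelligence =~ icar_verbal + icar_matrix
'

fit_one <- cfa(m_one, data = dat)
fit_two <- cfa(m_two, data = dat)

anova(fit_one, fit_two)                                 # nested-model comparison
fitMeasures(fit_two, c("cfi", "tli", "rmsea", "srmr"))  # absolute fit indices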


2021 ◽  
Vol 5 (Supplement_1) ◽  
pp. 199-200
Author(s):  
Derek Isaacowitz

Abstract Some GSA journals are especially interested in promoting transparency and open science practices, reflecting how some subdisciplines in aging are moving toward open science practices faster than others. In this talk, I will consider the transparency and open science practices that seem most relevant to aging researchers, such as preregistration, open data, open materials and code, sample size justification and analytic tools for considering null effects. I will also discuss potential challenges to implementing these practices as well as reasons why it is important to do so despite these challenges. The focus will be on pragmatic suggestions for researchers planning and conducting studies now that they hope to publish later.
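
As one concrete example of an analytic tool for considering null effects of the kind mentioned in the talk, here is a hedged sketch using the BayesFactor package in R; the group data are simulated placeholders.

```r
# Sketch: quantifying evidence for a null group difference with a Bayes
# factor (BayesFactor package); the two groups are simulated placeholders.
library(BayesFactor)

set.seed(1)
scores_younger <- rnorm(40, mean = 0.0)  # hypothetical group data
scores_older   <- rnorm(40, mean = 0.1)

bf <- ttestBF(x = scores_younger, y = scores_older)  # independent-samples test
bf                     # BF10: evidence for a group difference
1 / extractBF(bf)$bf   # BF01: evidence favoring the null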


2018 ◽  
Vol 62 (4) ◽  
pp. 374-388 ◽  
Author(s):  
Matthew T. McBee ◽  
Matthew C. Makel ◽  
Scott J. Peters ◽  
Michael S. Matthews

Current practices in study design and data analysis have led to low reproducibility and replicability of findings in fields such as psychology, medicine, biology, and economics. Because gifted education research relies on the same underlying statistical and sociological paradigms, it is likely that it too suffers from these problems. This article discusses the origins of this poor replicability and introduces a set of open science practices that can increase the rigor and trustworthiness of gifted education's scientific findings: preregistration, open data and open materials, registered reports, and preprints. Readers are directed to Internet resources for facilitating open science. To model these practices, a pre-peer-review preprint of this article is available at https://psyarxiv.com/nhuv3/.


2021 ◽  
Author(s):  
Debi LaPlante ◽  
Eric R. Louderback ◽  
Brett Abarbanel

Scientists across disciplines have begun to implement “open science” principles and practices, which are designed to enhance the quality, transparency, and replicability of scientific research. Yet, studies examining the use of open science practices in social science fields such as psychology and economics show that awareness and use of such practices often is low. In gambling studies research, no studies to date have empirically investigated knowledge of and use of open science practices. In the present study, we collected survey data about awareness and use of open science practices from 86 gambling studies research stakeholders who had attended a major international gambling studies conference in May 2019. We found that—as hypothesized—a minority of gambling research stakeholders reported: 1) either some or extensive experience using open science research practices in general, and 2) either some or regular experience using specific open science practices, including study pre-registration, open materials/code, open data, and pre-print archiving. Most respondents indicated that replication was important for all studies in gambling research, and that genetic, neuroscience, and lab-based game characteristic studies were areas most in need of replication. Our results have important implications for open science education initiatives and for contemporary research methodology in gambling studies.


2017 ◽  
Author(s):  
Matthew McBee ◽  
Matthew Makel ◽  
Scott J. Peters ◽  
Michael S. Matthews

The ruinous consequences of currently accepted practices in study design and data analysis have revealed themselves in the low reproducibility of findings in fields such as psychology, medicine, biology, and economics. Because giftedness research relies on the same underlying statistical and sociological paradigms, it is likely that our field also suffers from poor reproducibility and an unreliable literature. This paper describes open science practices that will increase the rigor and trustworthiness of gifted education's scientific processes and their associated findings: open data; open materials; and preregistration of hypotheses, design, sample size determination, and statistical analysis plans. Readers are directed to Internet resources for facilitating open science.


2020 ◽  
Vol 36 (3) ◽  
pp. 263-279
Author(s):  
Isabel Steinhardt

Openness in science and education is increasing in importance within the digital knowledge society. So far, less attention has been paid to teaching Open Science in bachelor's degree programs or in qualitative methods courses. The aim of this article is therefore to use a seminar example to explore which Open Science practices can be taught in qualitative research and how digital tools can be involved. The seminar focused on the following practices: Open Data practices; the use of the free and open-source tool "Collaborative Online Interpretation"; and the practice of participating, cooperating, collaborating, and contributing through participatory technologies and social networks. To learn Open Science practices, the students were involved in a qualitative research project on the "use of digital technologies for the study and habitus of students". The study shows that Open Data practices are easy to teach, whereas the use of free and open-source tools and of participatory technologies for collaboration, participation, cooperation, and contribution is more difficult. In addition, a cultural shift would have to take place within German universities to promote Open Science practices in general.


2021 ◽  
Author(s):  
Kathryn R. Wentzel

In this article, I comment on the potential benefits and limitations of open science reforms for improving the transparency and accountability of research, and enhancing the credibility of research findings within communities of policy and practice. Specifically, I discuss the role of replication and reproducibility of research in promoting better quality studies, the identification of generalizable principles, and relevance for practitioners and policymakers. Second, I suggest that greater attention to theory might contribute to the impact of open science practices, and discuss ways in which theory has implications for sampling, measurement and research design. Ambiguities concerning the aims of preregistration and registered reports also are highlighted. In conclusion, I discuss structural roadblocks to open science reform and reflect on the relevance of these reforms for educational psychology.

