Accelerating open science: The collaborative replications and education project (CREP)

2018 ◽  
Author(s):  
Gerit Pfuhl ◽  
Jon Grahe

Recent years have seen a revolution in publishing and broad support for open access publishing. Acceptance of, and transition to, other open science principles such as open data, open materials, and preregistration has been slower. To accelerate this transition and make open science the new standard, the Collaborative Replications and Education Project (CREP; http://osf.io/wfc6u/) was launched in 2013, hosted on the Open Science Framework (osf.io). The OSF works much like a preprint server, hosting each individual contributor's project together with its partial data. CREP introduces open science at the start of academic research, facilitating student research training in open science and solidifying behavioral science results. The CREP team attempts to achieve this by inviting contributors to replicate one of several studies selected for scientific impact and for suitability for undergraduates to complete during one academic term. Contributors follow clear protocols, and students interact with a CREP team that reviews the materials and a video of the procedure to ensure quality data collection while the students learn scientific practices and methods. By combining multiple replications from undergraduates across the globe, the findings can be pooled in a meta-analysis and so contribute to generalizable and replicable research findings. CREP is careful not to interpret any single result. CREP has recently joined forces with the Psychological Science Accelerator (PsySciAcc), a globally distributed network of psychological laboratories accelerating the accumulation of reliable and generalizable results in the behavioral sciences. The Department of Psychology at UiT is part of this network and has two ongoing CREP studies, embedding open science practices early on. In this talk, we present our experiences of conducting transparent, replicable research and of working with preprints, from a supervisor's and a researcher's perspective.

2021 ◽  
Author(s):  
Christian Nawroth ◽  
E. Tobias Krause

A significant proportion of research is directly or indirectly supported by public funding. It is therefore imperative to make scientific findings freely accessible and the way this knowledge is generated as transparent as possible. However, the traditional scientific publication system involves a number of obstacles that hinder free access to scientific knowledge and transparency by delaying and restricting the dissemination of protocols and results: pay-walled journals and articles, long reviewing times, publication bias towards novel positive findings, and personal interests. While many tools are available to improve the transparency and accessibility of the scientific process and the resulting research findings, the most powerful of these is likely the implementation of Open Science practices. Open Science covers various aspects of the scholarly process, ranging from Open Access publishing of research articles, to providing Open Data and Protocols, to Open Science Evaluation (open peer review) and Open Science Tools such as Open Source software, with the primary goal of building on, reusing, and openly criticizing the published body of scientific knowledge. While in certain research fields, such as psychology or ecology, the application of these practices has been assessed and is growing rapidly, their current state and progress in other fields, such as animal science, have, to our knowledge, not been systematically assessed and implemented. While the general academic and societal benefits of Open Science may be apparent (and more or less generalizable across disciplines), we argue here that implementing Open Science practices will also benefit animal science through stronger adherence to the 3R principles: reducing the number of animals in research, refining protocols and methods, and replacing animal studies with animal-free alternatives.


2020 ◽  
Author(s):  
David Moreau ◽  
Beau Gamble

Psychology researchers are rapidly adopting open science practices, yet clear guidelines on how to apply these practices to meta-analysis remain lacking. In this tutorial, we describe why open science is important in the context of meta-analysis in psychology, and suggest how to adopt the three main components of open science: preregistration, open materials, and open data. We first describe how to make the preregistration as thorough as possible—and how to handle deviations from the plan. We then focus on creating easy-to-read materials (e.g., search syntax, R scripts) to facilitate reproducibility and bolster the impact of a meta-analysis. Finally, we suggest how to organize data (e.g., literature search results, data extracted from studies) that are easy to share, interpret, and update as new studies emerge. For each step of the meta-analysis, we provide example templates, accompanied by brief video tutorials, and show how to integrate these practices into the Open Science Framework (https://osf.io/q8stz/).
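As a rough illustration of the kind of shareable, updatable data organization described above, the sketch below (not taken from the authors' OSF templates; the study names, effect sizes, and variances are invented) stores one row per extracted study and pools the effects with a simple DerSimonian-Laird random-effects estimate in Python.

```python
# Minimal illustrative sketch (not from the authors' templates): a flat,
# one-row-per-study extraction table pooled with a DerSimonian-Laird
# random-effects estimate. Study names, effect sizes, and variances are invented.
import numpy as np
import pandas as pd

# Hypothetical extraction table; in practice this would be a shared CSV that
# can be extended as new studies emerge.
studies = pd.DataFrame({
    "study_id":    ["smith_2015", "lee_2017", "garcia_2019"],
    "effect_size": [0.31, 0.12, 0.22],    # e.g., standardized mean differences
    "variance":    [0.020, 0.015, 0.030]  # sampling variance of each effect
})

def random_effects_pool(es, var):
    """Return the DerSimonian-Laird pooled effect and its standard error."""
    w = 1.0 / var                              # fixed-effect weights
    fixed = np.sum(w * es) / np.sum(w)         # fixed-effect estimate
    q = np.sum(w * (es - fixed) ** 2)          # heterogeneity statistic Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(es) - 1)) / c)   # between-study variance
    w_re = 1.0 / (var + tau2)                  # random-effects weights
    pooled = np.sum(w_re * es) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se

pooled, se = random_effects_pool(studies["effect_size"].to_numpy(),
                                 studies["variance"].to_numpy())
print(f"Pooled effect {pooled:.2f}, "
      f"95% CI [{pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f}]")
```

Keeping the extracted data in this flat form makes it straightforward to append a row for each newly published study and rerun the pooling, which is the kind of updatable workflow the tutorial advocates.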


2020 ◽  
Vol 36 (3) ◽  
pp. 263-279
Author(s):  
Isabel Steinhardt

Openness in science and education is becoming increasingly important within the digital knowledge society. So far, less attention has been paid to teaching Open Science in bachelor's degree programmes or in qualitative methods courses. The aim of this article is therefore to use a seminar example to explore which Open Science practices can be taught in qualitative research and how digital tools can be involved. The seminar focused on the following practices: open data practices; the use of the free and open source tool "Collaborative Online Interpretation"; and participating, cooperating, collaborating, and contributing through participatory technologies and social networks. To learn Open Science practices, the students were involved in a qualitative research project on the use of digital technologies for the study and habitus of students. The study shows that Open Data practices are easy to teach, whereas the use of free and open source tools and participatory technologies for collaboration, participation, cooperation, and contribution is more difficult. In addition, a cultural shift would have to take place within German universities to promote Open Science practices in general.


2021 ◽  
Author(s):  
Tamara Kalandadze ◽  
Sara Ann Hart

The increasing adoption of open science practices in the last decade has been changing the scientific landscape across fields. However, developmental science has been relatively slow in adopting open science practices. To address this issue, we followed the format of Crüwell et al. (2019) and created summaries and an annotated list of informative and actionable resources discussing ten topics in developmental science: Open science; Reproducibility and replication; Open data, materials and code; Open access; Preregistration; Registered reports; Replication; Incentives; Collaborative developmental science. This article offers researchers and students in developmental science a starting point for understanding how open science intersects with developmental science. After reading this article, the developmental scientist should understand the core tenets of open and reproducible developmental science and feel motivated to start applying open science practices in their workflow.


2019 ◽  
Vol 14 (1) ◽  
pp. 180-193
Author(s):  
Anne Sunikka

This paper describes how the Finnish Ministry of Education and Culture launched an initiative on research data management and open data, open access publishing, and open and collaborative ways of working in 2014. Most of the universities and research institutions took part in the collaborative initiative, building new tools and training material for Finnish research needs. Measures taken by one university, Aalto University, are described in detail, analysed, and compared with the activities taking place at other universities. The focus of this paper is on the changing roles of experts at Aalto University and the organisational transformation that offers possibilities to serve academic personnel better. Various ways of building collaboration and arranging services are described, and their benefits and drawbacks are discussed.


2020 ◽  
Vol 51 (1) ◽  
pp. 26-28
Author(s):  
C. Rossel ◽  
L. van Dyck

The movement towards Open Science is well under way and irreversible. It includes Open Access publishing, Open Data, and Open Collaboration, with several new orientations, among them citizen science. Indeed, in the digital era, the way research is performed and the way its output is shared and published are changing significantly, as are the expectations of policy makers and society at large.


F1000Research ◽  
2021 ◽  
Vol 10 ◽  
pp. 292
Author(s):  
Michael Hewera ◽  
Daniel Hänggi ◽  
Björn Gerlach ◽  
Ulf Dietrich Kahlert

Reports of non-replicable research demand new methods of research data management. Electronic laboratory notebooks (ELNs) are suggested as tools to improve the documentation of research data and make it universally accessible. In a self-guided approach, we introduced the open-source ELN eLabFTW into our lab group and, after using it for some time, consider it a useful tool for overcoming the hurdles of ELN adoption, as it provides a combination of properties that make it suitable for small preclinical labs like ours. We set up our instance of eLabFTW without any further programming. Our effort to embrace an open data approach by introducing an ELN fits well with other institutionally organized ELN initiatives in academic research.


2020 ◽  
Author(s):  
Denis Cousineau

Born-Open Data experiments are encouraged as part of better open science practices. To be adopted, Born-Open Data practices must be easy to implement. Herein, I introduce a package for E-Prime such that the data files are automatically saved to a GitHub repository. The BornOpenData package for E-Prime works seamlessly and performs the upload as soon as the experiment is finished, so that there are no additional steps to perform beyond placing a package call within E-Prime. Because E-Prime files are not standard tab-separated files, I also provide an R function that retrieves the data directly from GitHub into a data frame ready to be analyzed. At this time, there are no standards as to what should constitute an adequate open-access data repository, so I propose a few suggestions that any future Born-Open Data system could follow for easier use by the research community.
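As a purely illustrative sketch of the retrieval step (the package itself ships an R function for this; the repository, file path, and tab-separated layout below are hypothetical), the general pattern is simply to read the raw-file URL of a public GitHub repository into a data frame. As the abstract notes, real E-Prime output is not a standard tab-separated file, which is why the dedicated reader exists.

```python
# Illustrative only: reading a raw data file from a public GitHub repository
# into a data frame. The repository, path, and tab-separated layout are
# hypothetical; the BornOpenData package provides its own R reader that
# handles E-Prime's non-standard file format.
import pandas as pd

RAW_URL = ("https://raw.githubusercontent.com/"
           "some-lab/experiment-data/main/participant_001.txt")  # hypothetical path

df = pd.read_csv(RAW_URL, sep="\t")  # pandas can read directly from a URL
print(df.head())
```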


2022 ◽  
Author(s):  
Bermond Scoggins ◽  
Matthew Peter Robertson

The scientific method is predicated on transparency -- yet the pace at which transparent research practices are being adopted by the scientific community is slow. The replication crisis in psychology showed that published findings employing statistical inference are threatened by undetected errors, data manipulation, and data falsification. To mitigate these problems and bolster research credibility, open data and preregistration have increasingly been adopted in the natural and social sciences. While many political science and international relations journals have committed to implementing these reforms, the extent of open science practices is unknown. We bring large-scale text analysis and machine learning classifiers to bear on the question. Using population-level data -- 93,931 articles across the top 160 political science and IR journals between 2010 and 2021 -- we find that approximately 21% of all statistical inference papers have open data, and 5% of all experiments are preregistered. Despite this shortfall, the example of leading journals in the field shows that change is feasible and can be effected quickly.
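As a heavily simplified sketch of how such machine learning classification can work (this is not the authors' pipeline; the training snippets and labels below are invented), a supervised text classifier can be trained on labeled passages and then used to flag articles containing open-data statements:

```python
# Simplified illustration of a supervised text classifier for flagging
# open-data statements. Not the authors' pipeline; training texts and labels
# are invented for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Replication data are available on the Harvard Dataverse.",
    "All data and code are posted in the journal's repository.",
    "We thank seminar participants for helpful comments.",
    "The theoretical model builds on earlier work in the field.",
]
labels = [1, 1, 0, 0]  # 1 = open-data statement present, 0 = absent

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

candidate = ["Data and replication scripts are deposited in a public archive."]
print(clf.predict(candidate))  # likely [1], given the overlap with the positive examples
```

At the scale reported in the article (tens of thousands of papers), a classifier of this kind would be trained on a much larger hand-labeled sample and validated against a held-out set before being applied to the full corpus.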


2019 ◽  
Vol 5 (1) ◽  
Author(s):  
Jordan R. Wagge ◽  
Cristina Baciu ◽  
Kasia Banas ◽  
Joel T. Nadler ◽  
Sascha Schwarz ◽  
...  

The present article reports the results of a meta-analysis of nine student replication projects of Elliot et al.'s (2010) finding from Experiment 3 that women were more attracted to photographs of men with red borders (total n = 640). The student projects were part of the Collaborative Replications and Education Project (CREP; https://osf.io/wfc6u/), a research crowdsourcing project for undergraduate students. All replications were reviewed by experts to ensure high-quality data and were preregistered prior to data collection. Results of this meta-analysis showed no effect of red on attractiveness ratings for either perceived attractiveness (mean ratings difference = –0.07, 95% CI [–0.31, 0.16]) or sexual attractiveness (mean ratings difference = –0.06, 95% CI [–0.36, 0.24]); this null result held with and without Elliot et al.'s (2010) data included in the analyses. Exploratory analyses examining whether being in a relationship moderated the effect of color on attractiveness ratings also produced null results.

