Learning Open Science by doing Open Science. A reflection of a qualitative research project-based seminar

2020 ◽  
Vol 36 (3) ◽  
pp. 263-279
Author(s):  
Isabel Steinhardt

Openness in science and education is gaining importance within the digital knowledge society. So far, little attention has been paid to teaching Open Science in bachelor's degree programmes or in qualitative methods courses. The aim of this article is therefore to use a seminar example to explore which Open Science practices can be taught in qualitative research and how digital tools can be involved. The seminar focused on the following practices: Open Data practices, the practice of using the free and open-source tool "Collaborative online Interpretation", and the practice of participating, cooperating, collaborating and contributing through participatory technologies and social networks. To learn Open Science practices, the students were involved in a qualitative research project on the "Use of digital technologies for the study and habitus of students". The study shows that Open Data practices are easy to teach, whereas the use of free and open-source tools and of participatory technologies for collaboration, participation, cooperation and contribution is more difficult. In addition, a cultural shift would have to take place within German universities to promote Open Science practices in general.


2021 ◽  
Author(s):  
Tamara Kalandadze ◽  
Sara Ann Hart

The increasing adoption of open science practices over the last decade has been changing the scientific landscape across fields. However, developmental science has been relatively slow in adopting open science practices. To address this issue, we followed the format of Crüwell et al. (2019) and created summaries and an annotated list of informative and actionable resources discussing the following topics in developmental science: open science; reproducibility and replication; open data, materials and code; open access; preregistration; registered reports; replication; incentives; and collaborative developmental science. This article offers researchers and students in developmental science a starting point for understanding how open science intersects with developmental science. After reading this article, developmental scientists should understand the core tenets of open and reproducible developmental science and feel motivated to start applying open science practices in their workflow.


eLife ◽  
2016 ◽  
Vol 5 ◽  
Author(s):  
Erin C McKiernan ◽  
Philip E Bourne ◽  
C Titus Brown ◽  
Stuart Buck ◽  
Amye Kenall ◽  
...  

Open access, open data, open source and other open scholarship practices are growing in popularity and necessity. However, widespread adoption of these practices has not yet been achieved. One reason is that researchers are uncertain about how sharing their work will affect their careers. We review literature demonstrating that open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities. These findings are evidence that open research practices bring significant benefits to researchers relative to more traditional closed practices.


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Hollen N. Reischer ◽  
Henry R. Cowan

A robust dialogue about the (un)reliability of psychological science findings has emerged in recent years. In response, metascience researchers have developed innovative tools to increase rigor, transparency, and reproducibility, stimulating rapid improvement and adoption of open science practices. However, existing reproducibility guidelines are geared toward purely quantitative study designs. This leaves some ambiguity as to how such guidelines should be implemented in mixed methods (MM) studies, which combine quantitative and qualitative research. Drawing on extant literature, our own experiences, and feedback from 79 self-identified MM researchers, the current paper addresses two main questions: (a) how and to what extent do existing reproducibility guidelines apply to MM study designs; and (b) can existing reproducibility guidelines be improved by incorporating best practices from qualitative research and epistemology? In answer, we offer 10 key recommendations for use within and outside of MM research. Finally, we argue that good science and good ethical practice are mutually reinforcing and lead to meaningful, credible science.


F1000Research ◽  
2021 ◽  
Vol 10 ◽  
pp. 292
Author(s):  
Michael Hewera ◽  
Daniel Hänggi ◽  
Björn Gerlach ◽  
Ulf Dietrich Kahlert

Reports of non-replicable research demand new methods of research data management. Electronic laboratory notebooks (ELNs) have been suggested as tools to improve the documentation of research data and make it universally accessible. In a self-guided approach, we introduced the open-source ELN eLabFTW into our lab group and, after using it for some time, consider it a useful tool for overcoming the hurdles of ELN adoption, as it combines properties that make it suitable for small preclinical labs like ours. We set up our instance of eLabFTW without any further programming. Our effort to embrace an open data approach by introducing an ELN fits well with other institutionally organized ELN initiatives in academic research.


2020 ◽  
Author(s):  
Denis Cousineau

Born-Open Data experiments are encouraged as a way to improve open science practices. To be adopted, Born-Open Data practices must be easy to implement. Herein, I introduce a package for E-Prime that saves data files automatically to a GitHub repository. The BornOpenData package for E-Prime works seamlessly and performs the upload as soon as the experiment is finished, so that there are no additional steps to perform beyond placing a package call within E-Prime. Because E-Prime files are not standard tab-separated files, I also provide an R function that retrieves the data directly from GitHub into a data frame ready to be analyzed. At this time, there are no standards as to what constitutes an adequate open-access data repository, so I propose a few suggestions that any future Born-Open Data system could follow for easier use by the research community.
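The retrieval step described above, an R function that pulls the data straight from GitHub into a data frame, can be sketched briefly. The original tooling is E-Prime plus R and handles E-Prime's non-standard file format, which is not reproduced here; the Python snippet below only illustrates the general idea of reading a tab-separated file from a raw GitHub URL into a data frame, and the repository path and file name are hypothetical placeholders.

```python
# Illustrative sketch only: the actual BornOpenData tooling targets E-Prime and R,
# and also handles E-Prime's non-standard file format. This Python analogue just
# shows the general retrieval idea for a plain tab-separated file hosted on GitHub.
import pandas as pd

# Hypothetical repository and file name, used only for illustration.
RAW_URL = (
    "https://raw.githubusercontent.com/"
    "example-lab/born-open-data/main/participant_001.txt"
)

def fetch_born_open_data(url: str = RAW_URL) -> pd.DataFrame:
    """Read a tab-separated data file hosted on GitHub into a DataFrame."""
    return pd.read_csv(url, sep="\t")

if __name__ == "__main__":
    df = fetch_born_open_data()
    print(df.head())
```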


2022 ◽  
Author(s):  
Bermond Scoggins ◽  
Matthew Peter Robertson

The scientific method is predicated on transparency -- yet the pace at which transparent research practices are being adopted by the scientific community is slow. The replication crisis in psychology showed that published findings employing statistical inference are threatened by undetected errors, data manipulation, and data falsification. To mitigate these problems and bolster research credibility, open data and preregistration have increasingly been adopted in the natural and social sciences. While many political science and international relations journals have committed to implementing these reforms, the extent of open science practices is unknown. We bring large-scale text analysis and machine learning classifiers to bear on the question. Using population-level data -- 93,931 articles across the top 160 political science and IR journals between 2010 and 2021 -- we find that approximately 21% of all statistical inference papers have open data, and 5% of all experiments are preregistered. Despite this shortfall, the example of leading journals in the field shows that change is feasible and can be effected quickly.
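The article does not spell out the classifier architecture, so the following is only a generic sketch of the approach it describes: training a supervised text classifier on hand-labelled passages and using it to flag articles that appear to share open data. The example passages, labels, and model choice are assumptions for illustration, not details from the study.

```python
# Generic sketch of flagging open-data statements with a text classifier.
# The training passages and labels below are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Replication data are available at the Harvard Dataverse.",
    "Data and code are deposited in an OSF repository.",
    "The authors thank seminar participants for helpful comments.",
    "This article develops a formal model of coalition bargaining.",
]
train_labels = [1, 1, 0, 0]  # 1 = open-data statement, 0 = no such statement

# Bag-of-words features (unigrams and bigrams) feeding a logistic regression.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_texts, train_labels)

new_passage = ["All data and replication scripts are posted on GitHub."]
print(clf.predict(new_passage))  # expected to be flagged as an open-data paper
```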


2018 ◽  
Author(s):  
Gerit Pfuhl ◽  
Jon Grahe

Recent years have seen a revolution in publishing and broad support for open access publishing. There has been a slower acceptance of, and transition to, other open science principles such as open data, open materials, and preregistration. To accelerate the transition and make open science the new standard, the Collaborative Replications and Education Project (CREP; http://osf.io/wfc6u/) was launched in 2013, hosted on the Open Science Framework (osf.io), where each individual contributor's project collects partial data, much like a preprint. CREP introduces open science at the start of academic research, facilitating student research training in open science and solidifying behavioral science results. The CREP team attempts to achieve this by inviting contributors to replicate one of several studies selected for scientific impact and for suitability for undergraduates to complete during one academic term. Contributors follow clear protocols, and students interact with a CREP team that reviews the materials and a video of the procedure to ensure quality data collection while the students learn scientific practices and methods. By combining multiple replications from undergraduates across the globe, findings can be pooled for meta-analysis, contributing to generalizable and replicable research. CREP is careful not to interpret any single result. CREP has recently joined forces with the Psychological Science Accelerator (PsySciAcc), a globally distributed network of psychological laboratories accelerating the accumulation of reliable and generalizable results in the behavioral sciences. The Department of Psychology at UiT is part of the network and has two ongoing CREP studies, instilling open science practices early on. In this talk, we will present our experiences of conducting transparent, replicable research and of working with preprints, from a supervisor and researcher perspective.


2018 ◽  
Author(s):  
Tomislav Hengl ◽  
Ichsani Wheeler ◽  
Robert A MacMillan

Using the term "Open Data" has become somewhat fashionable, but using it without clear specification is misleading, i.e. it can be considered just an empty phrase. Probably even worse is the term "Open Science": can science be anything but open? Are we reinventing something that should be obvious from the start? This guide tries to clarify some key aspects of Open Data, Open Source Software and Crowdsourcing using examples of projects and businesses. It aims to help you understand and appreciate the complexity of Open Data, Open Source software and Open Access publications. It was written specifically for producers and users of environmental data; however, it will likely be useful to any data producer and user.


2021 ◽  
Author(s):  
Robert Heirene ◽  
Debi LaPlante ◽  
Eric R. Louderback ◽  
Brittany Keen ◽  
Marjan Bakker ◽  
...  

Study preregistration is one of several “open science” practices (e.g., open data, preprints) that researchers use to improve the transparency and rigour of their research. As more researchers adopt preregistration as a regular research practice, examining the nature and content of preregistrations can help identify strengths and weaknesses of current practices. The value of preregistration, in part, relates to the specificity of the study plan and the extent to which investigators adhere to this plan. We identified 53 preregistrations from the gambling studies field meeting our predefined eligibility criteria and scored their level of specificity using a 23-item protocol developed to measure the extent to which a clear and exhaustive preregistration plan restricts various researcher degrees of freedom (RDoF; i.e., the many methodological choices available to researchers when collecting and analysing data, and when reporting their findings). We also scored studies on a 32-item protocol that measured adherence to the preregistered plan in the study manuscript. We found that gambling preregistrations had low specificity levels on most RDoF. However, a comparison with a sample of cross-disciplinary preregistrations (N = 52; Bakker et al., 2020) indicated that gambling preregistrations scored higher on 12 (of 29) items. Thirteen (65%) of the 20 associated published articles or preprints deviated from the protocol without declaring as much (the mean number of undeclared deviations per article was 2.25, SD = 2.34). Overall, while we found improvements in specificity and adherence over time (2017-2020), our findings suggest the purported benefits of preregistration—including increasing transparency and reducing RDoF—are not fully achieved by current practices. Using our findings, we provide 10 practical recommendations that can be used to support and refine preregistration practices.
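As a minimal illustration of how the adherence figures above (a mean of 2.25 undeclared deviations per article, SD = 2.34) can be computed from per-article counts, here is a short sketch; the counts are invented placeholders, not data from the study.

```python
# Minimal sketch: summarise per-article counts of undeclared deviations.
# The counts below are invented placeholders, not data from the study.
from statistics import mean, stdev

undeclared_deviations = [0, 0, 1, 2, 2, 3, 5, 1, 0, 4]  # one count per article

print(f"Mean undeclared deviations per article: {mean(undeclared_deviations):.2f}")
print(f"SD: {stdev(undeclared_deviations):.2f}")
print(f"Articles with at least one undeclared deviation: "
      f"{sum(c > 0 for c in undeclared_deviations)} of {len(undeclared_deviations)}")
```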

