Open Research: Examples of good practice, and resources across disciplines

2020 ◽  
Author(s):  
Emily Kate Farran ◽  
Priya Silverstein ◽  
Aminath A. Ameen ◽  
Iliana Misheva ◽  
Camilla Gilmore

Open research is best described as “an umbrella term used to refer to the concepts of openness, transparency, rigor, reproducibility, replicability, and accumulation of knowledge” (Crüwell et al., 2019, p. 3). Although many open research practices have commonly been discussed under the term “open science”, open research applies to all disciplines. If the concept of open research is new to you, it may be difficult to determine how to apply open research practices to your own work. The aim of this document is to provide resources and examples of open research practices that are relevant to your discipline. The document lists case studies of open research per discipline, and resources per discipline (organised as: general, open methods, open data, open output and open education).

eLife ◽  
2016 ◽  
Vol 5 ◽  
Author(s):  
Erin C McKiernan ◽  
Philip E Bourne ◽  
C Titus Brown ◽  
Stuart Buck ◽  
Amye Kenall ◽  
...  

Open access, open data, open source and other open scholarship practices are growing in popularity and necessity. However, widespread adoption of these practices has not yet been achieved. One reason is that researchers are uncertain about how sharing their work will affect their careers. We review literature demonstrating that open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities. These findings are evidence that open research practices bring significant benefits to researchers relative to more traditional closed practices.


Author(s):  
Katarzyna Biernacka ◽  
Niels Pinkwart

The relevance of open research data is already acknowledged in many disciplines. As publishers, funders, and research institutions increasingly demand it, the amount of published research data grows every day. In learning analytics, though, it seems that data are not sufficiently published and re-used. This chapter discusses some of the progress that the learning analytics community has made in shifting towards open practices, and it addresses the barriers that researchers in this discipline face. As an introduction, the open science movement and the term itself are explained. The importance of its principles is demonstrated before the main focus is put on open data. The main emphasis lies in the question: Why are the advantages of publishing research data not capitalized on in the field of learning analytics? What are the barriers? The authors evaluate them, investigate their causes, and consider some potential ways forward in the form of a toolkit and guidelines.


Publications ◽  
2019 ◽  
Vol 7 (4) ◽  
pp. 65 ◽  
Author(s):  
Marcel Knöchelmann

Open science refers to both the practices and norms of more open and transparent communication and research in scientific disciplines and the discourse on these practices and norms. There is no such discourse dedicated to the humanities. Though the humanities appear to be less coherent as a cluster of scholarship than the sciences are, they do share unique characteristics which lead to distinct scholarly communication and research practices. A discourse on making these practices more open and transparent needs to take account of these characteristics. The prevalent scientific perspective in the discourse on more open practices does not do so, which confirms that the discourse’s name, open science, indeed excludes the humanities so that talking about open science in the humanities is incoherent. In this paper, I argue that there needs to be a dedicated discourse for more open research and communication practices in the humanities, one that integrates several elements currently fragmented into smaller, unconnected discourses (such as on open access, preprints, or peer review). I discuss three essential elements of open science—preprints, open peer review practices, and liberal open licences—in the realm of the humanities to demonstrate why a dedicated open humanities discourse is required.


2016 ◽  
Author(s):  
Krzysztof J. Gorgolewski ◽  
Russell A. Poldrack

Abstract Recent years have seen an increase in alarming signals regarding the lack of replicability in neuroscience, psychology, and other related fields. To avoid a widespread crisis in neuroimaging research and consequent loss of credibility in the public eye, we need to improve how we do science. This article aims to be a practical guide for researchers at any stage of their careers that will help them make their research more reproducible and transparent while minimizing the additional effort that this might require. The guide covers three major topics in open science (data, code, and publications) and offers practical advice, as well as highlighting advantages of adopting more open research practices that go beyond improved transparency and reproducibility.


2021 ◽  
Author(s):  
Robert Schulz ◽  
Georg Langen ◽  
Robert Prill ◽  
Michael Cassel ◽  
Tracey Weissgerber

Introduction: While transparent reporting of clinical trials is essential to assess the risk of bias and translate research findings into clinical practice, earlier studies have shown that deficiencies are common. This study examined current clinical trial reporting and transparent research practices in sports medicine and orthopedics. Methods: The sample included clinical trials published in the top 25% of sports medicine and orthopedics journals over eight months. Two independent reviewers assessed pre-registration, open data and criteria related to scientific rigor, the study sample, and data analysis. Results: The sample included 163 clinical trials from 27 journals. While the majority of trials mentioned rigor criteria, essential details were often missing. Sixty percent (confidence interval [CI] 53-68%) of trials reported sample size calculations, but only 32% (CI 25-39%) justified the expected effect size. Few trials indicated the blinding status of all main stakeholders (4%; CI 1-7%). Only 18% (CI 12-24%) included information on randomization type, method, and concealed allocation. Most trials reported participants' sex/gender (95%; CI 92-98%) and information on inclusion and exclusion criteria (78%; CI 72-84%). Only 20% (CI 14-26%) of trials were pre-registered. No trials deposited data in open repositories. Conclusions: These results will aid the sports medicine and orthopedics community in developing tailored interventions to improve reporting. While authors typically mention blinding, randomization and other factors, essential details are often missing. Greater acceptance of open science practices, like pre-registration and open data, is needed. Because these practices have already been widely encouraged, we discuss systemic interventions that may improve clinical trial reporting. Registration: https://doi.org/10.17605/OSF.IO/9648H


2021 ◽  
Author(s):  
Adam H. Sparks ◽  
Emerson del Ponte ◽  
Kaique S. Alves ◽  
Zachary S. L. Foster ◽  
Niklaus J. Grünwald

Abstract Open research practices have been highlighted extensively during the last ten years in many fields of scientific study as essential standards needed to promote transparency and reproducibility of scientific results. Scientific claims can only be evaluated based on how protocols, materials, equipment and methods were described; data were collected and prepared; and analyses were conducted. Openly sharing protocols, data and computational code is central to current scholarly dissemination and communication, but in many fields, including plant pathology, adoption of these practices has been slow. We randomly selected 300 articles published from 2012 to 2018 across 21 journals representative of the plant pathology discipline and assigned them scores reflecting their openness and reproducibility. We found that most of the articles did not follow protocols for open science and failed to share data or code in a reproducible way. We also propose that the use of open-source tools facilitates reproducible work and analyses, benefiting not just readers but authors as well. Finally, we provide ideas and tools to promote open, reproducible research practices among plant pathologists.


2021 ◽  
Vol 3 (1) ◽  
pp. 189-204 ◽  
Author(s):  
Hua Nie ◽  
Pengcheng Luo ◽  
Ping Fu

Research Data Management (RDM) has become increasingly important for more and more academic institutions. Using the Peking University Open Research Data Repository (PKU-ORDR) project as an example, this paper reviews a library-based, university-wide open research data repository project and the implementation of related RDM services, including project kickoff, needs assessment, establishing partnerships, software investigation and selection, software customization, and data curation services and training. Through this review, the paper also discusses and addresses issues revealed during the implementation process, such as awareness of research data, demands from data providers and users, data policies and requirements from the home institution, requirements from funding agencies and publishers, collaboration between administrative units and libraries, and concerns from data providers and users. The significance of the study is that it offers an example of creating an open data repository and RDM services for other Chinese academic libraries planning to implement RDM services at their home institutions. The authors have also observed that, since the PKU-ORDR and RDM services were implemented in 2015, the Peking University Library (PKUL) has helped numerous researchers across the entire research life cycle, enhanced Open Science (OS) practices on campus, and influenced the national OS movement in China through various national events and activities hosted by the PKUL.


2021 ◽  
Author(s):  
Bert N Bakker ◽  
Kokil Jaidka ◽  
Timothy Dörr ◽  
Neil Fasching ◽  
Yphtach Lelkes

Abstract Recent contributions have questioned the credibility of quantitative communication research. While questionable research practices (QRPs) are believed to be widespread, evidence for this belief is primarily derived from other disciplines. It is therefore largely unknown to what extent QRPs are used in quantitative communication research and whether researchers embrace open research practices (ORPs). We surveyed first and corresponding authors of publications in the top-20 journals in communication science. Many researchers report using one or more QRPs. We find widespread pluralistic ignorance: QRPs are generally rejected, but researchers believe they are prevalent. At the same time, we find optimism about the use of open science practices. In all, our study has implications for theories in communication that rely upon a cumulative body of empirical work: these theories are negatively affected by QRPs but can gain credibility if based upon ORPs. We outline an agenda to move forward as a discipline.


2017 ◽  
Author(s):  
Vicky Steeves

An open-source database and website that lists and maps women who have shown great leadership in openness: open access, open science, open scholarship, open-source code, open data, open education resources, and more. This repository holds the source code and data for the project. The website offers a searchable, sortable list of women who work in the field of openness, along with a map for those who would like to find women leaders nearest them; the hope is that this map makes planning conferences, workshops, and events more convenient. The data comes from April Hathcock’s Google Doc and merge requests to this repository.


2020 ◽  
Author(s):  
Bert N. Bakker ◽  
Kokil Jaidka ◽  
Timothy Dörr ◽  
Neil Fasching ◽  
Yphtach Lelkes

Recent contributions have questioned the credibility of quantitative communication research. While questionable research practices (QRPs) are believed to be widespread, evidence for this claim is primarily derived from other disciplines. Before change in communication research can happen, it is important to document the extent to which QRPs are used and whether researchers are open to the changes proposed by the so-called open science agenda. We conducted a large survey among authors of papers published in the top-20 journals in communication science in the last ten years (N=1039). A non-trivial percentage of researchers report using one or more QRPs. While QRPs are generally considered unacceptable, researchers perceive them to be common among their colleagues. At the same time, we find optimism about the use of open science practices in communication research. We end with a series of recommendations outlining what journals, institutions and researchers can do moving forward.
