Registered Reports: Genre Evolution and the Research Article

2018 ◽  
Vol 36 (1) ◽  
pp. 38-67 ◽  
Author(s):  
Ashley Rose Mehlenbacher

The research article is a staple genre in the economy of scientific research, and although research articles have received considerable treatment in genre scholarship, little attention has been given to the important development of Registered Reports. Registered Reports are an emerging, hybrid genre that proceeds through a two-stage model of peer review. This article charts the emergence of Registered Reports and explores how this new form intervenes in the evolution of the research article genre by replacing the central topos of novelty with methodological rigor. Specifically, I investigate this discursive and publishing phenomenon by describing current conversations about challenges in replicating research studies, the rhetorical exigence those conversations create, and how Registered Reports respond to this exigence. Then, to better understand this emerging form, I present an empirical study of the genre itself, first closely examining four articles published under the Registered Report model in the journal Royal Society Open Science, and then investigating the genre's hybridity by examining 32 protocols (Stage 1 Registered Reports) and 77 completed articles (Stage 2 Registered Reports) from a range of journals in the life and psychological sciences. Findings from this study suggest Registered Reports mark a notable intervention in the research article genre for the life and psychological sciences, centering the reporting of science in serious methodological debates.

2018 ◽  
Vol 1 ◽  
Author(s):  
Pavel Stoev

There are three key challenges that need to be addressed by journal publishers nowadays: increasing machine-readability and semantic enrichment of the published content to allow text and data mining, aggregation and re-use; adopting open science principles to expand from publication of mainly research articles to all research objects through the research cycle; and facilitating all of this for authors, reviewers and editors through novel and user-friendly technological solutions. ARPHA stands for Authoring, Reviewing, Publishing, Hosting and Archiving, all in one place. ARPHA is the first publishing platform to support the full life cycle of a manuscript within a single online collaborative environment. The platform consists of two interconnected but independently functioning journal publishing workflows: ARPHA-XML, an entirely XML- and Web-based collaborative authoring, peer review and publication workflow; and ARPHA-DOC, a document-based (PDF or text file) submission, peer review and publication workflow. A full list of services provided by ARPHA is available at: http://arphahub.com/about/services. Furthermore, Pensoft has been heavily investing in the technological advancement of its journals.
The most significant technologies implemented by Pensoft, as demonstrated also by the journal Subterranean Biology in recent years, are: automatic registration of reviews at Publons, which helps reviewers and editors get recognition for every review they make for the journal; Dimensions, a powerful citation tracker that ranks given research within its field; Scopus CiteScore Metrics, an interactive tool providing information on the journal's performance; export of published figures and supplementary materials to the Biodiversity Literature Repository at ZENODO, which increases the visibility and traceability of article and sub-article elements; and Hypothes.is, a tool allowing annotations on selected text from the published article.


F1000Research ◽  
2021 ◽  
Vol 10 ◽  
pp. 253
Author(s):  
Daniel Nüst ◽  
Stephen J. Eglen

The traditional scientific paper falls short of effectively communicating computational research.  To help improve this situation, we propose a system by which the computational workflows underlying research articles are checked. The CODECHECK system uses open infrastructure and tools and can be integrated into review and publication processes in multiple ways. We describe these integrations along multiple dimensions (importance, who, openness, when). In collaboration with academic publishers and conferences, we demonstrate CODECHECK with 25 reproductions of diverse scientific publications. These CODECHECKs show that asking for reproducible workflows during a collaborative review can effectively improve executability. While CODECHECK has clear limitations, it may represent a building block in Open Science and publishing ecosystems for improving the reproducibility, appreciation, and, potentially, the quality of non-textual research artefacts. The CODECHECK website can be accessed here: https://codecheck.org.uk/.
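The checking step described above can be sketched, under assumptions, as: re-run the workflow, then verify that every output file the paper declares was actually regenerated. The command and manifest names below are hypothetical illustrations; the real CODECHECK process and its configuration are described at https://codecheck.org.uk/.

```python
import subprocess
from pathlib import Path

def check_workflow(command, manifest, workdir="."):
    """Re-run a computational workflow and report declared outputs it failed
    to regenerate (an empty return value means the check passed)."""
    subprocess.run(command, cwd=workdir, check=True)
    return [f for f in manifest if not (Path(workdir) / f).exists()]

# Hypothetical example: the paper's analysis script and its claimed outputs.
# missing = check_workflow(["python", "analysis.py"],
#                          ["figures/fig1.png", "results/table1.csv"])
```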


2021 ◽  
Vol 5 ◽  
Author(s):  
Nino Van Sambeek ◽  
Daniel Lakens

Surveys indicate that researchers generally have a positive attitude towards open peer review when this consists of making reviews available alongside published articles. Researchers are more negative about revealing the identity of reviewers. They worry reviewers will be less likely to express criticism if their identity is known to authors. Experiments suggest that reviewers are somewhat less likely to recommend rejection when they are told their identity will be communicated to authors than when they will remain anonymous. One recent study revealed that reviewers in five journals who voluntarily signed their reviews gave more positive recommendations than those who did not sign their reviews. We replicate and extend this finding by analyzing 12,010 open reviews in PeerJ and 4,188 reviews in Royal Society Open Science, where reviewers can voluntarily sign their reviews. These results, based on behavioral data from real peer reviews across a wide range of scientific disciplines, demonstrate convincingly that reviewers’ decision to sign is related to their recommendation: the proportion of signed reviews was higher for more positive recommendations than for more negative ones. We also share all 23,649 text-mined reviews as the raw data underlying our results, which can be re-used by researchers interested in peer review.
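The core measurement in the study above is simple: within each recommendation category, what fraction of reviews were signed? A minimal sketch, using made-up review records rather than the authors' actual PeerJ / Royal Society Open Science data:

```python
from collections import Counter

# Hypothetical review records (recommendation, signed?), for illustration only.
reviews = [
    ("accept", True), ("accept", False), ("accept", True),
    ("minor revision", True), ("minor revision", False),
    ("major revision", False), ("major revision", True),
    ("reject", False), ("reject", False),
]

signed = Counter(rec for rec, is_signed in reviews if is_signed)
total = Counter(rec for rec, _ in reviews)

# Proportion of signed reviews within each recommendation category.
proportions = {rec: signed[rec] / total[rec] for rec in total}
for rec, p in sorted(proportions.items(), key=lambda kv: -kv[1]):
    print(f"{rec}: {p:.2f}")
```

With real data, the paper's finding corresponds to these proportions decreasing as recommendations become more negative.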


2019 ◽  
Author(s):  
Paul Smaldino ◽  
Matthew Adam Turner ◽  
Pablo Andrés Contreras Kallens

Assessing scientists using exploitable metrics can lead to the degradation of research methods even without any strategic behavior on the part of individuals, via "the natural selection of bad science." Institutional incentives to maximize metrics like publication quantity and impact drive this dynamic. Removing these incentives is necessary, but institutional change is slow. However, recent developments suggest possible solutions with more rapid onsets. These include what we call open science improvements, which can reduce publication bias and improve the efficacy of peer review. In addition, there have been increasing calls for funders to move away from prestige- or innovation-based approaches in favor of lotteries. Using computational modeling, we investigated whether such changes are likely to improve the reproducibility of science even in the presence of persistent incentives for publication quantity. We found that modified lotteries, which allocate funding randomly among proposals that pass a threshold for methodological rigor, effectively reduce the rate of false discoveries, particularly when paired with open science improvements that increase the publication of negative results and improve the quality of peer review. In the absence of funding that targets rigor, open science improvements can still reduce false discoveries in the published literature but are less likely to improve the overall culture of research practices that underlies those publications.
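The "modified lottery" mechanism the abstract describes has a compact operational form: screen proposals against a rigor threshold, then draw winners at random from the survivors. A minimal sketch, with an illustrative scoring scale and threshold rather than the paper's actual model parameters:

```python
import random

def modified_lottery(proposals, rigor_threshold, n_awards, rng=random):
    """Fund proposals by random draw among those that pass a rigor threshold.

    `proposals` is a list of (proposal_id, rigor_score) pairs; the 0-1 score
    scale here is an illustrative assumption.
    """
    eligible = [pid for pid, rigor in proposals if rigor >= rigor_threshold]
    return rng.sample(eligible, min(n_awards, len(eligible)))

rng = random.Random(1)  # seeded for a reproducible illustration
proposals = [(f"P{i}", rng.random()) for i in range(20)]
funded = modified_lottery(proposals, rigor_threshold=0.5, n_awards=3, rng=rng)
print(funded)
```

The design point is that rigor gates eligibility but never ranks it: above the threshold, every proposal has the same chance, which removes the incentive to optimize for prestige or novelty metrics.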


2019 ◽  
Vol 1 ◽  
Author(s):  
Matthias Filter ◽  
Leonardo Candela ◽  
Laurent Guillier ◽  
Maarten Nauta ◽  
Teodor Georgiev ◽  
...  

This Editorial describes the rationale, focus, scope and technology behind the newly launched, open access, innovative Food Modelling Journal (FMJ). The Journal is designed to publish those outputs of the research cycle that usually precede the publication of the research article but have their own value and re-usability potential. Such outputs are methods, models, software and data. The Food Modelling Journal is launched by the AGINFRA+ community and is integrated with the AGINFRA+ Virtual Research Environment (VRE) to facilitate and streamline the authoring, peer review and publication of manuscripts via the ARPHA Publishing Platform.


2019 ◽  
Author(s):  
Nino van Sambeek ◽  
Daniel Lakens

Surveys indicate that researchers generally have a positive attitude towards open peer review when this consists of making reviews available alongside published articles. Researchers are more negative about revealing the identity of reviewers. They worry reviewers will be less likely to express criticism if their identity is known to authors. Experiments suggest that reviewers are somewhat less likely to recommend rejection when they are told their identity will be communicated to authors than when they will remain anonymous. One recent study revealed that reviewers in five journals who voluntarily signed their reviews gave more positive recommendations than those who did not sign their reviews. We replicate and extend this finding by analyzing 12,010 open reviews in PeerJ and 4,188 reviews in Royal Society Open Science, where reviewers can voluntarily sign their reviews. These results, based on behavioral data from real peer reviews across a wide range of scientific disciplines, demonstrate convincingly that reviewers' decision to sign is related to their recommendation: the proportion of signed reviews was higher for more positive recommendations than for more negative ones. We also share all 23,649 text-mined reviews as the raw data underlying our results, which can be re-used by researchers interested in peer review.


Author(s):  
Victor Nuovo

Although the vocation of Christian virtuoso was invented and named by Robert Boyle, Francis Bacon provided the archetype. A Christian virtuoso is an experimental natural philosopher who professes Christianity and who endeavors to unite empiricism and supernatural belief in an intellectual life. In his program for the renewal of learning, Bacon prescribed that the empirical study of nature be the basis of all the sciences, including not only the study of physical things but also of human society and literature. He insisted that only natural causes be used to explain natural events and proposed not to mix theology with natural philosophy. This became a rule of the Royal Society of London, of which Boyle was a principal founder. Bacon's rule also had a theological use: to preserve the purity and the divine authority of revelation. In the mind of the Christian virtuoso, nature and divine revelation were separate but complementary sources of truth.


Publications ◽  
2021 ◽  
Vol 9 (2) ◽  
pp. 14
Author(s):  
Eirini Delikoura ◽  
Dimitrios Kouis

Recently, significant initiatives have been launched for the dissemination of Open Access as part of the Open Science movement. Nevertheless, two other major pillars of Open Science, Open Research Data (ORD) and Open Peer Review (OPR), are still at an early stage of development among the communities of researchers and stakeholders. The present study sought to unveil the perceptions of a medical and health sciences community about these issues. Through the investigation of researchers' attitudes, valuable conclusions can be drawn, especially in the field of medicine and health sciences, where scientific publishing is growing explosively. A quantitative survey was conducted based on a structured questionnaire, with 179 valid responses. The participants agreed with the Open Peer Review principles; however, they were unfamiliar with basic terms like FAIR (Findable, Accessible, Interoperable, and Reusable) and appeared incentivized to permit the exploitation of their data. Regarding Open Peer Review, participants expressed their agreement, implying their support for a trustworthy evaluation system. Conclusively, researchers need to receive proper training in both Open Research Data principles and Open Peer Review processes, which, combined with a reformed evaluation system, will enable them to take full advantage of the opportunities arising from the new scholarly publishing and communication landscape.

