Make like a glass frog: In support of increased transparency in herpetology

2021 ◽  
pp. 35-45
Author(s):  
Benjamin Michael Marshall

Across many scientific disciplines, direct replication efforts and meta-analyses have fuelled concerns about the replicability of findings. Ecology and evolution are similarly affected. Investigations into the causes of this lack of replicability have implicated a suite of research practices linked to incentives in the current publishing system. Other fields have taken great strides to counter incentives that can reward obfuscation, chiefly by championing transparency. But how prominent are pro-transparency (open science) policies in herpetology journals? We use the recently developed Transparency and Openness Promotion (TOP) Factor to assess the transparency promotion of 19 herpetology journals, and compare the TOP scores to broader science. We find the promotion of transparent practices currently lacking in many herpetological journals, and encourage authors, students, editors, and publishers to redouble efforts to bring open science practices to herpetology by changing journal policy, peer review, and personal practice. We promote an array of options, developed and tested in other fields, demonstrated to counter publication bias, boost research uptake, and enable more transparent science, to enrich herpetological research.

2020 ◽  
Author(s):  
Benjamin Michael Marshall ◽  
Colin Strine

Across many scientific disciplines, direct replication efforts and meta-analyses have fuelled concerns about the replicability of findings. Ecology and evolution are similarly affected. Investigations into the causes of this lack of replicability have implicated a suite of research practices linked to incentives in the current publishing system. Other fields have taken great strides to counter incentives that can reward obfuscation, chiefly by championing transparency. But how prominent are pro-transparency (open science) policies in herpetology journals? We use the recently developed Transparency and Openness Promotion (TOP) Factor to assess the transparency promotion of 19 herpetology journals, and compare the TOP scores to broader science. We find the promotion of transparent practices currently lacking in many herpetological journals, and encourage authors, students, editors, and publishers to redouble efforts to bring open science practices to herpetology by changing journal policy, peer review, and personal practice. We promote an array of options, developed and tested in other fields, demonstrated to counter publication bias, boost research uptake, and enable more transparent science, to enrich herpetological research.


Publications ◽  
2019 ◽  
Vol 7 (4) ◽  
pp. 65 ◽  
Author(s):  
Marcel Knöchelmann

Open science refers to both the practices and norms of more open and transparent communication and research in scientific disciplines and the discourse on these practices and norms. There is no such discourse dedicated to the humanities. Though the humanities appear to be less coherent as a cluster of scholarship than the sciences are, they do share unique characteristics which lead to distinct scholarly communication and research practices. A discourse on making these practices more open and transparent needs to take account of these characteristics. The prevalent scientific perspective in the discourse on more open practices does not do so, which confirms that the discourse’s name, open science, indeed excludes the humanities; talking about open science in the humanities is therefore incoherent. In this paper, I argue that there needs to be a dedicated discourse for more open research and communication practices in the humanities, one that integrates several elements currently fragmented into smaller, unconnected discourses (such as on open access, preprints, or peer review). I discuss three essential elements of open science—preprints, open peer review practices, and liberal open licences—in the realm of the humanities to demonstrate why a dedicated open humanities discourse is required.


2021 ◽  
Author(s):  
Eric R. Louderback ◽  
Sally M Gainsbury ◽  
Robert Heirene ◽  
Karen Amichia ◽  
Alessandra Grossman ◽  
...  

The replication crisis has stimulated researchers around the world to adopt open science research practices intended to reduce publication bias and improve research quality. Open science practices include study pre-registration, open data, open publication, and avoiding methods that can lead to publication bias and low replication rates. Although the field of gambling studies uses similar research methods to behavioral research fields that have struggled with replication, we know little about the uptake of open science research practices in gambling-focused research. We conducted a scoping review of 500 recent (1/1/2016 – 12/1/2019) studies focused on gambling and problem gambling to examine the use of open science and transparent research practices. Our results showed that while 54.6% (95% CI: [50.2, 58.9]) of studies used at least one of nine open science practices, the prevalence of each individual practice was low: 1.6% for pre-registration (95% CI: [0.8, 3.1]), 3.2% for open data (95% CI: [2.0, 5.1]), 0% for open notebook, 35.2% for open access (95% CI: [31.1, 39.5]), 7.8% for open materials (95% CI: [5.8, 10.5]), 1.4% for open code (95% CI: [0.7, 2.9]), and 15.0% for preprint posting (95% CI: [12.1, 18.4]). In all, 6.4% (95% CI: [4.6, 8.9]) used a power analysis and 2.4% (95% CI: [1.4, 4.2]) of the studies were replication studies. Exploratory analyses showed that studies that used any open science practice, and open access in particular, had higher citation counts. We suggest several practical ways to enhance the uptake of open science principles and practices both within gambling studies and in science more broadly.
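The abstract does not state how the confidence intervals for these proportions were computed, but a Wilson score interval, a common choice for binomial proportions, reproduces the reported bounds for the headline figure. A minimal sketch, assuming the 273-of-500 count implied by the 54.6% figure:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# 273 of 500 studies (54.6%) used at least one open science practice
lo, hi = wilson_ci(273, 500)
print(round(100 * lo, 1), round(100 * hi, 1))  # 50.2 58.9, matching the abstract
```

Unlike the simpler Wald interval, the Wilson interval behaves sensibly near 0% and 100%, which matters here given practices with prevalence as low as 1.4%.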


2021 ◽  
Author(s):  
Jesse Fox ◽  
Katy E Pearce ◽  
Adrienne L Massanari ◽  
Julius Matthew Riles ◽  
Łukasz Szulc ◽  
...  

The open science (OS) movement has advocated for increased transparency in certain aspects of research. Communication is taking its first steps toward OS as some journals have adopted OS guidelines codified by another discipline. We find this pursuit troubling as OS prioritizes openness while insufficiently addressing essential ethical principles: respect for persons, beneficence, and justice. Some recommended open science practices increase the potential for harm for marginalized participants, communities, and researchers. We elaborate how OS can serve as a marginalizing force within academia and the research community, as it overlooks the needs of marginalized scholars and excludes some forms of scholarship. We challenge the current instantiation of OS and propose a divergent agenda for the future of Communication research centered on ethical, inclusive research practices.


2020 ◽  
Vol 125 (2) ◽  
pp. 1033-1051
Author(s):  
Dietmar Wolfram ◽  
Peiling Wang ◽  
Adam Hembree ◽  
Hyoungjoo Park

Open peer review (OPR), where review reports and reviewers’ identities are published alongside the articles, represents one of the last aspects of the open science movement to be widely embraced, although its adoption has been growing since the turn of the century. This study provides the first comprehensive investigation of OPR adoption, its early adopters and the implementation approaches used. Current bibliographic databases do not systematically index OPR journals, nor do the OPR journals clearly state their policies on open identities and open reports. Using various methods, we identified 617 OPR journals that published at least one article with open identities or open reports as of 2019 and analyzed their wide-ranging implementations to derive emerging OPR practices. The findings suggest that: (1) there has been a steady growth in OPR adoption since 2001, when 38 journals initially adopted OPR, with more rapid growth since 2017; (2) OPR adoption is most prevalent in medical and scientific disciplines (79.9%); (3) five publishers are responsible for 81% of the identified OPR journals; (4) early adopter publishers have implemented OPR in different ways, resulting in different levels of transparency. Across the variations in OPR implementations, two important factors define the degree of transparency: open identities and open reports. Open identities may include reviewer names and affiliations as well as credentials; open reports may include timestamped review histories consisting of referee reports and author rebuttals or a letter from the editor integrating reviewers’ comments. When and where open reports can be accessed are also important factors indicating the OPR transparency level. Publishers of optional OPR journals should add OPR metric data to their annual status reports.


2020 ◽  
Vol 43 (2) ◽  
pp. 91-107
Author(s):  
Matthew C. Makel ◽  
Kendal N. Smith ◽  
Erin M. Miller ◽  
Scott J. Peters ◽  
Matthew T. McBee

Existing research practices in gifted education have many areas for potential improvement so that they can provide useful, generalizable evidence to various stakeholders. In this article, we first review the field’s current research practices and consider the quality and utility of its research findings. Next, we discuss how open science practices increase the transparency of research so readers can more effectively evaluate its validity. Third, we introduce five large-scale collaborative research models that are being used in other fields and discuss how they could be implemented in gifted education research. Finally, we review potential challenges and limitations to implementing collaborative research models in gifted education. We believe greater use of large-scale collaboration will help the field overcome some of its methodological challenges to help provide more precise and accurate information about gifted education.


2020 ◽  
Vol 25 (1) ◽  
pp. 51-72 ◽  
Author(s):  
Christian Franz Josef Woll ◽  
Felix D. Schönbrodt

Recent meta-analyses come to conflicting conclusions about the efficacy of long-term psychoanalytic psychotherapy (LTPP). Our first goal was to reproduce the most recent meta-analysis by Leichsenring, Abbass, Luyten, Hilsenroth, and Rabung (2013), who found evidence for the efficacy of LTPP in the treatment of complex mental disorders. Our replicated effect sizes were in general slightly smaller. Second, we conducted an updated meta-analysis of randomized controlled trials comparing LTPP (lasting for at least 1 year and 40 sessions) to other forms of psychotherapy in the treatment of complex mental disorders. We focused on a transparent research process according to open science standards and applied a series of elaborated meta-analytic procedures to test and control for publication bias. Our updated meta-analysis comprising 191 effect sizes from 14 eligible studies revealed small, statistically significant effect sizes at post-treatment for the outcome domains psychiatric symptoms, target problems, social functioning, and overall effectiveness (Hedges’ g ranging between 0.24 and 0.35). The effect size for the domain personality functioning (0.24) was not significant (p = .08). No signs of publication bias could be detected. In light of a heterogeneous study set and some methodological shortcomings in the primary studies, these results should be interpreted cautiously. In conclusion, LTPP might be superior to other forms of psychotherapy in the treatment of complex mental disorders. Notably, our effect sizes represent the additional gain of LTPP versus other forms of primarily long-term psychotherapy. In this case, large differences in effect sizes are not to be expected.
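The abstract reports standardized mean differences as Hedges' g without defining the metric. For reference, a minimal sketch of the usual formula (pooled standard deviation plus a small-sample bias correction), using purely hypothetical group summaries rather than any figures from the study:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: standardized mean difference with small-sample bias correction."""
    df = n1 + n2 - 2
    # pooled standard deviation across the two groups
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / s_pooled          # Cohen's d
    j = 1 - 3 / (4 * df - 1)          # Hedges' correction factor
    return j * d

# hypothetical example: two groups of 30, means 10.0 vs 8.5, common SD 4.0
g = hedges_g(10.0, 4.0, 30, 8.5, 4.0, 30)
```

The correction factor j shrinks Cohen's d slightly, which matters for the small trials typical of psychotherapy research.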


PLoS ONE ◽  
2021 ◽  
Vol 16 (6) ◽  
pp. e0244529
Author(s):  
Ksenija Baždarić ◽  
Iva Vrkić ◽  
Evgenia Arh ◽  
Martina Mavrinac ◽  
Maja Gligora Marković ◽  
...  

Attitudes towards open peer review, open data and use of preprints influence scientists’ engagement with those practices. Yet there is a lack of validated questionnaires that measure these attitudes. The goal of our study was to construct and validate such a questionnaire and use it to assess attitudes of Croatian scientists. We first developed a 21-item questionnaire called Attitudes towards Open data sharing, preprinting, and peer-review (ATOPP), which had a reliable four-factor structure, and measured attitudes towards open data, preprint servers, open peer review and open peer review in small scientific communities. We then used the ATOPP to explore attitudes of Croatian scientists (n = 541) towards these topics, and to assess the association of their attitudes with their open science practices and demographic information. Overall, Croatian scientists’ attitudes towards these topics were generally neutral, with a median (Md) score of 3.3 out of a maximum of 5 on the scale. We also found no gender (P = 0.995) or field differences (P = 0.523) in their attitudes. However, attitudes of scientists who had previously engaged in open peer review or preprinting were more positive than those of scientists who had not (Md 3.5 vs. 3.3, P < 0.001, and Md 3.6 vs. 3.3, P < 0.001, respectively). Further research is needed to determine optimal ways of improving scientists’ attitudes towards, and uptake of, open science practices.


2019 ◽  
Vol 19 (1) ◽  
pp. 46-59
Author(s):  
Alexandra Sarafoglou ◽  
Suzanne Hoogeveen ◽  
Dora Matzke ◽  
Eric-Jan Wagenmakers

The current crisis of confidence in psychological science has spurred on field-wide reforms to enhance transparency, reproducibility, and replicability. To solidify these reforms within the scientific community, student courses on open science practices are essential. Here we describe the content of our Research Master course “Good Research Practices” which we have designed and taught at the University of Amsterdam. Supported by Chambers’ recent book The 7 Deadly Sins of Psychology, the course covered topics such as questionable research practices (QRPs), the importance of direct and conceptual replication studies, preregistration, and the public sharing of data, code, and analysis plans. We adopted a pedagogical approach that: (a) reduced teacher-centered lectures to a minimum; (b) emphasized practical training on open science practices; and (c) encouraged students to engage in the ongoing discussions in the open science community on social media platforms.


2020 ◽  
Author(s):  
Diana Eugenie Kornbrot

Open science advocates recommend depositing the stimuli, data, and code sufficient to support all assertions in a scientific manuscript. Most ‘respectable’ journals and funding bodies have endorsed open science, i.e. they ‘talk the talk’. Nevertheless, most published manuscripts do not ‘walk the walk’ by following the open science guidelines. Professional statistical bodies, e.g. the American Statistical Association and the Royal Statistical Society, provide guidance on reporting inferential statistics that discourages null-hypothesis significance tests. This guidance is also widely ignored. The purpose of this manuscript is to increase the proportion of manuscripts following open science practices by providing guides to transparent reporting that are easily usable by authors and reviewers. The manuscript comprises the guides themselves, already public, and a rationale for the recommendations chosen, together with suggestions to promote open science practices. The guides are unique in covering, in a single document, the three main phases of conducting replicable science: planning and execution; manuscript generation and publication; and deposit of supplementary materials. A main aim of the manuscript is to subject the guidance and justifications to peer review.

