Operationalizing the Replication Standard: A Case Study of the Data Curation and Verification Workflow for Scholarly Journals


2018 ◽  
Vol 13 (1) ◽  
pp. 114-124 ◽  
Author(s):  
Thu-Mai Lewis Christian ◽  
Sophia Lafferty-Hess ◽  
William G. Jacoby ◽  
Thomas M. Carsey

In response to widespread concerns about the integrity of research published in scholarly journals, several initiatives have emerged that are promoting research transparency through access to data underlying published scientific findings. Journal editors, in particular, have made a commitment to research transparency by issuing data policies that require authors to submit their data, code, and documentation to data repositories to allow for public access to the data. In the case of the American Journal of Political Science (AJPS) Data Replication Policy, the data also must undergo an independent verification process in which materials are reviewed for quality as a condition of final manuscript publication and acceptance. Aware of the specialized expertise of the data archives, AJPS called upon the Odum Institute Data Archive to provide a data review service that performs data curation and verification of replication datasets. This article presents a case study of the collaboration between AJPS and the Odum Institute Data Archive to develop a workflow that bridges manuscript publication and data review processes. The case study describes the challenges and the successes of the workflow integration, and offers lessons learned that may be applied by other data archives that are considering expanding their services to include data curation and verification services to support reproducible research.


2018 ◽  
Vol 31 (3) ◽  
pp. 442-457 ◽  
Author(s):  
Paul De Lange ◽  
Lyn Daff ◽  
Beverley Jackling

Purpose: Publishing in scholarly journals is a practical necessity for academics. Put simply, this reality can be described as “publish or perish”. To be treated as a serious contender for tenure and promotion, scholarly research and activities directed towards publication are necessary aspects of faculty life. The purpose of this paper is to provide insights into “dealing” with the editorial review process of publishing from the perspective of a relatively new author.

Design/methodology/approach: Using the lens of Q and R theory, a case study approach combined with critical reflection provides a documented tour to enable other authors to enhance their understanding of the publication process, by including references to associated reviews and correspondence with a journal editor. The review extracts from the editor and the authors’ responses are discussed within the context of a theoretical schema and timeline.

Findings: Drawing from the theoretical schema, the paper identifies 11 lessons learned along the way to publishing, and these are summarised as the 11 commandments of publishing.

Research limitations/implications: Utilisation of Q and R theory can assist researchers as they reflect on how to maximise their publication outcomes.

Practical implications: The 11 commandments provide a practical approach for those wanting to improve their understanding and likelihood of publishing success.

Originality/value: The originality of this paper is that it considers the publication process from the perspective of a novice author who subsequently draws on the knowledge of more experienced co-authors. The findings are based on a theoretical schema that is transferable and able to be adopted by others to guide publication outcomes.


2017 ◽  
Vol 36 (1) ◽  
Author(s):  
Johanna Lilja

This report summarises the papers and discussions presented at the Scholarly Journals and Research Data Seminar organised by the Federation of Finnish Learned Societies and the Finnish Association for Scholarly Publishing in February 2017. Stricter policies requiring that research data be stored in repositories and made openly available are now being implemented. In fact, 27 per cent of research funders now require data archiving, including the Academy of Finland. The seminar brought together funders, researchers and representatives from journals and data archives to discuss how archiving and opening data should be carried out, and the role played by journals. The questions asked included: Should journals require their authors to link their text to research data, or should they only encourage such action? Should journals guide their authors to use central national or international data archives, or should they establish their own separate data repositories, for example in connection with the Finnish national data service IDA?


2011 ◽  
Vol 15 (1) ◽  
Author(s):  
Michael L. Fetters ◽  
Tova Garcia Duby

Faculty development programs are critical to the implementation and support of curriculum innovation. In this case study, the authors present lessons learned from ten years of experience in faculty development programs created to support innovation in technology enhanced learning. Stages of curriculum innovation are matched to stages of faculty development, and important lessons for success as well as current challenges are delineated and discussed.


Author(s):  
Kaye Chalwell ◽  
Therese Cumming

Radical subject acceleration, or moving students through a subject area faster than is typical, including skipping grades, is a widely accepted approach to support students who are gifted and talented. This is done in order to match the student’s cognitive level and learning needs. This case study explored radical subject acceleration for gifted students by focusing on one school’s response to the learning needs of a ten-year-old mathematically gifted student. It provides insight into the challenges, accommodations and approach to radical subject acceleration in an Australian school. It explored the processes and decisions made to ensure that a gifted student’s learning needs were met, and identified salient issues for radical subject acceleration. Lessons learned from this case study may be helpful for schools considering radical acceleration.


i-com ◽  
2021 ◽  
Vol 20 (1) ◽  
pp. 19-32
Author(s):  
Daniel Buschek ◽  
Charlotte Anlauff ◽  
Florian Lachner

This paper reflects on a case study of a user-centred concept development process for a Machine Learning (ML) based design tool, conducted at an industry partner. The resulting concept uses ML to match graphical user interface elements in sketches on paper to their digital counterparts to create consistent wireframes. A user study (N=20) with a working prototype shows that this concept is preferred by designers, compared to the previous manual procedure. Reflecting on our process and findings, we discuss lessons learned for developing ML tools that respect practitioners’ needs and practices.


2021 ◽  
pp. 026732312110283
Author(s):  
Judith Simon ◽  
Gernot Rieder

Ever since the outbreak of the COVID-19 pandemic, questions of whom or what to trust have become paramount. This article examines the public debates surrounding the initial development of the German Corona-Warn-App in 2020 as a case study to analyse such questions at the intersection of trust and trustworthiness in technology development, design and oversight. Providing some insights into the nature and dynamics of trust and trustworthiness, we argue that (a) trust is only desirable and justified if placed well, that is, if directed at those being trustworthy; that (b) trust and trustworthiness come in degrees and have both epistemic and moral components; and that (c) such a normatively demanding understanding of trust excludes technologies as proper objects of trust and requires that trust is directed at socio-technical assemblages consisting of both humans and artefacts. We conclude with some lessons learned from our case study, highlighting the epistemic and moral demands for trustworthy technology development as well as for public debates about such technologies, which ultimately requires attributing epistemic and moral duties to all actors involved.


Author(s):  
Dang Duy Bui ◽  
Kazuhiro Ogata

The mutual exclusion protocol invented by Mellor-Crummey and Scott (called MCS protocol) is used to exemplify that state picture designs, based on which the state machine graphical animation (SMGA) tool produces graphical animations, should be better visualized. Variants of MCS protocol have been used in Java virtual machines, and therefore the 2006 Edsger W. Dijkstra Prize in Distributed Computing went to their paper on MCS protocol. The new state picture design of a state machine formalizing MCS protocol is assessed based on Gestalt principles, more specifically the proximity principle and the similarity principle. We report on a core part of a formal verification case study in which the new state picture design and the SMGA tool largely contributed to the successful completion of the formal proof that MCS protocol enjoys the mutual exclusion property. The lessons learned through our experiments are summarized as two groups of tips. The first group is some new tips on how to make state picture designs. The second is some tips on how to conjecture state machine characteristics by using the SMGA tool. We also report on one more case study in which a state picture design has been made for the mutual exclusion protocol invented by Anderson (called Anderson protocol) and some characteristics of the protocol have been discovered based on the tips.
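The paper itself concerns formal modelling and animation of MCS protocol, not an implementation. As a rough, hypothetical illustration of the protocol being verified, the following is a minimal MCS-style queue lock sketched in Python: each waiting thread spins only on a flag in its own queue node, and lock handoff follows the queue order. The helper mutex that emulates the atomic fetch-and-store on the tail pointer is an assumption of this sketch (real MCS relies on hardware SWAP/CAS instructions), and all names here (`MCSNode`, `MCSLock`, `run_demo`) are illustrative, not from the paper.

```python
import threading
import time

class MCSNode:
    """Per-thread queue node: a spin flag and a link to the successor."""
    __slots__ = ("locked", "next")
    def __init__(self):
        self.locked = False
        self.next = None

class MCSLock:
    """MCS-style queue lock. A small mutex stands in for the atomic
    fetch-and-store / compare-and-swap on `tail` that real MCS uses."""
    def __init__(self):
        self.tail = None
        self._swap = threading.Lock()

    def acquire(self, node):
        node.next = None
        node.locked = True
        with self._swap:                      # emulated fetch-and-store
            pred, self.tail = self.tail, node
        if pred is not None:
            pred.next = node                  # link behind predecessor
            while node.locked:                # local spinning on own node
                time.sleep(0)

    def release(self, node):
        if node.next is None:
            with self._swap:                  # emulated compare-and-swap
                if self.tail is node:
                    self.tail = None          # no successor: lock is free
                    return
            # a successor swapped the tail but has not linked itself yet
            while node.next is None:
                time.sleep(0)
        node.next.locked = False              # hand the lock to successor

def run_demo(num_threads=4, increments=200):
    """Increment a shared counter under the lock; mutual exclusion
    guarantees the final count equals num_threads * increments."""
    lock = MCSLock()
    counter = 0
    def worker():
        nonlocal counter
        node = MCSNode()
        for _ in range(increments):
            lock.acquire(node)
            counter += 1
            lock.release(node)
    threads = [threading.Thread(target=worker) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter
```

The mutual exclusion property the paper proves corresponds here to the invariant that at most one thread is between `acquire` and `release` at any time, which is what makes the counter total exact.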

