Open Science: A Candid Conversation

2019 ◽  
Vol 30 (2) ◽  
pp. 111-123
Author(s):  
Kendal N. Smith ◽  
Matthew C. Makel

In response to concerns about the credibility of many published research findings, open science reforms such as preregistration, data sharing, and alternative forms of publication are being increasingly adopted across scientific communities. Although journals in giftedness and advanced academics research have already implemented several of these practices, they remain unfamiliar to some researchers. In this informal conversation, Kendal Smith and Matthew Makel discuss how they came to know and use open science practices, the values of open science, benefits and objections, and their future aspirations for open science practices in gifted education research. Their conversation aims to help make open science practices more understandable and actionable for both early career and established researchers.


2020 ◽  
Vol 43 (2) ◽  
pp. 91-107
Author(s):  
Matthew C. Makel ◽  
Kendal N. Smith ◽  
Erin M. Miller ◽  
Scott J. Peters ◽  
Matthew T. McBee

Existing research practices in gifted education leave many areas for potential improvement if they are to provide useful, generalizable evidence to various stakeholders. In this article, we first review the field’s current research practices and consider the quality and utility of its research findings. Next, we discuss how open science practices increase the transparency of research so readers can more effectively evaluate its validity. Third, we introduce five large-scale collaborative research models that are being used in other fields and discuss how they could be implemented in gifted education research. Finally, we review potential challenges and limitations of implementing collaborative research models in gifted education. We believe greater use of large-scale collaboration will help the field overcome some of its methodological challenges and provide more precise and accurate information about gifted education.


2021 ◽  
Author(s):  
Kathryn R. Wentzel

In this article, I comment on the potential benefits and limitations of open science reforms for improving the transparency and accountability of research and enhancing the credibility of research findings within communities of policy and practice. Specifically, I discuss the role of replication and reproducibility of research in promoting better-quality studies, the identification of generalizable principles, and relevance for practitioners and policymakers. Second, I suggest that greater attention to theory might contribute to the impact of open science practices, and I discuss ways in which theory has implications for sampling, measurement, and research design. Ambiguities concerning the aims of preregistration and registered reports are also highlighted. In conclusion, I discuss structural roadblocks to open science reform and reflect on the relevance of these reforms for educational psychology.


2021 ◽  
Vol 03 ◽  
Author(s):  
Danny Kingsley

The nature of the research endeavour is changing rapidly and requires a wide set of skills beyond the research focus. Aspects of researcher training ‘beyond the bench’ are delivered by different sections of an institution, including the research office, the media office, and the library. In Australia, researcher training in open access, research data management, and other aspects of open science is primarily offered by librarians. But what training do librarians receive in scholarly communication within their librarianship degrees? For a degree in librarianship and information science to be offered, it must be accredited by the Australian Library and Information Association (ALIA), with a curriculum based on ALIA’s lists of skills and attributes. However, these lists contain no reference to key open research terms and have almost no overlap with the core competencies in scholarly communication identified by the North American Serials Interest Group and an international Joint Task Force. Over the past decade, teaching by academics in universities has been professionalised through courses and qualifications. Those responsible for researcher training within universities, and the material they offer, should likewise meet an agreed accreditation standard. This paper argues that there is a clear need to develop parallel standards around ‘research practice’ training for PhD students and Early Career Researchers, and that those delivering this training should be able to demonstrate their skills against these standards. Models for developing accreditation standards are starting to emerge, with the recent launch of the Centre for Academic Research Quality and Improvement in the UK. Multiple organisations, both grassroots and long-established, would be able to contribute to this project.


2018 ◽  
Author(s):  
Christopher P G Allen ◽  
David Marc Anton Mehler

The movement towards open science is an unavoidable consequence of seemingly pervasive failures to replicate previous research. This transition comes with great benefits but also significant challenges that are likely to afflict those who carry out the research, usually Early Career Researchers (ECRs). Here, we describe key benefits, including reputational gains, increased chances of publication, and a broader increase in the reliability of research. These are balanced by challenges we have encountered, namely increased costs in flexibility and time and issues with the current incentive structure, all of which seem to affect ECRs acutely. Although there are major obstacles to the early adoption of open science, open science practices should, on the whole, both benefit ECRs and improve the quality and plausibility of research. We review three benefits and three challenges, and offer suggestions from the perspective of ECRs for moving towards open science practices.


2018 ◽  
Author(s):  
Gerit Pfuhl ◽  
Jon Grahe

Recent years have seen a revolution in publishing and broad support for open access publishing. There has been a slower acceptance of, and transition to, other open science principles such as open data, open materials, and preregistration. To accelerate the transition and make open science the new standard, the Collaborative Replications and Education Project (CREP; http://osf.io/wfc6u/) was launched in 2013, hosted on the Open Science Framework (osf.io). The OSF works like a preprint server, collecting partial data within each individual contributor’s project. CREP introduces open science at the start of academic research, facilitating student research training in open science and solidifying behavioral science results. The CREP team attempts to achieve this by inviting contributors to replicate one of several studies selected for scientific impact and suitability for undergraduates to complete during one academic term. Contributors follow clear protocols, and students interact with a CREP team that reviews their materials and a video of the procedure to ensure quality data collection while the students learn scientific practices and methods. By combining multiple replications from undergraduates across the globe, the findings can be pooled for meta-analysis, contributing to generalizable and replicable research findings; CREP is careful not to interpret any single result. CREP has recently joined forces with the Psychological Science Accelerator (PsySciAcc), a globally distributed network of psychological laboratories accelerating the accumulation of reliable and generalizable results in the behavioral sciences. The Department of Psychology at UiT is part of the network and has two ongoing CREP studies, adopting open science practices early on. In this talk, we will present our experiences of conducting transparent, replicable research and of working with preprints, from both a supervisor and a researcher perspective.
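To make the pooling step concrete, below is a minimal inverse-variance (fixed-effect) meta-analysis sketch in Python. The effect sizes and variances are illustrative placeholders, not CREP data, and a real multi-site analysis might well use a random-effects model instead.

    # Minimal fixed-effect (inverse-variance) pooling of per-site effect
    # sizes, as in a multi-site replication project. Numbers are made up.
    import math

    effects = [0.32, 0.18, 0.45, 0.25]    # per-site effect size estimates
    variances = [0.04, 0.06, 0.05, 0.03]  # their sampling variances

    weights = [1.0 / v for v in variances]                  # w_i = 1 / v_i
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                      # pooled standard error

    low, high = pooled - 1.96 * se, pooled + 1.96 * se      # 95% confidence interval
    print(f"pooled effect = {pooled:.3f}, 95% CI = ({low:.3f}, {high:.3f})")

Pooling this way shrinks the standard error as sites are added, which is why aggregated student replications can yield precise estimates that no single classroom sample could.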


2020 ◽  
Vol 44 (1-2) ◽  
pp. 1-2
Author(s):  
Harrison Dekker ◽  
Amy Riegelman

As guest editors, we are excited to publish this special double issue of IASSIST Quarterly. The topics of reproducibility, replicability, and transparency have been addressed in past issues of IASSIST Quarterly and at the IASSIST conference, but this double issue is entirely focused on these issues. In recent years, efforts “to improve the credibility of science by advancing transparency, reproducibility, rigor, and ethics in research” have gained momentum in the social sciences (Center for Effective Global Action, 2020). While few question the spirit of the reproducibility and research transparency movement, it faces significant challenges because it goes against the grain of established practice. We believe the data services community is in a unique position to help advance this movement given our data and technical expertise, training and consulting work, international scope, established role in data management and preservation, and more. As evidence of the movement, several initiatives exist to support research reproducibility infrastructure and data preservation efforts:

- Center for Open Science (COS) / Open Science Framework (OSF)[i]
- Berkeley Initiative for Transparency in the Social Sciences (BITSS)[ii]
- CUrating for REproducibility (CURE)[iii]
- Project TIER[iv]
- Data Curation Network[v]
- UK Reproducibility Network[vi]

While many new initiatives have launched in recent years, the data services community was supporting reproducibility in a variety of ways (e.g., data management, data preservation, metadata standards) through well-established consortia such as the Inter-university Consortium for Political and Social Research (ICPSR) long before the phrase “reproducibility crisis” entered common use and Ioannidis published the essay “Why Most Published Research Findings Are False” (Ioannidis, 2005). The articles in this issue address several important aspects of reproducible research:

- Identification of barriers to reproducibility and solutions to such barriers
- Evidence synthesis as related to transparent reporting and reproducibility
- Reflection on how information professionals, researchers, and librarians perceive the reproducibility crisis and how they can partner to help solve it

The issue begins with “Reproducibility literature analysis,” which looks at existing resources and literature to identify barriers to reproducibility and potential solutions. The authors have compiled a comprehensive list of resources with annotations that include definitions of key concepts pertinent to the reproducibility crisis. The next article addresses data reuse from the perspective of a large research university. The authors examine both successful and failed instances of data reuse and identify best practices for librarians interested in conducting research involving the common forms of data collected in an academic library. Systematic reviews are a research approach that involves the quantitative and/or qualitative synthesis of data collected through a comprehensive literature review. “Methods reporting that supports reader confidence for systematic reviews in psychology” looks at the reproducibility of electronic literature searches reported in psychology systematic reviews. A fundamental challenge in reproducing or replicating computational results is the need for researchers to make available the code used in producing those results. But sharing code, and having it run correctly for another user, can present significant technical challenges.
In “Reproducibility, preservation, and access to research with ReproZip and ReproServer,” the authors describe open source software that they are developing to address these challenges. Taking a published article and attempting to reproduce its results is an exercise sometimes used in academic courses to highlight the inherent difficulty of the process. The final article in this issue, “ReprohackNL 2019: How libraries can promote research reproducibility through community engagement,” describes an innovative library-based variation on this exercise.

Harrison Dekker, Data Librarian, University of Rhode Island
Amy Riegelman, Social Sciences Librarian, University of Minnesota

References
Center for Effective Global Action (2020) About the Berkeley Initiative for Transparency in the Social Sciences. Available at: https://www.bitss.org/about (accessed 23 June 2020).
Ioannidis, J.P. (2005) ‘Why most published research findings are false’, PLoS Medicine, 2(8), p. e124. doi: https://doi.org/10.1371/journal.pmed.0020124

[i] https://osf.io
[ii] https://www.bitss.org/
[iii] http://cure.web.unc.edu
[iv] https://www.projecttier.org/
[v] https://datacurationnetwork.org/
[vi] https://ukrn.org


2021 ◽  
Author(s):  
Denys Wheatley

Knowing how to prepare, write and publish high-quality research papers can be challenging for scientists at all stages of their career. This manual guides readers through successfully framing and presenting research findings, as well as the processes involved in publishing in learned journals. It draws on the author's wealth of practical experience, from working in academic research for over 40 years and teaching scientific writing in over 20 countries, to gaining insights as a journal editor. Well-written and logical, it provides clear step-by-step instructions to enable readers to become more effective at writing articles, and navigating difficulties related to journal submission, the review process, editing and publication. It comprehensively covers themes such as publication ethics, along with current topics including Open Access publishing and pre-print servers. This is a useful, user-friendly guide for graduate students, early career scientists, and more experienced researchers, particularly in the life and medical sciences.


Author(s):  
Nigel Gilles Yoccoz

There is widespread discussion of a scientific crisis resulting from a lack of reproducibility of published scientific studies, exemplified by Ioannidis’ 2005 paper “Why most published research findings are false” and the 2015 Open Science Collaboration study assessing the reproducibility of psychological science. An often-cited reason for this reproducibility crisis is a fundamental misunderstanding of what statistical methods, and in particular P-values, can achieve. In the context of studies of ecology and evolution, I will show how 1) the pressure to publish “novel” results, 2) what Gelman has called the “garden of forking paths”, i.e. the fact that published analyses represent only one out of many possible analyses, and 3) the often fruitless dichotomy between null and alternative hypotheses have led to the present situation. While scientific progress depends on major breakthroughs, we also need to find a better balance between confirmatory research (understanding how known effects vary in size according to context) and exploratory, non-incremental research (finding new effects).
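The statistical core of this argument can be illustrated with the positive predictive value formula from Ioannidis (2005), PPV = (1 - beta)R / (R - beta*R + alpha), where R is the pre-study odds that a tested relationship is true, alpha is the Type I error rate, and 1 - beta is the power. A minimal Python sketch follows, with illustrative numbers that are assumptions, not figures from the talk:

    # Positive predictive value of a "significant" finding, following
    # Ioannidis (2005). The inputs below are illustrative assumptions.

    def ppv(alpha: float, power: float, R: float) -> float:
        """Probability that a statistically significant result is true.

        alpha: Type I error rate; power: 1 - beta; R: pre-study odds
        that the tested relationship is true.
        """
        return (power * R) / (power * R + alpha)

    # Conventional alpha = 0.05 and power = 0.80, but long-shot hypotheses
    # (1 true relationship per 19 false ones tested):
    print(ppv(0.05, 0.80, 1 / 19))  # ~0.46: most "positive" findings are false

The point of the arithmetic is that even with conventional error rates, low pre-study odds make a majority of significant results false, before any forking-paths flexibility is accounted for.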


Author(s):  
Lonni Besançon ◽  
Nathan Peiffer-Smadja ◽  
Corentin Segalas ◽  
Haiting Jiang ◽  
Paola Masuzzo ◽  
...  

In the last decade, Open Science principles have been successfully advocated for and are slowly being adopted in different research communities. In response to the COVID-19 pandemic, many publishers and researchers have sped up their adoption of Open Science practices, sometimes embracing them fully and sometimes only partially or in a sub-optimal manner. In this article, we express concerns about the violation of some Open Science principles and its potential impact on the quality of research output. We provide evidence of the misuse of these principles at different stages of the scientific process. We call for a wider adoption of Open Science practices in the hope that this work will encourage a broader endorsement of Open Science principles and serve as a reminder that science should always be a rigorous, reliable, and transparent process, especially in the context of a pandemic, where research findings are translated into practice even more rapidly. We provide all data and scripts at https://osf.io/renxy/.

