Open Science is for Aging Research, Too

2019, Vol 3 (4)
Author(s):  
Derek M Isaacowitz ◽  
Majse Lind

In response to concerns about the replicability of published research, some disciplines have used open science practices to try to enhance the credibility of published findings. Gerontology has been slow to embrace these changes. We argue that open science is important for aging research, both to reduce questionable research practices that may be prevalent in the field (such as an overabundance of reported significant age differences in the literature, underpowered studies, hypothesizing after the results are known, and a lack of belief updating when findings do not support theories) and to make research in the field more transparent overall. To ensure the credibility of gerontology research moving forward, we suggest concrete ways to incorporate open science into gerontology research: for example, by using available preregistration templates adaptable to a variety of study designs typical for aging research (even secondary analyses of existing data). Larger sample sizes may be achieved through many-lab collaborations. Though using open science practices may make some aspects of gerontology research more challenging, we believe that gerontology needs open science to ensure credibility now and in the future.

2020, Vol 6 (1)
Author(s):  
Hollen N. Reischer ◽  
Henry R. Cowan

A robust dialogue about the (un)reliability of psychological science findings has emerged in recent years. In response, metascience researchers have developed innovative tools to increase rigor, transparency, and reproducibility, stimulating rapid improvement and adoption of open science practices. However, existing reproducibility guidelines are geared toward purely quantitative study designs. This leaves some ambiguity as to how such guidelines should be implemented in mixed methods (MM) studies, which combine quantitative and qualitative research. Drawing on extant literature, our own experiences, and feedback from 79 self-identified MM researchers, the current paper addresses two main questions: (a) how and to what extent do existing reproducibility guidelines apply to MM study designs; and (b) can existing reproducibility guidelines be improved by incorporating best practices from qualitative research and epistemology? In answer, we offer 10 key recommendations for use within and outside of MM research. Finally, we argue that good science and good ethical practice are mutually reinforcing and lead to meaningful, credible science.


2020
Author(s):  
Olmo Van den Akker ◽  
Laura Danielle Scherer ◽  
Jelte M. Wicherts ◽  
Sander Koole

So-called “open science practices” seek to improve research transparency and methodological rigor. What do emotion researchers think about these practices? To address this question, we surveyed active emotion researchers (N = 144) in October 2019 about their attitudes toward several open science practices. Overall, the majority of emotion researchers had positive attitudes toward open science practices and expressed a willingness to engage in them. Emotion researchers on average believed that replicability would improve by publishing more negative findings, by requiring open data and materials, and by conducting studies with larger sample sizes. Direct replications, multi-lab studies, and preregistration were all seen as beneficial to the replicability of emotion research. Researchers believed that more direct replications would be conducted if replication studies received increased funding, more citations, and easier publication in high-impact journals. They also believed that preregistration would be stimulated by providing researchers with more information about its benefits and more guidance on its effective application. Overall, these findings point to considerable momentum with regard to open science among emotion researchers. This momentum may be leveraged to achieve a more robust emotion science.


2020
Author(s):  
Evan Mayo-Wilson ◽  
Sean Grant ◽  
Lauren Supplee

Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based”, clearinghouses evaluate empirical research using published standards of evidence that focus on study design features. Study designs that support causal inferences are necessary but insufficient for intervention evaluations to produce true results. The use of open science practices can improve the probability that evaluations produce true results and increase trust in research. In this study, we examined the degree to which the policies, procedures, and practices of 10 federal evidence clearinghouses consider the transparency, openness, and reproducibility of intervention evaluations. We found that seven clearinghouses consider at least one open science practice: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, materials sharing, and citation standards. Clearinghouse processes and standards could be updated to promote research transparency and reproducibility by reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could improve research quality, increase trustworthiness of evidence used for policy making, and support the evidence ecosystem to adopt open science practices.


2019, Vol 30 (2), pp. 111-123
Author(s):  
Kendal N. Smith ◽  
Matthew C. Makel

In response to concerns about the credibility of many published research findings, open science reforms such as preregistration, data sharing, and alternative forms of publication are being increasingly adopted across scientific communities. Although journals on giftedness and advanced academic research have already implemented several of these practices, they remain unfamiliar to some researchers. In this informal conversation, Kendal Smith and Matthew Makel discuss how they came to know and use open science practices, the values of open science, benefits and objections, and their future aspirations for open science practices in gifted education research. Their conversation aims to help make open science practices more understandable and actionable for both early career and established researchers.


2021
Author(s):  
Bert N Bakker ◽  
Kokil Jaidka ◽  
Timothy Dörr ◽  
Neil Fasching ◽  
Yphtach Lelkes

Recent contributions have questioned the credibility of quantitative communication research. While questionable research practices (QRPs) are believed to be widespread, evidence for this belief derives primarily from other disciplines. It is therefore largely unknown to what extent QRPs are used in quantitative communication research and whether researchers embrace open research practices (ORPs). We surveyed first and corresponding authors of publications in the top 20 journals in communication science. Many researchers report using one or more QRPs. We find widespread pluralistic ignorance: QRPs are generally rejected, but researchers believe they are prevalent. At the same time, we find optimism about the use of open science practices. In all, our study has implications for theories in communication that rely upon a cumulative body of empirical work: these theories are negatively affected by QRPs but can gain credibility if based upon ORPs. We outline an agenda for moving forward as a discipline.


2020
Author(s):  
Anthony J. Roberson ◽  
Ryan L. Farmer ◽  
Steven Shaw ◽  
Shelley Upton ◽  
Imad Zaheer

Trustworthy scientific evidence is essential if school psychologists are to use evidence-based practices to solve the big problems students, teachers, and schools face. Open science practices promote transparency, accessibility, and robustness of research findings, which increases the trustworthiness of scientific claims. Put simply, when researchers, trainers, and practitioners can ‘look under the hood’ of a study, (a) the researchers who conducted the study are likely to be more cautious, (b) reviewers are better able to engage the self-correcting mechanisms of science, and (c) readers have more reason to trust the research findings. We discuss questionable research practices that reduce the trustworthiness of evidence; specific open science practices; applications specific to researchers, trainers, and practitioners in school psychology; and next steps in moving the field toward openness and transparency.


2021, Vol 31 (1), pp. 1-29
Author(s):  
Sven Vlaeminck

In the social sciences, and particularly in economics, studies have frequently reported a lack of reproducibility of published research. Most often, this is due to the unavailability of the data needed to reproduce a study's findings. Over the past years, however, debates on open science practices and reproducible research have grown stronger and louder among research funders, learned societies, and research organisations, many of which have started to implement data policies to overcome these shortcomings. Against this background, the article asks whether there have been changes in the way economics journals handle data and other materials that are crucial to reproducing the findings of empirical articles. For this purpose, all journals listed in the Clarivate Analytics Journal Citation Reports edition for economics were evaluated for policies on the disclosure of research data. The article describes the characteristics of these data policies and explicates their requirements. Moreover, it compares the current findings with the situation some years ago. The results show significant changes in the way journals handle data in the publication process. Research libraries can use the findings of this study in their advisory activities to best support researchers in submitting and providing data as required by journals.


Author(s):  
Toby Prike

Recent years have seen large changes to research practices within psychology and a variety of other empirical fields, in response to the discovery (or rediscovery) of the pervasiveness and potential impact of questionable research practices, coupled with well-publicised failures to replicate published findings. In response, and as part of a broader open science movement, a variety of changes to research practice have begun to be implemented, such as publicly sharing data, analysis code, and study materials, as well as preregistering research questions, study designs, and analysis plans. This chapter outlines the relevance and applicability of these issues to computational modelling, highlighting the importance of good research practices for modelling endeavours, as well as the potential of provenance modelling standards, such as PROV, to help discover and minimise the extent to which modelling is affected by unreliable research findings from other disciplines.


2021
Author(s):  
Jason Chin ◽  
Justin Pickett ◽  
Simine Vazire ◽  
Alex O. Holcombe

Objectives. Questionable research practices (QRPs) lead to incorrect research results and contribute to irreproducibility in science. Researchers and institutions have proposed open science practices (OSPs) to improve the detectability of QRPs and the credibility of science. We examine the prevalence of QRPs and OSPs in criminology, and researchers’ opinions of those practices. Methods. We administered an anonymous survey to authors of articles published in criminology journals. Respondents self-reported their own use of 10 QRPs and 5 OSPs. They also estimated the prevalence of use by others, and reported their attitudes toward the practices. Results. QRPs and OSPs are both common in quantitative criminology, about as common as they are in other fields. Criminologists who responded to our survey support using QRPs in some circumstances, but are even more supportive of using OSPs. We did not detect a significant relationship between methodological training and either QRP or OSP use. Support for QRPs is negatively and significantly associated with support for OSPs. Perceived prevalence estimates for some practices resembled a uniform distribution, suggesting criminologists have little knowledge of the proportion of researchers that engage in certain questionable practices. Conclusions. Most quantitative criminologists in our sample use QRPs, and many use multiple QRPs. The substantial prevalence of QRPs raises questions about the validity and reproducibility of published criminological research. We found promising levels of OSP use, albeit at levels lagging what researchers endorse. The findings thus suggest that additional reforms are needed to decrease QRP use and increase the use of OSPs.

