Transparency, Documentation, and Open Science

2021 ◽  
Vol 5 (Supplement_1) ◽  
pp. 199-200
Author(s):  
Derek Isaacowitz

Abstract: Some GSA journals are especially interested in promoting transparency and open science practices, reflecting how some subdisciplines in aging are moving toward open science practices faster than others. In this talk, I will consider the transparency and open science practices that seem most relevant to aging researchers, such as preregistration, open data, open materials and code, sample size justification, and analytic tools for considering null effects. I will also discuss potential challenges to implementing these practices, as well as reasons why it is important to do so despite those challenges. The focus will be on pragmatic suggestions for researchers who are planning and conducting studies now that they hope to publish later.
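As a concrete illustration of the sample size justification mentioned above, the snippet below sketches a power-based justification in R; the pwr package and the target effect size (r = .21) are assumptions for the example, not recommendations from the talk.

    # Illustrative sample size justification (assumed tool and effect size).
    library(pwr)

    # Participants needed to detect a correlation of r = .21
    # with 80% power at a two-sided alpha of .05:
    pwr.r.test(r = 0.21, power = 0.80, sig.level = 0.05)

    # Null effects can be addressed in the same analysis plan with
    # equivalence tests (e.g., the TOSTER package) or Bayes factors
    # (e.g., the BayesFactor package).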


2017 ◽  
Author(s):  
Matthew McBee ◽  
Matthew Makel ◽  
Scott J. Peters ◽  
Michael S. Matthews

The ruinous consequences of currently accepted practices in study design and data analysis have revealed themselves in the low reproducibility of findings in fields such as psychology, medicine, biology, and economics. Because giftedness research relies on the same underlying statistical and sociological paradigms, it is likely that our field also suffers from poor reproducibility and an unreliable literature. This paper describes open science practices that will increase the rigor and trustworthiness of gifted education’s scientific processes and their associated findings: open data; open materials; and preregistration of hypotheses, design, sample size determination, and statistical analysis plans. Readers are directed to internet resources for facilitating open science.


PLoS ONE ◽  
2021 ◽  
Vol 16 (5) ◽  
pp. e0251268
Author(s):  
Russell T. Warne ◽  
Sam Golightly ◽  
Makai Black

Psychologists have investigated creativity for 70 years, and it is now seen as an important construct, both scientifically and because of its practical value to society. However, several fundamental unresolved problems persist, including a suitable definition of creativity and the ability of psychometric tests to measure divergent thinking—an important component of creativity—in a way that aligns with theory. It is this latter point that this registered report is designed to address. We propose to administer two divergent thinking tests (the verbal and figural versions of the Torrance Tests of Creative Thinking; TTCT) together with an intelligence test (the International Cognitive Ability Resource test; ICAR). We will then subject the subscores from these tests to confirmatory factor analysis to test which of nine theoretically plausible models best fits the data. When this study is completed, we hope to better understand the degree to which the TTCT and ICAR measure distinct constructs. This study will be conducted in accordance with all open science practices, including preregistration, open data and syntax, and open materials (with the exception of copyrighted and confidential test stimuli).
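To make the proposed analysis concrete, the sketch below shows how two candidate factor models could be specified and compared in R with lavaan; the subscore names and the particular pair of models are illustrative assumptions, not the registered specification.

    # Hedged sketch of the model comparison; variable names are placeholders.
    library(lavaan)

    one_factor <- '
      g =~ ttct_fluency + ttct_originality + ttct_elaboration +
           icar_verbal + icar_matrix + icar_rotation
    '

    two_factor <- '
      creativity   =~ ttct_fluency + ttct_originality + ttct_elaboration
      intelligence =~ icar_verbal + icar_matrix + icar_rotation
    '

    fit1 <- cfa(one_factor, data = dat)   # dat holds the subscores
    fit2 <- cfa(two_factor, data = dat)

    fitMeasures(fit1, c("cfi", "rmsea", "bic"))
    fitMeasures(fit2, c("cfi", "rmsea", "bic"))
    anova(fit1, fit2)                     # likelihood-ratio comparison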


2018 ◽  
Vol 62 (4) ◽  
pp. 374-388 ◽  
Author(s):  
Matthew T. McBee ◽  
Matthew C. Makel ◽  
Scott J. Peters ◽  
Michael S. Matthews

Current practices in study design and data analysis have led to low reproducibility and replicability of findings in fields such as psychology, medicine, biology, and economics. Because gifted education research relies on the same underlying statistical and sociological paradigms, it is likely that it too suffers from these problems. This article discusses the origins of this poor replicability and introduces a set of open science practices that can increase the rigor and trustworthiness of gifted education’s scientific findings: preregistration, open data and open materials, registered reports, and preprints. Readers are directed to Internet resources for facilitating open science. To model these practices, a pre-peer-review preprint of this article is available at https://psyarxiv.com/nhuv3/.


2021 ◽  
Author(s):  
Debi LaPlante ◽  
Eric R. Louderback ◽  
Brett Abarbanel

Scientists across disciplines have begun to implement “open science” principles and practices, which are designed to enhance the quality, transparency, and replicability of scientific research. Yet, studies examining the use of open science practices in social science fields such as psychology and economics show that awareness and use of such practices often are low. In gambling studies research, no studies to date have empirically investigated knowledge and use of open science practices. In the present study, we collected survey data about awareness and use of open science practices from 86 gambling studies research stakeholders who had attended a major international gambling studies conference in May 2019. We found that, as hypothesized, a minority of gambling research stakeholders reported 1) either some or extensive experience using open science research practices in general, and 2) either some or regular experience using specific open science practices, including study pre-registration, open materials/code, open data, and pre-print archiving. Most respondents indicated that replication was important for all studies in gambling research, and that genetic, neuroscience, and lab-based game characteristic studies were the areas most in need of replication. Our results have important implications for open science education initiatives and for contemporary research methodology in gambling studies.


2020 ◽  
Author(s):  
David Moreau ◽  
Beau Gamble

Psychology researchers are rapidly adopting open science practices, yet clear guidelines on how to apply these practices to meta-analysis remain lacking. In this tutorial, we describe why open science is important in the context of meta-analysis in psychology, and suggest how to adopt the three main components of open science: preregistration, open materials, and open data. We first describe how to make the preregistration as thorough as possible—and how to handle deviations from the plan. We then focus on creating easy-to-read materials (e.g., search syntax, R scripts) to facilitate reproducibility and bolster the impact of a meta-analysis. Finally, we suggest how to organize data (e.g., literature search results, data extracted from studies) that are easy to share, interpret, and update as new studies emerge. For each step of the meta-analysis, we provide example templates, accompanied by brief video tutorials, and show how to integrate these practices into the Open Science Framework (https://osf.io/q8stz/).
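To give a flavor of the kind of shareable analysis script the tutorial advocates, the sketch below runs a minimal random-effects meta-analysis in R with the metafor package; the file name and column names are hypothetical.

    # Minimal, shareable meta-analysis script (hypothetical file and columns).
    library(metafor)

    dat <- read.csv("extracted_effects.csv")          # one row per included study

    # Standardized mean differences from reported means, SDs, and ns
    dat <- escalc(measure = "SMD",
                  m1i = m_treat, sd1i = sd_treat, n1i = n_treat,
                  m2i = m_ctrl,  sd2i = sd_ctrl,  n2i = n_ctrl,
                  data = dat)

    res <- rma(yi, vi, data = dat, method = "REML")   # random-effects model
    summary(res)
    forest(res)                                       # forest plot for sharing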


2020 ◽  
Vol 36 (3) ◽  
pp. 263-279
Author(s):  
Isabel Steinhardt

Openness in science and education is increasing in importance within the digital knowledge society. So far, less attention has been paid to teaching Open Science in bachelor’s degree programs or in qualitative methods courses. The aim of this article is therefore to use a seminar example to explore which Open Science practices can be taught in qualitative research and how digital tools can be involved. The seminar focused on the following practices: open data practices; the use of the free and open-source tool “Collaborative Online Interpretation”; and the practice of participating, cooperating, collaborating, and contributing through participatory technologies and social networks. To learn Open Science practices, the students were involved in a qualitative research project on the “Use of digital technologies for the study and habitus of students”. The study shows that Open Data practices are easy to teach, whereas the use of free and open-source tools and participatory technologies for collaboration, participation, cooperation, and contribution is more difficult. In addition, a cultural shift would have to take place within German universities to promote Open Science practices in general.


2021 ◽  
Author(s):  
Tamara Kalandadze ◽  
Sara Ann Hart

The increasing adoption of open science practices in the last decade has been changing the scientific landscape across fields. However, developmental science has been relatively slow to adopt open science practices. To address this issue, we followed the format of Crüwell et al. (2019) and created summaries and an annotated list of informative and actionable resources discussing ten topics in developmental science: open science; reproducibility and replication; open data, materials, and code; open access; preregistration; registered reports; replication; incentives; and collaborative developmental science. This article offers researchers and students in developmental science a starting point for understanding how open science intersects with developmental science. After reading this article, the developmental scientist should understand the core tenets of open and reproducible developmental science and feel motivated to start applying open science practices in their workflow.


2020 ◽  
Author(s):  
Denis Cousineau

Born-Open Data experiments are encouraged as part of better open science practices. To be adopted, Born-Open Data practices must be easy to implement. Herein, I introduce a package for E-Prime such that data files are automatically saved to a GitHub repository. The BornOpenData package for E-Prime works seamlessly and performs the upload as soon as the experiment has finished, so there are no additional steps to perform beyond placing a package call within E-Prime. Because E-Prime files are not standard tab-separated files, I also provide an R function that retrieves the data directly from GitHub into a data frame ready to be analyzed. At this time, there are no standards as to what constitutes an adequate open-access data repository, so I propose a few suggestions that any future Born-Open Data system could follow for easier use by the research community.
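The package’s own retrieval function is not reproduced here; the lines below are only a generic sketch of the step it automates, reading a tab-delimited E-Prime export from a GitHub raw URL into a data frame. The URL and the UTF-16 encoding are assumptions.

    # Generic sketch, NOT the BornOpenData package's actual function.
    # E-Prime text exports are tab-delimited and commonly UTF-16 encoded.
    url <- "https://raw.githubusercontent.com/some-lab/some-study/main/Subject-001.txt"
    dat <- read.delim(url, sep = "\t", fileEncoding = "UCS-2LE")
    str(dat)   # one row per trial, ready for analysis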


2022 ◽  
Author(s):  
Bermond Scoggins ◽  
Matthew Peter Robertson

The scientific method is predicated on transparency, yet the pace at which transparent research practices are being adopted by the scientific community is slow. The replication crisis in psychology showed that published findings employing statistical inference are threatened by undetected errors, data manipulation, and data falsification. To mitigate these problems and bolster research credibility, open data and preregistration have increasingly been adopted in the natural and social sciences. While many political science and international relations journals have committed to implementing these reforms, the extent of open science practices is unknown. We bring large-scale text analysis and machine learning classifiers to bear on the question. Using population-level data (93,931 articles across the top 160 political science and IR journals between 2010 and 2021), we find that approximately 21% of all statistical inference papers have open data, and 5% of all experiments are preregistered. Despite this shortfall, the example of leading journals in the field shows that change is feasible and can be effected quickly.
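As a toy illustration of the general approach (the authors use trained machine learning classifiers, not the simple keyword flags assumed here), open data and preregistration statements can be flagged with regular expressions in R:

    # Toy keyword flags; the paper's actual classifiers are more sophisticated.
    library(stringr)

    articles <- data.frame(
      id   = c("art01", "art02"),
      text = c("Replication data and code are available on the OSF.",
               "We thank participants; materials are available upon request.")
    )

    articles$open_data <- str_detect(
      articles$text,
      regex("osf\\.io|dataverse|replication data|data are available",
            ignore_case = TRUE))

    articles$preregistered <- str_detect(
      articles$text,
      regex("preregist|pre-regist|registered report", ignore_case = TRUE))

    articles   # logical open-science flags per article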

