A Call for Open Science in Giftedness Research

2017
Author(s): Matthew McBee, Matthew Makel, Scott J. Peters, Michael S. Matthews

The ruinous consequences of currently accepted practices in study design and data analysis have revealed themselves in the low reproducibility of findings in fields such as psychology, medicine, biology, and economics. Because giftedness research relies on the same underlying statistical and sociological paradigms, it is likely that our field also suffers from poor reproducibility and unreliable literature. This paper describes open science practices that will increase the rigor and trustworthiness of gifted education’s scientific processes and their associated findings: open data; open materials; and preregistration of hypotheses, design, sample size determination, and statistical analysis plans. Readers are directed to internet resources for facilitating open science.
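
As an illustration of the sample size determination step listed above, the sketch below shows how an a priori power analysis might be recorded in a preregistration. It is a minimal example rather than anything from the article: it assumes Python's statsmodels package, and the effect size, alpha, and power targets are hypothetical placeholders.

```python
# Hypothetical a priori sample size determination for a two-group comparison.
# The smallest effect size of interest, alpha, and power are placeholder values
# that a preregistration would justify explicitly.
import math
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.40,        # smallest effect of interest (Cohen's d)
    alpha=0.05,              # two-sided Type I error rate
    power=0.80,              # desired probability of detecting the effect
    alternative="two-sided",
)
print(f"Planned sample size: {math.ceil(n_per_group)} participants per group")
```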

2018, Vol 62 (4), pp. 374-388
Author(s): Matthew T. McBee, Matthew C. Makel, Scott J. Peters, Michael S. Matthews

Current practices in study design and data analysis have led to low reproducibility and replicability of findings in fields such as psychology, medicine, biology, and economics. Because gifted education research relies on the same underlying statistical and sociological paradigms, it is likely that it too suffers from these problems. This article discusses the origins of this poor replicability and introduces a set of open science practices that can increase the rigor and trustworthiness of gifted education’s scientific findings: preregistration, open data and open materials, registered reports, and preprints. Readers are directed to Internet resources for facilitating open science. To model these practices, a pre-peer-review preprint of this article is available at https://psyarxiv.com/nhuv3/.


2021, Vol 5 (Supplement_1), pp. 199-200
Author(s): Derek Isaacowitz

Some GSA journals are especially interested in promoting transparency and open science practices, reflecting how some subdisciplines in aging research are moving toward open science practices faster than others. In this talk, I will consider the transparency and open science practices that seem most relevant to aging researchers, such as preregistration, open data, open materials and code, sample size justification, and analytic tools for considering null effects. I will also discuss potential challenges to implementing these practices, as well as reasons why it is important to do so despite these challenges. The focus will be on pragmatic suggestions for researchers planning and conducting studies now that they hope to publish later.
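
One concrete "analytic tool for considering null effects" is an equivalence test. The sketch below is illustrative only and is not from the talk: it runs a two one-sided tests (TOST) procedure on simulated data using statsmodels, with hypothetical groups, sample sizes, and equivalence bounds.

```python
# Hypothetical TOST equivalence test: evidence that a group difference is small
# enough to treat as practically null. All values are simulated placeholders.
import numpy as np
from statsmodels.stats.weightstats import ttost_ind

rng = np.random.default_rng(42)
younger = rng.normal(loc=0.00, scale=1.0, size=80)  # simulated scores
older = rng.normal(loc=0.05, scale=1.0, size=80)

# Treat differences smaller than +/-0.5 raw units as practically equivalent.
p_value, lower_test, upper_test = ttost_ind(younger, older, low=-0.5, upp=0.5)
print(f"TOST p = {p_value:.3f} (p < .05 supports equivalence within the bounds)")
```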


PLoS ONE, 2021, Vol 16 (5), pp. e0251268
Author(s): Russell T. Warne, Sam Golightly, Makai Black

Psychologists have investigated creativity for 70 years, and it is now seen as an important construct, both scientifically and because of its practical value to society. However, several fundamental problems remain unresolved, including a suitable definition of creativity and the ability of psychometric tests to measure divergent thinking (an important component of creativity) in a way that aligns with theory. It is this latter point that this registered report is designed to address. We propose to administer two divergent thinking tests (the verbal and figural versions of the Torrance Tests of Creative Thinking; TTCT) alongside an intelligence test (the International Cognitive Ability Resource test; ICAR). We will then subject the subscores from these tests to confirmatory factor analysis to determine which of nine theoretically plausible models best fits the data. When this study is completed, we hope to better understand the degree to which the TTCT and ICAR measure distinct constructs. This study will be conducted in accordance with all open science practices, including pre-registration, open data and syntax, and open materials (with the exception of copyrighted and confidential test stimuli).
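
A rough sense of the planned model comparison can be given in code. The sketch below is not the authors' registered analysis: it assumes the semopy package and invented subscore names, and it contrasts only two of the nine candidate structures, a single general factor versus separate divergent-thinking and intelligence factors.

```python
# Hypothetical confirmatory factor analysis comparison with semopy.
# Variable names and the data file are placeholders, not the study's materials.
import pandas as pd
import semopy

one_factor = """
g =~ ttct_fluency + ttct_originality + ttct_elaboration + icar_verbal + icar_matrix + icar_3d
"""

two_factor = """
divergent    =~ ttct_fluency + ttct_originality + ttct_elaboration
intelligence =~ icar_verbal + icar_matrix + icar_3d
divergent ~~ intelligence
"""

data = pd.read_csv("subscores.csv")  # placeholder path for the open dataset

for name, desc in [("one factor", one_factor), ("two factors", two_factor)]:
    model = semopy.Model(desc)
    model.fit(data)
    fit = semopy.calc_stats(model)
    print(name, fit[["CFI", "RMSEA", "AIC", "BIC"]].round(3).to_string(index=False))
```

Comparing fit indices and information criteria across all nine specified models would follow the same pattern.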


2009, Vol 28 (4), pp. 679-699
Author(s): Kaifeng Lu, Devan V. Mehrotra, Guanghan Liu

2017
Author(s): John Kitchener Sakaluk, Stephen David Short

Sexuality researchers frequently use exploratory factor analysis (EFA) to illuminate the distinguishable theoretical constructs assessed by a set of variables. EFA entails a substantial number of analytic decisions with respect to sample size determination and how factors are extracted, rotated, and retained. The available analytic options, however, are not all equally empirically rigorous. In the present paper, we discuss the commonly available options for conducting EFA and identify which constitute best practices. We also present the results of a methodological review of the analytic options for EFA used by sexuality researchers in over 200 EFAs, published in more than 160 articles and chapters from 1974 to 2014. Our review reveals that the best practices for EFA are actually those least frequently used by sexuality researchers. We introduce freely available analytic resources to make it easier for sexuality researchers to adhere to best practices when conducting EFAs in their own research.
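
A minimal sketch of the analytic choices the review favours (common-factor extraction, oblique rotation, and retention guided by parallel analysis) is shown below. It is not the authors' code: it assumes the factor_analyzer Python package and a hypothetical item-level dataset.

```python
# Hypothetical EFA following commonly recommended practices: minres extraction,
# oblimin rotation, and factor retention via parallel analysis.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("scale_items.csv")  # placeholder item-level responses

def parallel_analysis(data, n_sims=100, seed=1):
    """Retain factors whose eigenvalues exceed the 95th percentile of
    eigenvalues from random data of the same shape."""
    rng = np.random.default_rng(seed)
    fa = FactorAnalyzer(rotation=None)
    fa.fit(data)
    real_eigs, _ = fa.get_eigenvalues()
    sim_eigs = []
    for _ in range(n_sims):
        fa_sim = FactorAnalyzer(rotation=None)
        fa_sim.fit(rng.normal(size=data.shape))
        sim_eigs.append(fa_sim.get_eigenvalues()[0])
    threshold = np.percentile(np.array(sim_eigs), 95, axis=0)
    return int(np.sum(real_eigs > threshold))

n_factors = parallel_analysis(items)
fa = FactorAnalyzer(n_factors=n_factors, method="minres", rotation="oblimin")
fa.fit(items)
print(pd.DataFrame(fa.loadings_, index=items.columns).round(2))
```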


2021
Author(s): Debi LaPlante, Eric R. Louderback, Brett Abarbanel

Scientists across disciplines have begun to implement “open science” principles and practices, which are designed to enhance the quality, transparency, and replicability of scientific research. Yet studies examining the use of open science practices in social science fields such as psychology and economics show that awareness and use of such practices are often low. In gambling studies research, no studies to date have empirically investigated knowledge and use of open science practices. In the present study, we collected survey data about awareness and use of open science practices from 86 gambling studies research stakeholders who had attended a major international gambling studies conference in May 2019. We found that, as hypothesized, only a minority of gambling research stakeholders reported: 1) either some or extensive experience using open science research practices in general, and 2) either some or regular experience using specific open science practices, including study pre-registration, open materials/code, open data, and pre-print archiving. Most respondents indicated that replication was important for all studies in gambling research, and that genetic, neuroscience, and lab-based game characteristic studies were the areas most in need of replication. Our results have important implications for open science education initiatives and for contemporary research methodology in gambling studies.


2020
Author(s): David Moreau, Beau Gamble

Psychology researchers are rapidly adopting open science practices, yet clear guidelines on how to apply these practices to meta-analysis remain lacking. In this tutorial, we describe why open science is important in the context of meta-analysis in psychology, and suggest how to adopt the three main components of open science: preregistration, open materials, and open data. We first describe how to make the preregistration as thorough as possible—and how to handle deviations from the plan. We then focus on creating easy-to-read materials (e.g., search syntax, R scripts) to facilitate reproducibility and bolster the impact of a meta-analysis. Finally, we suggest how to organize data (e.g., literature search results, data extracted from studies) that are easy to share, interpret, and update as new studies emerge. For each step of the meta-analysis, we provide example templates, accompanied by brief video tutorials, and show how to integrate these practices into the Open Science Framework (https://osf.io/q8stz/).
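
To give a flavour of the kind of shareable, re-runnable analysis the tutorial advocates, the sketch below pools a small table of extracted effect sizes with a DerSimonian-Laird random-effects model. It is a generic illustration rather than the authors' template, and the study names and numbers are invented placeholders.

```python
# Hypothetical extracted-effects table and DerSimonian-Laird random-effects pooling.
# Rerunning the script after appending a new row updates the meta-analytic estimate.
import numpy as np
import pandas as pd

studies = pd.DataFrame({
    "study":  ["Smith 2015", "Lee 2018", "Garcia 2021"],  # invented studies
    "effect": [0.31, 0.12, 0.45],   # e.g., standardized mean differences
    "se":     [0.10, 0.08, 0.15],   # standard errors of each effect
})

# Fixed-effect weights and heterogeneity statistic Q.
w = 1 / studies["se"] ** 2
fixed = np.sum(w * studies["effect"]) / np.sum(w)
q = np.sum(w * (studies["effect"] - fixed) ** 2)

# DerSimonian-Laird between-study variance (tau^2), then random-effects pooling.
df = len(studies) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)
w_re = 1 / (studies["se"] ** 2 + tau2)
pooled = np.sum(w_re * studies["effect"]) / np.sum(w_re)
pooled_se = np.sqrt(1 / np.sum(w_re))

print(f"tau^2 = {tau2:.3f}, pooled effect = {pooled:.2f} "
      f"[{pooled - 1.96 * pooled_se:.2f}, {pooled + 1.96 * pooled_se:.2f}]")
```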

