Questionable Research Practices and Open Science in Quantitative Criminology

Author(s):  
Jason Chin ◽  
Justin T. Pickett ◽  
Simine Vazire ◽  
Alex O. Holcombe
2019 ◽  
Vol 6 (12) ◽  
pp. 190738 ◽  
Author(s):  
Jerome Olsen ◽  
Johanna Mosen ◽  
Martin Voracek ◽  
Erich Kirchler

The replicability of research findings has recently been disputed across multiple scientific disciplines. As a constructive response, the research culture in psychology is undergoing fundamental changes, but investigations of the research practices that prompted these reforms have almost exclusively focused on academic researchers. By contrast, we investigated the statistical reporting quality and selected indicators of questionable research practices (QRPs) in psychology students' master's theses. In a total of 250 theses, we examined the use and magnitude of standardized effect sizes, along with statistical power, the consistency and completeness of reported results, and possible indications of p-hacking and further testing. Effect sizes were reported for 36% of focal tests (median r = 0.19), and only a single formal power analysis was reported for sample size determination (median observed power 1 − β = 0.67). Statcheck revealed inconsistent p-values in 18% of cases, 2% of which led to decision errors. There were no clear indications of p-hacking or further testing. We discuss our findings in light of promoting open science standards in teaching and student supervision.
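The core idea behind statcheck is to recompute the p-value implied by a reported test statistic and flag mismatches with the reported p. As a minimal illustration of that logic (statcheck itself is an R package with many more rules), here is a hedged Python sketch for a reported z test; the function names, the rounding-based tolerance, and the example values are my own assumptions, not statcheck's exact implementation:

```python
import math

def normal_sf(z):
    # Upper-tail (survival) probability of the standard normal distribution.
    return 0.5 * math.erfc(z / math.sqrt(2))

def check_z_test(z, reported_p, decimals=2, two_tailed=True):
    """Recompute the p-value implied by a reported z statistic and compare
    it with the reported p after rounding both to the precision at which
    the p-value was reported (statcheck-style consistency logic)."""
    p = normal_sf(abs(z))
    if two_tailed:
        p *= 2
    consistent = round(p, decimals) == round(reported_p, decimals)
    return consistent, p
```

For example, a reported "z = 2.35, p = .02" is consistent (the recomputed two-tailed p is about .019, which rounds to .02), whereas "z = 2.35, p = .05" would be flagged as inconsistent.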


2017 ◽  
Vol 48 (6) ◽  
pp. 365-371 ◽  
Author(s):  
Stefan Stürmer ◽  
Aileen Oeberst ◽  
Roman Trötschel ◽  
Oliver Decker

Abstract. Young researchers of today will shape the field in the future. In light of current debates about social psychology's research culture, this exploratory survey assessed early-career researchers' beliefs (N = 88) about the prevalence of questionable research practices (QRPs), potential causes, and open science as a possible solution. While there was relative consensus that outright fraud is an exception, a majority of participants believed that some QRPs are moderately to highly prevalent, which they attributed primarily to academic incentive structures. A majority of participants felt that open science is necessary to improve research practice. They indicated that they would consider some open science recommendations in the future, but they also expressed some reluctance. Limitations and implications of these findings are discussed.


2021 ◽  
Vol 12 ◽  
Author(s):  
Rens van de Schoot ◽  
Sonja D. Winter ◽  
Elian Griffioen ◽  
Stephan Grimmelikhuijsen ◽  
Ingrid Arts ◽  
...  

The popularity and use of Bayesian methods have increased across many research domains. The current article demonstrates how some less familiar Bayesian methods can be used. Specifically, we applied expert elicitation, testing for prior-data conflicts, the Bayesian Truth Serum, and testing for replication effects via Bayes factors in a series of four studies investigating the use of questionable research practices (QRPs). Scientifically fraudulent or unethical research practices have caused quite a stir in academia and beyond. Improving science starts with educating Ph.D. candidates: the scholars of tomorrow. In four studies involving 765 Ph.D. candidates, we investigated whether Ph.D. candidates can differentiate between ethical and unethical or even fraudulent research practices. We probed the Ph.D. candidates' willingness to publish research from such practices and tested whether this is influenced by pressure toward (un)ethical behavior from supervisors or peers. Furthermore, 36 academic leaders (deans, vice-deans, and heads of research) were interviewed and asked to predict how Ph.D. candidates would respond to different vignettes. Our study shows, and replicates, that some Ph.D. candidates are willing to publish results deriving from even blatantly fraudulent behavior: data fabrication. Additionally, some academic leaders underestimated this behavior, which is alarming. Academic leaders have to keep in mind that Ph.D. candidates can be under more pressure than they realize and might be susceptible to using QRPs. As an inspiring example and to encourage others to make their Bayesian work reproducible, we published data, annotated scripts, and detailed output on the Open Science Framework (OSF).


2021 ◽  
Author(s):  
Bradley David McAuliff ◽  
Melanie B. Fessinger ◽  
Anthony Perillo ◽  
Jennifer Torkildson Perillo

As the field of psychology and law begins to embrace more transparent and accessible science, many questions arise about what open science actually is and how to do it. In this chapter, we contextualize this reform by examining fundamental concerns about psychological research—irreproducibility and replication failures, false-positive errors, and questionable research practices—that threaten its validity and credibility. Next, we turn to psychology’s response by reviewing the concept of open science and explaining how to implement specific practices—preregistration, registered reports, open materials/data/code, and open access publishing—designed to make research more transparent and accessible. We conclude by weighing the implications of open science for the field of psychology and law, specifically with respect to how we conduct and evaluate research, as well as how we train the next generation of psychological scientists and share scientific findings in applied settings.


2020 ◽  
Author(s):  
Soufian Azouaghe ◽  
Adeyemi Adetula ◽  
Patrick S. Forscher ◽  
Dana Basnight-Brown ◽  
Nihal Ouherrou ◽  
...  

The quality of scientific research is assessed not only by its positive impact on socio-economic development and human well-being, but also by its contribution to the development of valid and reliable scientific knowledge. Thus, researchers, regardless of their scientific discipline, are expected to adopt research practices based on transparency and rigor. However, the history of science and the scientific literature teach us that some scientific results are not reliably reproducible (Ioannidis, 2005). This is what is commonly known as the "replication crisis", which concerns the natural sciences as well as the social sciences; psychology is no exception. First, we aim to address some aspects of the replication crisis and questionable research practices (QRPs). Second, we discuss how we can involve more labs in Africa in the global research process, especially the Psychological Science Accelerator (PSA). To these ends, we will develop a tutorial for labs in Africa highlighting open science practices. In addition, we emphasize that it is essential to identify African labs' needs, the factors that hinder their participation in the PSA, and the support needed from the Western world. Finally, we discuss how to make psychological science more participatory and inclusive.


2021 ◽  
Author(s):  
Bert N Bakker ◽  
Kokil Jaidka ◽  
Timothy Dörr ◽  
Neil Fasching ◽  
Yphtach Lelkes

Abstract. Recent contributions have questioned the credibility of quantitative communication research. While questionable research practices (QRPs) are believed to be widespread, evidence for this belief is primarily derived from other disciplines. Therefore, it is largely unknown to what extent QRPs are used in quantitative communication research and whether researchers embrace open research practices (ORPs). We surveyed first and corresponding authors of publications in the top-20 journals in communication science. Many researchers report using one or more QRPs. We find widespread pluralistic ignorance: QRPs are generally rejected, but researchers believe they are prevalent. At the same time, we find optimism about the use of open science practices. In all, our study has implications for theories in communication that rely upon a cumulative body of empirical work: these theories are negatively affected by QRPs but can gain credibility if based upon ORPs. We outline an agenda to move forward as a discipline.


2020 ◽  
Vol 54 (22) ◽  
pp. 1365-1371
Author(s):  
Fionn Büttner ◽  
Elaine Toomey ◽  
Shane McClean ◽  
Mark Roe ◽  
Eamonn Delahunt

Questionable research practices (QRPs) are intentional and unintentional practices that can occur when designing, conducting, analysing, and reporting research, producing biased study results. Sport and exercise medicine (SEM) research is vulnerable to the same QRPs that pervade the biomedical and psychological sciences, producing false-positive results and inflated effect sizes. Approximately 90% of biomedical research reports supported study hypotheses, provoking suspicion about the field-wide presence of systematic biases that favour study findings confirming researchers' expectations. In this education review, we introduce three common QRPs (i.e., HARKing, p-hacking, and cherry-picking), perform a cross-sectional study to assess the proportion of original SEM research that reports supported study hypotheses, and draw attention to existing solutions and resources to overcome QRPs that manifest in exploratory research. We hypothesised that ≥85% of original SEM research studies would report supported study hypotheses. Two independent assessors systematically identified, screened, included, and extracted study data from original research articles published between 1 January 2019 and 31 May 2019 in the British Journal of Sports Medicine, Sports Medicine, the American Journal of Sports Medicine, and the Journal of Orthopaedic & Sports Physical Therapy. We extracted data on whether studies reported that the primary hypothesis was supported or rejected by the results. Study hypotheses, methodologies, and analysis plans were preregistered on the Open Science Framework. One hundred and twenty-nine original research studies reported at least one study hypothesis, of which 106 (82.2%) reported hypotheses that were supported by study results. Of the 106 studies reporting that primary hypotheses were supported by study results, 75 (70.8%) reported that the primary hypothesis was fully supported, and 28 (26.4%) reported that it was partially supported. We detail open science practices and resources that aim to safeguard against QRPs that belie the credibility and replicability of original research findings.


Author(s):  
Toby Prike

Abstract. Recent years have seen large changes to research practices within psychology and a variety of other empirical fields in response to the discovery (or rediscovery) of the pervasiveness and potential impact of questionable research practices, coupled with well-publicised failures to replicate published findings. In response to this, and as part of a broader open science movement, a variety of changes to research practice have started to be implemented, such as publicly sharing data, analysis code, and study materials, as well as the preregistration of research questions, study designs, and analysis plans. This chapter outlines the relevance and applicability of these issues to computational modelling, highlighting the importance of good research practices for modelling endeavours, as well as the potential of provenance modelling standards, such as PROV, to help discover and minimise the extent to which modelling is impacted by unreliable research findings from other disciplines.


2021 ◽  
Author(s):  
Jason Chin ◽  
Justin Pickett ◽  
Simine Vazire ◽  
Alex O. Holcombe

Objectives. Questionable research practices (QRPs) lead to incorrect research results and contribute to irreproducibility in science. Researchers and institutions have proposed open science practices (OSPs) to improve the detectability of QRPs and the credibility of science. We examine the prevalence of QRPs and OSPs in criminology, and researchers' opinions of those practices. Methods. We administered an anonymous survey to authors of articles published in criminology journals. Respondents self-reported their own use of 10 QRPs and 5 OSPs. They also estimated the prevalence of use by others, and reported their attitudes toward the practices. Results. QRPs and OSPs are both common in quantitative criminology, about as common as they are in other fields. Criminologists who responded to our survey support using QRPs in some circumstances, but are even more supportive of using OSPs. We did not detect a significant relationship between methodological training and either QRP or OSP use. Support for QRPs is negatively and significantly associated with support for OSPs. Perceived prevalence estimates for some practices resembled a uniform distribution, suggesting criminologists have little knowledge of the proportion of researchers that engage in certain questionable practices. Conclusions. Most quantitative criminologists in our sample use QRPs, and many use multiple QRPs. The substantial prevalence of QRPs raises questions about the validity and reproducibility of published criminological research. We found promising levels of OSP use, albeit lagging the levels that researchers endorse. The findings thus suggest that additional reforms are needed to decrease QRP use and increase the use of OSPs.
