Assessing and improving robustness of psychological research findings in four steps.

2021 ◽  
Author(s):  
Michele B. Nuijten

Increasing evidence indicates that many published findings in psychology may be overestimated or even false. An often-heard response to this “replication crisis” is to replicate more: replication studies should weed out false positives over time and increase the robustness of psychological science. However, replications take time and money – resources that are often scarce. In this chapter, I propose an efficient alternative strategy: a four-step robustness check that first focuses on verifying reported numbers through reanalysis, before replicating studies in a new sample.

2021 ◽  
Author(s):  
Benjamin George Farrar ◽  
Christopher Krupenye ◽  
Alba Motes Rodrigo ◽  
Claudio Tennie ◽  
Julia Fischer ◽  
...  

Replication is an important tool used to test and develop scientific theories. Areas of biomedical and psychological research have experienced a replication crisis, in which many published findings failed to replicate. Following this, researchers in many other scientific disciplines have become interested in the robustness of their own findings. This chapter examines replication in primate cognitive studies. First, it discusses the frequency and success of replication studies in primate cognition and explores the challenges researchers face when designing and interpreting replication studies across the wide range of research designs used in the field. Next, it discusses the type of research that can probe the robustness of published findings, especially when replication studies are difficult to perform. The chapter concludes with a discussion of the different roles that replication can play in primate cognition research.


2020 ◽  
Vol 24 (4) ◽  
pp. 321-334 ◽  
Author(s):  
Séamus A. Power ◽  
Gabriel Velez

Social psychologists are often criticized for failing to capture the dynamic nature of psychological processes. We present a novel framework to address this problem. The MOVE framework contends that to comprehend complex, contradictory, and divergent patterns of thought, affect, and behavior within changing, real-world contexts, it is necessary to undertake ecologically valid research that is attentive to the lived experiences and meaning-making processes of culturally embedded individuals over time. A focus on meanings, observations, viewpoints, and experiences is essential for social psychological research that holistically captures how people construct, understand, respond, position, and act over time within changing social, economic, and political contexts. To illustrate the utility of our proposition, we draw on classic social psychological studies and on multimethod fieldwork conducted during a period of rapid social and political change in Colombia during the peace process (2012–2017). We argue that the MOVE framework has the potential to advance psychological understanding of, and contributions to, individuals embedded in real, dynamic social and political contexts. We discuss the implications of this extended social psychological paradigm for advancing psychological science.


2018 ◽  
Author(s):  
Jonathon McPhetres

Concerns about the generalizability, veracity, and relevance of social psychological research often resurface within psychology. While many changes are being implemented to improve the integrity of published research and to clarify the publication record, less attention has been given to questions of relevance. In this short commentary, I offer my perspective on questions of relevance and present some data from the website Reddit. The data show that people care greatly about psychological research: social psychology studies are among the highest-upvoted on the subreddit r/science. However, upvotes on Reddit are unrelated to the metrics researchers use to gauge importance (e.g., impact factor, journal rankings, and citations), suggesting a disconnect between what psychologists and lay audiences see as relevant. I interpret these data in light of the replication crisis and suggest that the spotlight on our field underscores the need for reform. Whether we like it or not, people care about, share, and use psychological research in their lives, which means we should ensure that our findings are reported accurately and transparently.


2017 ◽  
Author(s):  
Ziyang Lyu ◽  
Kaiping Peng ◽  
Chuan-Peng Hu

Previous surveys showed that most students and researchers in psychology misinterpret p-values and confidence intervals (CIs), yet presenting results as CIs may help them make better statistical inferences. In this data report, we describe a dataset of 362 valid responses from students and researchers in China that replicates these misinterpretations. Part of these data was reported in [Hu, C.-P., Wang, F., Guo, J., Song, M., Sui, J., & Peng, K. (2016). The replication crisis in psychological research (in Chinese). Advances in Psychological Science, 24(9), 1504–1518. doi:10.3724/SP.J.1042.2016.01504]. This dataset can be used for educational purposes. It can also serve as pilot data for future studies on the relationship between the understanding of p-values/CIs and statistical inference based on them.
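The misinterpretation at issue is the frequentist meaning of a confidence interval: 95% refers to the long-run coverage of the *procedure*, not to a 95% probability that any single interval contains the true value. A minimal simulation (illustrative only; not part of the data report, all parameter values invented) makes that property concrete:

```python
import random
import statistics
from statistics import NormalDist

random.seed(42)
true_mean, sigma, n, reps = 10.0, 2.0, 100, 10_000
z = NormalDist().inv_cdf(0.975)  # ≈ 1.96 for a 95% interval

covered = 0
for _ in range(reps):
    sample = [random.gauss(true_mean, sigma) for _ in range(n)]
    m = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    # Does this particular interval happen to contain the true mean?
    if m - z * se <= true_mean <= m + z * se:
        covered += 1

coverage = covered / reps
print(f"Empirical coverage of nominal 95% CIs: {coverage:.3f}")
```

Each individual interval either contains the true mean or it does not; only the long-run proportion (here close to 0.95) is guaranteed by the procedure.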


2021 ◽  
Author(s):  
Michael Bosnjak ◽  
Christian Fiebach ◽  
David Thomas Mellor ◽  
Stefanie Mueller ◽  
Daryl Brian O'Connor ◽  
...  

Recent years have seen dramatic changes in research practices in psychological science. In particular, preregistration of study plans prior to conducting a study has been identified as an important tool to help increase the transparency of science and to improve the robustness of psychological research findings. This article presents the Psychological Research Preregistration-Quantitative (PRP-QUANT) Template produced by a Joint Psychological Societies Preregistration Task Force consisting of the American Psychological Association (APA), British Psychological Society (BPS) and German Psychological Society (DGPs), supported by the Center for Open Science (COS) and the Leibniz Institute for Psychology (ZPID). The goal of the Task Force was to provide the psychological community with a consensus template for the preregistration of quantitative research in psychology, one with wide coverage and the ability, if necessary, to adapt to specific journals, disciplines and researcher needs. This article covers the structure and use of the PRP-QUANT template, while outlining and discussing the benefits of its use for researchers, authors, funders and other relevant stakeholders. We hope that by introducing this template and by demonstrating the support of preregistration by major academic psychological societies, we will facilitate an increase in preregistration practices and thereby also the further advancement of transparency and knowledge-sharing in the psychological sciences.


2018 ◽  
Author(s):  
Michele B. Nuijten

SUMMARY DOCTORAL DISSERTATION: Psychology is facing a “replication crisis”: many psychological findings could not be replicated in novel samples, which has led to the growing concern that many published findings are overly optimistic or even false. In this dissertation, we investigated potential indicators of problems in the published psychological literature. In Part I, we looked at inconsistencies in reported statistical results in published psychology papers. To facilitate our research, we developed the free tool statcheck, a “spellchecker” for statistics. In Part II, we investigated bias in published effect sizes. We showed that in the presence of publication bias, the overestimation of effects can become worse when studies are combined. Indeed, in meta-analyses from the social sciences we found strong evidence that published effects are overestimated. These are worrying findings, and it is important to think about concrete solutions to improve the quality of psychological research. Among the solutions we propose are preregistration, replication, and transparency. We argue that to select the best strategies to improve psychological science, we need research on research: meta-research.
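The core idea behind a “spellchecker for statistics” is that a reported test statistic and its degrees of freedom imply a p-value, which can be recomputed and compared against the reported one. The actual statcheck tool is an R package covering t, F, χ², r, and z tests extracted from APA-formatted text; the sketch below is a much-simplified Python analogue handling only reported z statistics, with a hypothetical function name and a rounding tolerance chosen for illustration:

```python
import re
from statistics import NormalDist

def check_reported_z(result: str, tol: float = 0.005) -> bool:
    """Recompute the two-sided p-value implied by a reported z statistic
    and check it against the reported p, allowing rounding slack `tol`."""
    m = re.match(r"z\s*=\s*(-?[\d.]+),\s*p\s*=\s*([\d.]+)", result)
    z, p_reported = float(m.group(1)), float(m.group(2))
    p_recomputed = 2 * (1 - NormalDist().cdf(abs(z)))
    return abs(p_recomputed - p_reported) <= tol

print(check_reported_z("z = 2.20, p = .03"))  # consistent: recomputed p ≈ .028
print(check_reported_z("z = 2.20, p = .01"))  # flagged: reported p too small
```

Run at scale over a literature, this kind of check is what makes reporting inconsistencies measurable in the first place.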


2019 ◽  
Author(s):  
Farid Anvari ◽  
Daniel Lakens

Replication failures of past findings in several scientific disciplines, including psychology, medicine, and experimental economics, have created a ‘crisis of confidence’ among scientists. Psychological science has been at the forefront of tackling these issues, with discussions about replication failures and scientific self-criticisms of questionable research practices (QRPs) increasingly taking place in public forums. How this replicability crisis affects the public’s trust is a question yet to be answered by research. Whereas some researchers believe that the public’s trust will be positively impacted or maintained, others believe trust will be diminished. Because it is our field of expertise, we focus on trust in psychological science. We performed a study testing how public trust in past and future psychological research would be affected by being informed about (i) replication failures, (ii) replication failures and criticisms of QRPs, and (iii) replication failures, criticisms of QRPs, and proposed reforms. Results from a mostly European sample (N = 1129) showed that, compared with a control group, trust in past research was reduced when people were informed about these aspects of the replication crisis, whereas trust in future research was maintained, except when people were also informed about proposed reforms. Potential explanations are discussed.


2020 ◽  
Author(s):  
Jonas Tebbe ◽  
Emily Humble ◽  
Martin A. Stoffel ◽  
Lisa J. Tewes ◽  
Caroline Müller ◽  
...  

Replication studies are essential for assessing the validity of previous research findings and for probing their generality. However, it has proven challenging to reproduce the results of ecological and evolutionary studies, partly because of the complexity and lability of many of the phenomena being investigated, but also due to small sample sizes, low statistical power and publication bias. Additionally, replication is often considered too difficult in field settings where many factors are beyond the investigator’s control and where spatial and temporal dependencies may be strong. We investigated the feasibility of reproducing original research findings in the field of chemical ecology by attempting to replicate a previous study by our team on Antarctic fur seals (Arctocephalus gazella). In the original study, skin swabs from 41 mother-offspring pairs from two adjacent breeding colonies on Bird Island, South Georgia, were analysed using gas chromatography-mass spectrometry. Seals from the two colonies differed significantly in their chemical fingerprints, suggesting that colony membership may be chemically encoded, and mothers were also chemically similar to their pups, implying that phenotype matching may be involved in mother-offspring recognition. Here, we generated and analysed comparable chemical data from a non-overlapping sample of 50 mother-offspring pairs from the same two colonies five years later. The original results were corroborated in both hypothesis testing and estimation contexts, with p-values remaining highly significant and effect sizes, standardized between studies by bootstrapping the chemical data over individuals, being of comparable magnitude. We furthermore expanded the geographic coverage of our study to include pups from a total of six colonies around Bird Island. Significant chemical differences were observed in the majority of pairwise comparisons, indicating not only that patterns of colony membership persist over time, but also that chemical signatures are colony-specific in general. Our study systematically confirms and extends our previous findings, while also implying that temporal and spatial heterogeneity need not necessarily negate the reproduction and generalization of ecological research findings.
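The standardization step described above, bootstrapping over individuals to put effect sizes from two studies on a comparable footing, can be illustrated with a minimal sketch. This is not the authors’ actual pipeline (which operates on multivariate gas chromatography-mass spectrometry profiles); it bootstraps a Cohen’s d between two hypothetical colonies, with all data, sample sizes, and names invented:

```python
import random
import statistics

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.fmean(a) - statistics.fmean(b)) / pooled_var ** 0.5

def bootstrap_d(a, b, n_boot=2000, seed=1):
    """Resample individuals with replacement; return a 95% percentile CI for d."""
    rng = random.Random(seed)
    ds = sorted(
        cohens_d([rng.choice(a) for _ in a], [rng.choice(b) for _ in b])
        for _ in range(n_boot)
    )
    return ds[int(0.025 * n_boot)], ds[int(0.975 * n_boot)]

# Hypothetical data: one chemical summary value per individual seal
rng = random.Random(7)
colony_a = [rng.gauss(1.0, 0.5) for _ in range(50)]
colony_b = [rng.gauss(0.4, 0.5) for _ in range(50)]

d = cohens_d(colony_a, colony_b)
lo, hi = bootstrap_d(colony_a, colony_b)
print(f"Cohen's d = {d:.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```

Because the resampling unit is the individual, the resulting interval reflects sampling variability across seals, which is what makes effect sizes from two non-overlapping samples comparable.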


2021 ◽  
Author(s):  
Anne M. Scheel

Psychology’s replication crisis is typically conceptualised as the insight that the published literature contains a worrying amount of unreplicable, false-positive findings. At the same time, meta-scientific attempts to assess the crisis in more detail have reported substantial difficulty in identifying unambiguous definitions of the scientific claims in published articles and in determining how those claims are connected to the presented evidence. I argue that most claims in the literature are so critically underspecified that attempts to empirically evaluate them are doomed to failure: they are not even wrong. Meta-scientists should beware of the flawed assumption that the psychological literature is a collection of well-defined claims. To move beyond the crisis, psychologists must reconsider and rebuild the conceptual basis of their hypotheses before trying to test them.


1999 ◽  
Vol 4 (4) ◽  
pp. 205-218 ◽  
Author(s):  
David Magnusson

A description of two cases from my time as a school psychologist in the mid-1950s forms the background to the following question: has anything important happened since then in psychological research to help us better understand how and why individuals think, feel, act, and react as they do in real life, and how they develop over time? The studies serve as a background for some general propositions about the nature of the phenomena that concern us in developmental research, for a summary description of the developments in psychological research over the last 40 years as I see them, and for some suggestions about future directions.

