The Psychological Science Accelerator: Advancing Psychology through a Distributed Collaborative Network

2018 ◽  
Author(s):  
Gerit Pfuhl

Concerns have been growing about the veracity of psychological findings. Many findings in psychological science are based on studies with insufficient statistical power and non-representative samples, or may otherwise be limited to specific, ungeneralizable settings or populations. Large-scale collaboration, in which one or more research projects are conducted across multiple lab sites, offers a pragmatic solution to these and other current methodological challenges. The Psychological Science Accelerator (PSA) is a distributed network of laboratories designed to enable and support crowdsourced research projects. The PSA’s mission is to accelerate the accumulation of reliable and generalizable evidence in psychological science. Here, we describe the background, structure, principles, procedures, benefits, and challenges of the PSA. In contrast to other crowdsourced research networks, the PSA is ongoing (as opposed to time-limited), efficient (in terms of re-using structures and principles for different projects), decentralized, diverse (in terms of participants and researchers), and inclusive (of proposals, contributions, and other relevant input from anyone inside or outside of the network). The PSA and other approaches to crowdsourced psychological science will advance our understanding of mental processes and behaviors by enabling rigorous research and systematically examining its generalizability.

2018 ◽  
Vol 1 (4) ◽  
pp. 501-515 ◽  
Author(s):  
Hannah Moshontz ◽  
Lorne Campbell ◽  
Charles R. Ebersole ◽  
Hans IJzerman ◽  
Heather L. Urry ◽  
...  

Concerns about the veracity of psychological research have been growing. Many findings in psychological science are based on studies with insufficient statistical power and nonrepresentative samples, or may otherwise be limited to specific, ungeneralizable settings or populations. Crowdsourced research, a type of large-scale collaboration in which one or more research projects are conducted across multiple lab sites, offers a pragmatic solution to these and other current methodological challenges. The Psychological Science Accelerator (PSA) is a distributed network of laboratories designed to enable and support crowdsourced research projects. These projects can focus on novel research questions or replicate prior research in large, diverse samples. The PSA’s mission is to accelerate the accumulation of reliable and generalizable evidence in psychological science. Here, we describe the background, structure, principles, procedures, benefits, and challenges of the PSA. In contrast to other crowdsourced research networks, the PSA is ongoing (as opposed to time limited), efficient (in that structures and principles are reused for different projects), decentralized, diverse (in both subjects and researchers), and inclusive (of proposals, contributions, and other relevant input from anyone inside or outside the network). The PSA and other approaches to crowdsourced psychological science will advance understanding of mental processes and behaviors by enabling rigorous research and systematic examination of its generalizability.
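The statistical-power argument in this abstract can be made concrete with a small back-of-the-envelope calculation. The sketch below is illustrative only: the effect size, per-lab sample size, and number of sites are assumptions rather than figures from the article. It uses a normal approximation for a two-sample comparison of means to show how pooling participants across many lab sites turns an underpowered single-lab study into an adequately powered one.

# A minimal sketch (illustrative numbers, not from the article) of why
# multi-site pooling matters: normal-approximation power for a two-sided,
# two-sample comparison of means.
from math import sqrt, ceil
from scipy.stats import norm

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample test with effect size d."""
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(d * sqrt(n_per_group / 2) - z_crit)

def n_per_group_for_power(d, power=0.80, alpha=0.05):
    """Per-group sample size needed to reach the requested power."""
    z_crit = norm.ppf(1 - alpha / 2)
    z_pow = norm.ppf(power)
    return ceil(2 * ((z_crit + z_pow) / d) ** 2)

# Illustrative numbers only: a small effect (d = 0.2) studied in one lab
# with 50 participants per group versus 20 labs pooling 50 per group each.
print(power_two_sample(0.2, 50))         # about 0.17 -- badly underpowered
print(power_two_sample(0.2, 20 * 50))    # about 0.99 -- adequate once pooled
print(n_per_group_for_power(0.2, 0.80))  # about 393 per group for 80% power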


2018 ◽  
Author(s):  
Hannah Moshontz ◽  
Lorne Campbell ◽  
Charles R. Ebersole ◽  
Hans IJzerman ◽  
Heather L. Urry ◽  
...  

Concerns have been growing about the veracity of psychological research. Many findings in psychological science are based on studies with insufficient statistical power and nonrepresentative samples, or may otherwise be limited to specific, ungeneralizable settings or populations. Crowdsourced research, a type of large-scale collaboration in which one or more research projects are conducted across multiple lab sites, offers a pragmatic solution to these and other current methodological challenges. The Psychological Science Accelerator (PSA) is a distributed network of laboratories designed to enable and support crowdsourced research projects. These projects can focus on novel research questions, or attempt to replicate prior research, in large, diverse samples. The PSA’s mission is to accelerate the accumulation of reliable and generalizable evidence in psychological science. Here, we describe the background, structure, principles, procedures, benefits, and challenges of the PSA. In contrast to other crowdsourced research networks, the PSA is ongoing (as opposed to time-limited), efficient (in terms of re-using structures and principles for different projects), decentralized, diverse (in terms of participants and researchers), and inclusive (of proposals, contributions, and other relevant input from anyone inside or outside of the network). The PSA and other approaches to crowdsourced psychological science will advance our understanding of mental processes and behaviors by enabling rigorous research and systematically examining its generalizability.


2020 ◽  
Author(s):  
Julie Beshears ◽  
Biljana Gjoneska ◽  
Kathleen Schmidt ◽  
Gerit Pfuhl ◽  
Toni Saari ◽  
...  

Recent methodological reforms have succeeded in improving the rigor, accessibility, and transparency of psychological science, but these advances have not yet proliferated to certain subfields, including clinical psychology. Large-scale, crowdsourced collaborations offer clinical psychological scientists a way to conduct rigorous research on a scale not otherwise accessible to most researchers. The Psychological Science Accelerator (PSA) is an international collaborative network of psychological scientists that facilitates rigorous and generalizable research. In this chapter, we describe how the PSA can help clinical psychologists and clinical psychological science more broadly.


2020 ◽  
Author(s):  
Joshua Conrad Jackson ◽  
Joseph Watts ◽  
Johann-Mattis List ◽  
Ryan Drabble ◽  
Kristen Lindquist

Humans have been using language for thousands of years, but psychologists seldom consider what natural language can tell us about the mind. Here we propose that language offers a unique window into human cognition. After briefly summarizing the legacy of language analyses in psychological science, we show how methodological advances have made these analyses more feasible and insightful than ever before. In particular, we describe how two forms of language analysis—comparative linguistics and natural language processing—are already contributing to how we understand emotion, creativity, and religion, and overcoming methodological obstacles related to statistical power and culturally diverse samples. We summarize resources for learning both of these methods, and highlight the best way to combine language analysis techniques with behavioral paradigms. Applying language analysis to large-scale and cross-cultural datasets promises to provide major breakthroughs in psychological science.


2020 ◽  
Author(s):  
Joshua Conrad Jackson ◽  
Joseph Watts ◽  
Johann-Mattis List ◽  
Curtis Puryear ◽  
Ryan Drabble ◽  
...  

Humans have been using language for thousands of years, but psychologists seldom consider what natural language can tell us about the mind. Here we propose that language offers a unique window into human cognition. After briefly summarizing the legacy of language analyses in psychological science, we show how methodological advances have made these analyses more feasible and insightful than ever before. In particular, we describe how two forms of language analysis—comparative linguistics and natural language processing—are already contributing to how we understand emotion, creativity, and religion, and overcoming methodological obstacles related to statistical power and culturally diverse samples. We summarize resources for learning both of these methods, and highlight the best way to combine language analysis techniques with behavioral paradigms. Applying language analysis to large-scale and cross-cultural datasets promises to provide major breakthroughs in psychological science.


2020 ◽  
Vol 8 (1) ◽  
pp. 25-29 ◽  
Author(s):  
Matthew H. Goldberg ◽  
Sander van der Linden

In a large-scale replication effort, Klein et al. (2018, https://doi.org/10.1177/2515245918810225) investigate the variation in replicability and effect size across many different samples and settings. The authors concluded that, for any given effect being studied, heterogeneity across samples and settings does not explain failures to replicate. In the current commentary, we argue that the heterogeneity observed indeed has implications for replication failures, as well as for statistical power and theory development. We argue that psychological scientific research questions should be contextualized—considering how historical, political, or cultural circumstances might affect study results. We discuss how a perspectivist approach to psychological science is a fruitful way for designing research that aims to explain effect size heterogeneity.
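The heterogeneity statistics at issue in this commentary can be illustrated with a short sketch. The per-site effect sizes and sampling variances below are invented for illustration and are not Klein et al.'s data; the function computes Cochran's Q, the DerSimonian-Laird estimate of between-site variance (tau squared), and I squared, the share of observed variation attributable to heterogeneity rather than sampling error.

# A minimal sketch (made-up numbers, not Klein et al.'s data) of the
# heterogeneity statistics the commentary is concerned with.
import numpy as np

def heterogeneity(effects, variances):
    """Return Cochran's Q, DerSimonian-Laird tau^2, and I^2 for per-site effects."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # fixed-effect weights
    pooled = np.sum(w * effects) / np.sum(w)       # fixed-effect estimate
    q = np.sum(w * (effects - pooled) ** 2)        # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-site variance
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0  # proportion due to heterogeneity
    return q, tau2, i2

# Hypothetical per-site effect sizes (Cohen's d) and their sampling variances.
d = [0.45, 0.10, 0.30, -0.05, 0.60, 0.20]
v = [0.02, 0.03, 0.02, 0.04, 0.03, 0.02]
q, tau2, i2 = heterogeneity(d, v)
print(f"Q = {q:.1f}, tau^2 = {tau2:.3f}, I^2 = {i2:.0%}")

A large I^2 in such an analysis would signal exactly the kind of sample- and setting-dependence that the commentary argues should be theorized and designed for rather than treated as noise.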


2021 ◽  
pp. 174569162110048 ◽  
Author(s):  
Joshua Conrad Jackson ◽  
Joseph Watts ◽  
Johann-Mattis List ◽  
Curtis Puryear ◽  
Ryan Drabble ◽  
...  

Humans have been using language for millennia but have only just begun to scratch the surface of what natural language can reveal about the mind. Here we propose that language offers a unique window into psychology. After briefly summarizing the legacy of language analyses in psychological science, we show how methodological advances have made these analyses more feasible and insightful than ever before. In particular, we describe how two forms of language analysis—natural-language processing and comparative linguistics—are contributing to how we understand topics as diverse as emotion, creativity, and religion and overcoming obstacles related to statistical power and culturally diverse samples. We summarize resources for learning both of these methods and highlight the best way to combine language analysis with more traditional psychological paradigms. Applying language analysis to large-scale and cross-cultural datasets promises to provide major breakthroughs in psychological science.
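As a concrete illustration of these methods, the sketch below implements lexicon-based word counting, one of the simplest forms of natural language processing, using only the Python standard library. The tiny emotion lexicon and the example sentence are invented for illustration and are not taken from the article.

# A minimal, standard-library sketch of lexicon-based text analysis.
# The toy "emotion" lexicon and example text are invented for illustration.
import re
from collections import Counter

EMOTION_LEXICON = {
    "joy": {"happy", "delighted", "joy", "love"},
    "anger": {"angry", "furious", "rage", "hate"},
}

def tokenize(text):
    """Lowercase a text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def emotion_counts(text):
    """Count how many tokens fall into each emotion category."""
    tokens = tokenize(text)
    counts = Counter()
    for category, words in EMOTION_LEXICON.items():
        counts[category] = sum(1 for token in tokens if token in words)
    return counts

print(emotion_counts("I was delighted and full of joy, not angry at all."))
# Counter({'joy': 2, 'anger': 1})

Cross-cultural applications of this idea typically swap in validated lexicons and much larger corpora, but the counting logic stays this simple.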


2017 ◽  
Author(s):  
Rick Owen Gilmore ◽  
Michele Diaz ◽  
Brad Wyble ◽  
Tal Yarkoni

Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low, and has not improved recently; software errors in common analysis tools are common, and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remains the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflects this new sensibility. We review evidence that the field has begun to embrace new open research practices, and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery.


2019 ◽  
Author(s):  
Patrick S. Forscher ◽  
Balazs Aczel ◽  
Christopher R. Chartier ◽  
Erica D. Musser ◽  
Kai Tobias Horstmann ◽  
...  

We describe the data management bylaws for the Psychological Science Accelerator (PSA), a distributed network of laboratories dedicated to completing large-scale collaborative behavioral science projects. Our bylaws are organized around the principles of ethical data use, security, accuracy, usability, and transparency. We describe how these principles are embodied throughout the lifecycle of a PSA project, from project proposal to data release. In addition to setting out the policies and guidelines for the management of PSA data, we hope this document can provide a useful set of best practices for individual researchers thinking about the management of their own behavioral science data.
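As a purely hypothetical illustration (this is not the PSA's actual schema or bylaws), the sketch below shows one way the lifecycle stages and data-management principles named in the abstract could be represented and enforced in code.

# Hypothetical sketch only -- not the PSA's actual data-management schema.
from dataclasses import dataclass, field

LIFECYCLE_STAGES = ["proposal", "ethics_review", "data_collection",
                    "quality_checks", "analysis", "data_release"]

@dataclass
class ProjectDataRecord:
    name: str
    stage: str = "proposal"
    checks_passed: dict = field(default_factory=lambda: {
        "ethical_use": False, "security": False, "accuracy": False,
        "usability": False, "transparency": False})

    def advance(self):
        """Move to the next lifecycle stage once every principle is satisfied."""
        if not all(self.checks_passed.values()):
            raise ValueError("all data-management checks must pass first")
        i = LIFECYCLE_STAGES.index(self.stage)
        self.stage = LIFECYCLE_STAGES[min(i + 1, len(LIFECYCLE_STAGES) - 1)]

record = ProjectDataRecord(name="example-multisite-study")
record.checks_passed = {key: True for key in record.checks_passed}
record.advance()
print(record.stage)  # ethics_review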


2019 ◽  
Author(s):  
Matthew Goldberg ◽  
Sander van der Linden

In a large-scale replication effort, Klein et al. (2018) investigate the variation in replicability and effect size across many different samples and settings. The authors concluded that, for any given effect being studied, heterogeneity across samples and settings does not explain failures to replicate. In the current commentary, we argue that the heterogeneity observed indeed has implications for replication failures, as well as for statistical power and theory development. We argue that psychological scientific research questions should be contextualized—considering how historical, political, or cultural circumstances might affect study results. We discuss how a perspectivist approach to psychological science is a fruitful way for designing research that aims to explain effect size heterogeneity.

