Privacy versus Open Science

2019 ◽  
Author(s):  
Simon Dennis ◽  
Paul Michael Garrett ◽  
Hyungwook Yim ◽  
Jihun Hamm ◽  
Adam F Osth ◽  
...  

Pervasive internet and sensor technologies promise to revolutionize psychological science. However, the data collected using these technologies are often very personal; indeed, their value is often directly related to how personal they are. At the same time, driven by the replication crisis, there is a sustained push to publish data to open repositories. These movements are in fundamental conflict. In this paper, we propose a way to navigate this issue. We argue that there are significant advantages to be gained by ceding the ownership of data to the participants who generate it. Then we provide desiderata for a privacy-preserving platform. In particular, we suggest that researchers should use an interface to perform experiments and run analyses rather than observing the stimuli themselves. We argue that this method not only improves privacy but will also encourage greater compliance with good research practices than is possible with open repositories.

2020 ◽  
Author(s):  
Soufian Azouaghe ◽  
Adeyemi Adetula ◽  
Patrick S. Forscher ◽  
Dana Basnight-Brown ◽  
Nihal Ouherrou ◽  
...  

The quality of scientific research is assessed not only by its positive impact on socio-economic development and human well-being, but also by its contribution to the development of valid and reliable scientific knowledge. Thus, researchers, regardless of their scientific discipline, are expected to adopt research practices based on transparency and rigor. However, the history of science and the scientific literature teach us that some scientific results are not systematically reproducible (Ioannidis, 2005). This is what is commonly known as the "replication crisis", which concerns the natural sciences as well as the social sciences, of which psychology is no exception. Firstly, we aim to address some aspects of the replication crisis and Questionable Research Practices (QRPs). Secondly, we discuss how we can involve more labs in Africa in the global research process, especially the Psychological Science Accelerator (PSA). To these ends, we will develop a tutorial for labs in Africa, highlighting open science practices. In addition, we emphasize that it is essential to identify African labs' needs, the factors that hinder their participation in the PSA, and the support needed from the Western world. Finally, we discuss how to make psychological science more participatory and inclusive.


2021 ◽  
Author(s):  
Chelsea Moran ◽  
Alexandra Richard ◽  
Kate Wilson ◽  
Rosemary Twomey ◽  
Adina Coroiu

Background: Questionable research practices (QRPs) have been identified as a driving force of the replication crisis in the field of psychological science. The aim of this study was to assess the frequency of QRP use among psychology students in Canadian universities, and to better understand the reasons and motivations for QRP use.
Method: Participants were psychology students attending Canadian universities, recruited via online advertising and university email invitations to complete a bilingual survey. Respondents were asked how often they and others engaged in seven QRPs. They were also asked to estimate the proportion of psychology research impacted by each QRP and how acceptable they found each QRP. Data were collected through Likert-scale survey items and open-ended text responses between May 2020 and January 2021, and were analyzed using descriptive statistics and thematic analysis.
Results: 425 psychology students completed the survey. The sample consisted of 40% undergraduate students, 59% graduate students, and 1% post-doctoral fellows. Overall, 64% of participants reported using at least one QRP, while 79% reported having observed others engaging in at least one QRP. The most frequently reported QRPs were p-hacking (46%), not submitting null results for publication (31%), excluding outcome measures (30%), and hypothesizing after results are known (27%). These QRPs were also the most frequently observed in others, estimated to be the most prevalent in the field, and rated as the most acceptable. Qualitative findings show that students reported that pressures to publish motivated their QRP use, with some reporting that certain QRPs are justifiable in some cases (e.g., in the case of exploratory research). Students also reported that QRPs contribute to the replication crisis and to publication bias, and offered several alternatives and solutions to engaging in QRPs, such as gaining familiarity with open science practices.
Conclusions: Most Canadian psychology students in this sample report using QRPs, which is unsurprising since they observe such practices in their research environment and estimate that they are prevalent. In contrast, most students believe that QRPs are not acceptable. The results of this study highlight the need to examine the pedagogical standards and cultural norms in academia that may promote or normalize QRPs in psychological science, to improve the quality and replicability of research in this field.


2021 ◽  
Vol 37 (4) ◽  
pp. 1-6
Author(s):  
Jason M. Lodge ◽  
Linda Corrin ◽  
Gwo-Jen Hwang ◽  
Kate Thompson

Over the last decade, a spate of issues has emerged in empirical research spanning fields as diverse as biology, medicine, economics, and psychological science. Broadly labelled the 'replication crisis', these issues cast substantial doubt on the robustness of peer-reviewed quantitative research across many disciplines, and have already led to fundamental shifts in how research is conducted in several fields, particularly psychological science. In this editorial, we delve into the replication crisis and what it means for educational technology research. We address two key areas: the extent to which the replication crisis applies to educational technology research, and suggestions for responses by our community.


2021 ◽  
Author(s):  
P. Priscilla Lui ◽  
Monica C. Skewes ◽  
Sarah Gobrial ◽  
David Rollock

To answer questions about human psychology, psychological science needs to yield credible findings. Because of their goals of understanding people's lived experiences and advocating for the needs of Native communities, Indigenous scholars tend to use community-based participatory research (CBPR) or approach science from a constructivist framework. The primary goal of mainstream psychological science is to uncover generalizable facts about human functioning. Approached from a postpositivist framework, mainstream psychological scholars tend to assume the possibility of identifying researcher biases and achieving objective science. Recently, many psychological findings have failed to replicate in new samples. The replication crisis raised concerns about the validity of psychological science. Mainstream open science has been promoted as a solution to this replication crisis; the open science movement encourages researchers to emphasize transparency and accountability to the broad research community. The notion of transparency aligns with the principles of CBPR, a research approach common in Indigenous research. Yet open science practices are not widely adopted in Indigenous research, and mainstream open science does not emphasize researchers' accountability to the communities that their science is intended to serve. We examined Indigenous researchers' awareness of and concerns about mainstream open science. Participants endorsed the value of transparency with participants and their communities. They were also concerned about being disadvantaged and about the possible negative impact of data sharing on Native communities. We suggest that there is value in connecting mainstream open science and Indigenous research to advance science that empowers people and makes a positive community impact.


2019 ◽  
Vol 19 (1) ◽  
pp. 46-59
Author(s):  
Alexandra Sarafoglou ◽  
Suzanne Hoogeveen ◽  
Dora Matzke ◽  
Eric-Jan Wagenmakers

The current crisis of confidence in psychological science has spurred on field-wide reforms to enhance transparency, reproducibility, and replicability. To solidify these reforms within the scientific community, student courses on open science practices are essential. Here we describe the content of our Research Master course “Good Research Practices” which we have designed and taught at the University of Amsterdam. Supported by Chambers’ recent book The 7 Deadly Sins of Psychology, the course covered topics such as QRPs, the importance of direct and conceptual replication studies, preregistration, and the public sharing of data, code, and analysis plans. We adopted a pedagogical approach that: (a) reduced teacher-centered lectures to a minimum; (b) emphasized practical training on open science practices; and (c) encouraged students to engage in the ongoing discussions in the open science community on social media platforms.


2018 ◽  
Author(s):  
Olivier Klein ◽  
Tom Elis Hardwicke ◽  
Frederik Aust ◽  
Johannes Breuer ◽  
Henrik Danielsson ◽  
...  

The credibility of scientific claims depends upon the transparency of the research products upon which they are based (e.g., study protocols, data, materials, and analysis scripts). As psychology navigates a period of unprecedented introspection, user-friendly tools and services that support open science have flourished. There has never been a better time to embrace transparent research practices. However, the plethora of decisions and choices involved can be bewildering. Here we provide a practical guide to help researchers navigate the process of preparing and sharing the products of their research. Being an open scientist means adopting a few straightforward research management practices, which lead to less error-prone, reproducible research workflows. Further, this adoption can be piecemeal: each incremental step towards complete transparency adds positive value. Transparent research practices not only improve the efficiency of individual researchers, but also enhance the credibility of the knowledge generated by the scientific community.


2017 ◽  
Vol 12 (4) ◽  
pp. 660-664 ◽  
Author(s):  
Scott O. Lilienfeld

The past several years have been a time for soul searching in psychology, as we have gradually come to grips with the reality that some of our cherished findings are less robust than we had assumed. Nevertheless, the replication crisis highlights the operation of psychological science at its best, as it reflects our growing humility. At the same time, institutional variables, especially the growing emphasis on external funding as an expectation or de facto requirement for faculty tenure and promotion, pose largely unappreciated hazards for psychological science, including (a) incentives for engaging in questionable research practices, (b) a single-minded focus on programmatic research, (c) intellectual hyperspecialization, (d) disincentives for conducting direct replications, (e) stifling of creativity and intellectual risk taking, (f) researchers promising more than they can deliver, and (g) diminished time for thinking deeply. Preregistration should assist with (a), but will do little about (b) through (g). Psychology is beginning to right the ship, but it will need to confront the increasingly deleterious impact of the grant culture on scientific inquiry.


2019 ◽  
Vol 15 (1) ◽  
pp. 579-604 ◽  
Author(s):  
Jennifer L. Tackett ◽  
Cassandra M. Brandes ◽  
Kevin M. King ◽  
Kristian E. Markon

Despite psychological scientists’ increasing interest in replicability, open science, research transparency, and the improvement of methods and practices, the clinical psychology community has been slow to engage. This has been shifting more recently, and with this review, we hope to facilitate this emerging dialogue. We begin by examining some potential areas of weakness in clinical psychology in terms of methods, practices, and evidentiary base. We then discuss a select overview of solutions, tools, and current concerns of the reform movement from a clinical psychological science perspective. We examine areas of clinical science expertise (e.g., implementation science) that should be leveraged to inform open science and reform efforts. Finally, we reiterate the call to clinical psychologists to increase their efforts toward reform that can further improve the credibility of clinical psychological science.


2019 ◽  
Author(s):  
Richard Ramsey

The credibility of psychological science has been questioned recently, due to low levels of reproducibility and the routine use of inadequate research practices (Chambers, 2017; Open Science Collaboration, 2015; Simmons, Nelson, & Simonsohn, 2011). In response, wide-ranging reform to scientific practice has been proposed (e.g., Munafò et al., 2017), which has been dubbed a “credibility revolution” (Vazire, 2018). My aim here is to advocate why and how we should embrace such reform, and discuss the likely implications.


Author(s):  
Toby Prike

Recent years have seen large changes to research practices within psychology and a variety of other empirical fields in response to the discovery (or rediscovery) of the pervasiveness and potential impact of questionable research practices, coupled with well-publicised failures to replicate published findings. In response to this, and as part of a broader open science movement, a variety of changes to research practice have started to be implemented, such as publicly sharing data, analysis code, and study materials, as well as the preregistration of research questions, study designs, and analysis plans. This chapter outlines the relevance and applicability of these issues to computational modelling, highlighting the importance of good research practices for modelling endeavours, as well as the potential of provenance modelling standards, such as PROV, to help discover and minimise the extent to which modelling is impacted by unreliable research findings from other disciplines.

