How Failure to Falsify contributes to the Replication Crisis in the Clinical Neurosciences

2021 ◽  
Author(s):  
Frank Hillary ◽  
Sarah Rajtmajer

Abstract: This critical review discusses evidence for the replication crisis in the clinical neuroscience literature, with a focus on the size of the literature and how scientific hypotheses are framed and tested. We aim to reinvigorate discussions from the philosophy of science regarding falsification (see Popper, 1959; 1962), but with the hope of bringing pragmatic applications that might give real leverage to attempts to address scientific reproducibility. The surging publication rate has not translated into commensurate scientific progress, so the current “science-by-volume” approach requires a new perspective for determining scientific ground truths. We describe an example from the network neurosciences in the study of traumatic brain injury, where there has been little effort to refute two prominent hypotheses, leaving a literature without resolution. Based on this example, we discuss how building strong hypotheses and then designing efforts to falsify them can bring greater precision to the clinical neurosciences. With falsification as the goal, we can harness big data and computational power to assess the fitness of each theory and advance the neurosciences.
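
As a concrete illustration (ours, not the authors'), a falsification-oriented workflow can cast competing hypotheses as predictive models and score them on held-out data, so that a theory which consistently fits worse becomes a candidate for rejection rather than quiet neglect. All data, names, and models below are hypothetical:

```python
# Hypothetical sketch: express two competing hypotheses as predictive
# models and score each on held-out data. Feature names are illustrative
# stand-ins for, e.g., network measures after traumatic brain injury.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
features_a = rng.normal(size=(n, 3))   # predictors under hypothesis A
features_b = rng.normal(size=(n, 3))   # predictors under hypothesis B
outcome = features_a @ np.array([0.6, 0.3, 0.1]) + rng.normal(scale=0.5, size=n)

# Out-of-sample predictive fit (R^2) is the arbiter for both theories.
score_a = cross_val_score(LinearRegression(), features_a, outcome, cv=5).mean()
score_b = cross_val_score(LinearRegression(), features_b, outcome, cv=5).mean()
print(f"hypothesis A: CV R^2 = {score_a:.2f}")
print(f"hypothesis B: CV R^2 = {score_b:.2f}")
# A theory whose model repeatedly fails to predict better than baseline
# is a candidate for rejection rather than continued citation.
```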

2017 ◽  
Vol 21 (4) ◽  
pp. 308-320 ◽  
Author(s):  
Mark Rubin

Hypothesizing after the results are known, or HARKing, occurs when researchers check their research results and then add or remove hypotheses on the basis of those results without acknowledging this process in their research report (Kerr, 1998). In the present article, I discuss 3 forms of HARKing: (a) using current results to construct post hoc hypotheses that are then reported as if they were a priori hypotheses; (b) retrieving hypotheses from a post hoc literature search and reporting them as a priori hypotheses; and (c) failing to report a priori hypotheses that are unsupported by the current results. These 3 types of HARKing are often characterized as being bad for science and a potential cause of the current replication crisis. In the present article, I use insights from the philosophy of science to present a more nuanced view. Specifically, I identify the conditions under which each of these 3 types of HARKing is most and least likely to be bad for science. I conclude with a brief discussion about the ethics of each type of HARKing.
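
As an illustration of why form (a) can be harmful (our sketch, not from the article), a short simulation shows how silently testing many candidate hypotheses on noise and reporting only the best-looking one as a priori inflates the false-positive rate well beyond the nominal 5%:

```python
# Illustrative simulation: test 10 candidate "hypotheses" on pure noise,
# then report only the smallest p-value as if it were the a priori plan.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_studies, n_candidates, n_subjects = 5000, 10, 30

false_positives = 0
for _ in range(n_studies):
    # All candidate effects are truly null.
    data = rng.normal(size=(n_candidates, n_subjects))
    pvals = stats.ttest_1samp(data, 0.0, axis=1).pvalue
    if pvals.min() < 0.05:        # the HARKed "confirmed hypothesis"
        false_positives += 1

print(f"nominal alpha: 0.05; realized rate: {false_positives / n_studies:.2f}")
# With 10 silent candidates the realized rate is roughly
# 1 - 0.95**10 ~= 0.40, an eightfold inflation over the nominal 5%.
```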


2019 ◽  
Author(s):  
Marc N. Coutanche ◽  
Lauren S. Hallion

A rapid growth in computational power and an increasing availability of large, publicly accessible, multimodal datasets present new opportunities for psychology and neuroscience researchers to ask novel questions, and to approach old questions in novel ways. Studies of the personal characteristics, situation-specific factors, and sociocultural contexts that result in the onset, development, maintenance, and remission of psychopathology are particularly well suited to benefit from machine learning methods. However, introductory textbooks for machine learning rarely tailor their guidance to the needs of psychology and neuroscience researchers. Similarly, the traditional statistical training of clinical scientists often does not incorporate these approaches. This chapter acts as an introduction to machine learning for researchers in the fields of clinical psychology and clinical neuroscience. We discuss these methods, illustrated through real and hypothetical applications in these fields. We touch on study design, selecting appropriate techniques, how (and how not) to interpret results, and more, to aid researchers who are interested in applying machine learning methods to clinical science data.
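
A minimal sketch of the kind of leakage-safe workflow such an introduction typically emphasizes (simulated data; the features and labels are hypothetical stand-ins for clinical measures):

```python
# Cross-validated classifier with preprocessing kept inside the pipeline
# so no information leaks from test folds into training.
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for, e.g., symptom/imaging features and a diagnostic label.
X, y = make_classification(n_samples=150, n_features=40, n_informative=5,
                           random_state=0)

# Scaling is fit within each training fold only; fitting it on the full
# dataset first is a common source of optimistic bias.
model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv)

print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```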


2020 ◽  
Vol 1 (6) ◽  
Author(s):  
Roberta Malaguarnera ◽  
Alessandra Scamporrino ◽  
Agnese Filippello ◽  
Stefania Di Mauro ◽  
Alessandro Minardo ◽  
...  

Glycemic homeostasis is an essential mechanism for the proper working of an organism. However, balance in blood lipid and protein levels also plays an important role. The discovery of the hormone insulin and the description of its function in glycemic control marked fundamental scientific progress in this field. Since then, however, our view of the problem has been shaped almost exclusively in terms of glucose and insulin, in an insulin-centric and glucose-centric way. Based on recent scientific discoveries, a fine and sophisticated network of hormonal and metabolic interactions, involving almost every apparatus and tissue of the human body, has been theorized. Efficient metabolic homeostasis is founded on these intricate interactions. Although it is still not fully characterized, this complex network can undergo alterations that lead to metabolic disorders such as diabetes mellitus (DM). The endocrine pancreas plays a crucial role in the metabolic balance of an organism, but insulin is just one of the elements involved, and each pancreatic islet hormone deserves attention in its own right. Moreover, pancreatic hormones need to be considered in a broader view, encompassing both their systemic function as direct mediators and their regulation, in turn, by other hormones and substances. This more complex scenario should be taken into account for a better understanding of the pathophysiology of DM and of its therapeutic algorithms, and modern medicine stands to benefit from adopting this new perspective. This review focuses on aspects of the gut-pancreas interaction, aiming to integrate this synergy into a wider context involving other organs and tissues.
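
For contrast, a toy sketch (ours, not from the review) of the classic two-variable, glucose- and insulin-centric feedback loop that the authors argue is too narrow; parameter values are illustrative, not physiological estimates:

```python
# Two-variable caricature: glucose above baseline stimulates insulin,
# and insulin above baseline drives glucose back down. Nothing else.

def simulate(minutes=480, dt=1.0):
    g, i = 150.0, 10.0            # glucose (mg/dL) after a meal; insulin (uU/mL)
    g_basal, i_basal = 90.0, 10.0
    for _ in range(int(minutes / dt)):
        dg = -0.02 * (g - g_basal) - 0.05 * (i - i_basal)         # insulin lowers glucose
        di = 0.04 * max(g - g_basal, 0.0) - 0.10 * (i - i_basal)  # glucose raises insulin
        g, i = g + dg * dt, i + di * dt
    return g, i

g, i = simulate()
print(f"after 8 h: glucose {g:.1f} mg/dL, insulin {i:.1f} uU/mL")
# No incretins, glucagon, somatostatin, or other tissues appear here --
# precisely the simplification the review argues against.
```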



2020 ◽  
Vol 39 (10) ◽  
pp. 753-754
Author(s):  
Jiajia Sun ◽  
Daniele Colombo ◽  
Yaoguo Li ◽  
Jeffrey Shragge

Geophysicists seek to extract useful and potentially actionable information about the subsurface by interpreting various types of geophysical data together with prior geologic information. It is well recognized that reliable imaging, characterization, and monitoring of subsurface systems require the integration of multiple sources of information from a multitude of geoscientific data sets. With increasing data volumes and computational power, new data types, the constant development of inversion algorithms, and the advent of the big data era, Geophysics editors see multiphysics integration as an effective means of meeting some of the challenges of imaging subsurface systems with higher resolution and reliability, as well as of exploring geologically more complicated areas. To advance the field of multiphysics integration and to showcase its added value, Geophysics will introduce a new section, “Multiphysics and Joint Inversion,” in 2021. Submissions are now being accepted.
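
A toy sketch (ours, not from the editorial) of the added value of joint inversion: two noisy "physics" observe the same subsurface model, and minimizing their combined least-squares misfit constrains the model better than inverting either data set alone. The operators and noise levels below are arbitrary:

```python
# Joint inversion in miniature: a shared earth model m is seen through
# two linear forward operators standing in for, e.g., seismic and EM.
import numpy as np

rng = np.random.default_rng(0)
m_true = np.array([1.0, -0.5, 0.25])          # shared earth model

# Each physics alone is underdetermined (2 data, 3 unknowns); together
# they determine the model.
G1 = rng.normal(size=(2, 3))
G2 = rng.normal(size=(2, 3))
d1 = G1 @ m_true + rng.normal(scale=0.05, size=2)
d2 = G2 @ m_true + rng.normal(scale=0.05, size=2)

def invert(ops, data, steps=5000, lr=0.02):
    """Gradient descent on the summed least-squares data misfit."""
    m = np.zeros(3)
    for _ in range(steps):
        grad = sum(G.T @ (G @ m - d) for G, d in zip(ops, data))
        m -= lr * grad
    return m

m_single = invert([G1], [d1])
m_joint = invert([G1, G2], [d1, d2])
print("model error, single physics :", round(np.linalg.norm(m_single - m_true), 3))
print("model error, joint inversion:", round(np.linalg.norm(m_joint - m_true), 3))
```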


2014 ◽  
Vol 12 (2) ◽  
pp. 93-106 ◽  
Author(s):  
Tobias Matzner

Purpose – Ubiquitous computing and “big data” have been widely recognized as requiring new concepts of privacy and new mechanisms to protect it. While improved concepts of privacy have been suggested, the paper aims to argue that people acting in full conformity with those privacy norms can still infringe the privacy of others in the context of ubiquitous computing and “big data”.

Design/methodology/approach – New threats to privacy are described. Helen Nissenbaum's concept of “privacy as contextual integrity” is reviewed with regard to its capability to grasp these problems. The argument is based on the assumption that the technologies work, and that persons are fully informed and capable of deciding according to advanced privacy considerations.

Findings – Big data and ubiquitous computing enable privacy threats for persons whose data are only indirectly involved, and even for persons about whom no data have been collected and processed. These new problems are intrinsic to the functionality of the new technologies and need to be addressed at a social and political level. Furthermore, a concept of data minimization in terms of the quality of the data is proposed.

Originality/value – The use of personal data as a threat to the privacy of others is established. This new perspective is used to reassess and recontextualize Helen Nissenbaum's concept of privacy. Data minimization in terms of the quality of data is proposed as a new concept.
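
As a toy reading of that proposal (ours, not Matzner's), data minimization in terms of quality could mean storing attributes only at the precision a purpose actually requires, which also limits what the data reveal about third parties present in the same records. All names and values below are hypothetical:

```python
# Degrade the precision of stored attributes before retention, so that
# fine-grained inferences (including about other people present at the
# same place and time) are no longer supported by the data.
from datetime import datetime

def coarsen_location(lat: float, lon: float, decimals: int = 1) -> tuple:
    """Round coordinates; 1 decimal is ~11 km, enough for regional stats."""
    return (round(lat, decimals), round(lon, decimals))

def coarsen_timestamp(ts: datetime) -> str:
    """Keep only the hour; drop minute-level movement patterns."""
    return ts.strftime("%Y-%m-%d %H:00")

record = {
    "location": coarsen_location(48.858370, 2.294481),
    "time": coarsen_timestamp(datetime(2014, 3, 7, 14, 23, 51)),
}
print(record)   # {'location': (48.9, 2.3), 'time': '2014-03-07 14:00'}
```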

