Ethics and Law in Research on Algorithmic and Data-Driven Technology in Mental Health Care: Scoping Review

10.2196/24668 ◽  
2021 ◽  
Vol 8 (6) ◽  
pp. e24668
Author(s):  
Piers Gooding ◽  
Timothy Kariotis

Background: Uncertainty surrounds the ethical and legal implications of algorithmic and data-driven technologies in the mental health context, including technologies characterized as artificial intelligence, machine learning, deep learning, and other forms of automation.

Objective: This study aims to survey empirical scholarly literature on the application of algorithmic and data-driven technologies in mental health initiatives to identify the legal and ethical issues that have been raised.

Methods: We searched for peer-reviewed empirical studies on the application of algorithmic technologies in mental health care in the Scopus, Embase, and Association for Computing Machinery databases. A total of 1078 relevant peer-reviewed applied studies were identified, which were narrowed to 132 empirical research papers for review based on selection criteria. Conventional content analysis was undertaken to address our aims, supplemented by a keyword-in-context analysis.

Results: We grouped the findings into the following five categories of technology: social media (53/132, 40.2%), smartphones (37/132, 28%), sensing technology (20/132, 15.2%), chatbots (5/132, 3.8%), and miscellaneous (17/132, 12.9%). Most initiatives were directed toward detection and diagnosis. Most papers discussed privacy, mainly in terms of respecting the privacy of research participants, with relatively little discussion of privacy in context. A small number of studies discussed ethics directly (10/132, 7.6%) and indirectly (10/132, 7.6%). Legal issues were not substantively discussed in any study, although some were noted in passing (7/132, 5.3%), such as the rights of user subjects and compliance with privacy law.

Conclusions: Ethical and legal issues tend not to be explicitly addressed in empirical studies on algorithmic and data-driven technologies in mental health initiatives. Scholars may have considered ethical or legal matters at the ethics committee or institutional review board stage, but if so, this consideration seldom appears in any detail in published reports of applied research. The very form of peer-reviewed papers reporting applied research in this field may preclude a substantial focus on ethics and law. Regardless, we identified several concerns, including the near-complete lack of involvement of mental health service users, the scant consideration of algorithmic accountability, and the potential for overmedicalization and techno-solutionism. Most papers were published in the computer science field at the pilot or exploratory stage. Thus, these technologies could be appropriated into practice in rarely acknowledged ways, with serious legal and ethical implications.
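As a rough illustration of the keyword-in-context step mentioned in the Methods, the minimal Python sketch below extracts a window of words around candidate terms in a block of text. The keyword list, window size, and sample sentence are illustrative assumptions only, not the authors' actual coding scheme or pipeline.

```python
import re

# Illustrative keyword list and context window; both are assumptions for this
# sketch, not the coding scheme used in the review.
KEYWORDS = {"privacy", "consent", "ethics", "law"}
WINDOW = 6  # words of context kept on each side of a hit

def kwic(text, keywords=KEYWORDS, window=WINDOW):
    """Return (keyword, left context, right context) tuples for each match."""
    tokens = re.findall(r"[a-z]+(?:'[a-z]+)?", text.lower())
    hits = []
    for i, tok in enumerate(tokens):
        if tok in keywords:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            hits.append((tok, left, right))
    return hits

if __name__ == "__main__":
    # Hypothetical sentence standing in for a passage from a reviewed paper.
    sample = ("Participants gave informed consent, and all data were "
              "de-identified to protect the privacy of research participants.")
    for kw, left, right in kwic(sample):
        print(f"{left} [{kw}] {right}")
```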

2020 ◽  
Author(s):  
Piers Gooding ◽  
Timothy Kariotis

BACKGROUND: Uncertainty surrounds the ethical and legal implications of algorithmic and data-driven technologies in the mental health context, including technologies variously characterised as artificial intelligence, machine learning and deep learning.

OBJECTIVE: We aimed to survey scholarly literature on algorithmic and data-driven technologies used in online mental health interventions, with a view to identifying the legal and ethical issues raised.

METHODS: We searched Scopus, Embase and ACM for peer-reviewed literature about algorithmic decision systems used in online mental healthcare platforms. A total of 1078 relevant peer-reviewed research studies were identified, which were narrowed to 132 empirical research papers for review based on selection criteria. We thematically analysed the papers to address our aims.

RESULTS: We grouped the findings into five categories of technology: social media (n=53), smartphones (n=37), sensing technology (n=20), chatbots (n=5), and other/miscellaneous (n=17). Most initiatives were directed toward “detection and diagnosis”. Most papers discussed privacy, principally in terms of respecting research participants’ privacy, with relatively little discussion of privacy in context. A small number of studies discussed ethics as an explicit category of concern (n=19). Legal issues were not substantively discussed in any studies, though seven noted some legal issues in passing, such as the rights of user-subjects and compliance with relevant privacy and data protection law.

CONCLUSIONS: Ethical issues tend not to be explicitly addressed in the broad scholarship on algorithmic and data-driven technologies in online mental health initiatives, and legal issues even less so. Scholars may have considered ethical or legal matters at the ethics committee/institutional review board stage of their empirical research, but this consideration seldom appears in published material in any detail. We identify several concerns, including the near-complete lack of involvement of service users, the scant consideration of ‘algorithmic accountability’, and the potential for over-medicalisation and techno-solutionism. Most papers were published in the computer science field at a pilot or exploratory stage. Thus, these technologies could be appropriated into practice in rarely acknowledged ways, with serious legal and ethical implications.


Information ◽  
2020 ◽  
Vol 11 (3) ◽  
pp. 170
Author(s):  
Tineke Broer

Digital and networking technologies are increasingly used to predict who is at risk of attempting suicide. Such digitalized suicide prevention within and beyond mental health care raises ethical, social and legal issues for a range of actors involved. Here, I will draw on key literature to explore what issues arise, or might arise, in relation to digitalized suicide prevention practices. I will start by reviewing some of the initiatives that have already been implemented and address some of the issues associated with these and with potential future initiatives. Rather than addressing the full breadth of issues, however, I will then zoom in on two key issues: first, the duty of care and the duty to report, and how these two legal and professional standards may change within and through digitalized suicide prevention; and second, a more philosophical exploration of how digitalized suicide prevention may alter human subjectivity. To end with the by now famous adage: digitalized suicide prevention is neither good nor bad, nor is it neutral, and I will argue that we need a sustained academic and social conversation about who can and should be involved in digitalized suicide prevention practices and, indeed, in what ways it can and should (not) happen.


1994 ◽  
Vol 15 (3) ◽  
pp. 471-478
Author(s):  
Karl Menninger

2015 ◽  
Vol 17 (3) ◽  
pp. 185-201
Author(s):  
Callie Joubert

The mission of the U.S. National Institute of Mental Health is to transform the understanding and treatment of mental disorders. According to its former director, Dr. Thomas Insel, fundamental to this mission is the proposition that “mental illnesses are brain disorders.” The aim of this article is to examine that proposition and to argue that it does not make sense. As a scientific proposition, it rests on contentious empirical claims; as a metaphysical proposition, it is consistent with the view that a person is a brain. Conceptual analysis is employed as a tool to show that it is a category mistake to ascribe the psychological properties of a person to a brain. The article concludes with a brief indication of the ethical implications of Insel’s proposition for mental health care.


2020 ◽  
Vol 35 (6) ◽  
pp. 657-664
Author(s):  
Jessica J Fitts ◽  
Fatmata Gegbe ◽  
Mark S Aber ◽  
Daniel Kaitibi ◽  
Musa Aziz Yokie

Abstract Though mental and substance use disorders are a leading cause of disability worldwide, mental health systems are vastly under-resourced in most low- and middle-income countries, and the majority of people with serious mental health needs receive no formal treatment. Despite international calls for the integration of mental health into routine care, the availability of outpatient mental health services and the integration of mental health into the broader healthcare system remain weak in many countries. Efforts to strengthen mental healthcare systems must be informed by the local context, with attention to key health system components. The current study is a qualitative analysis of stakeholder perspectives on mental health system strengthening in one low-income country, Sierra Leone. It utilizes locally grounded knowledge from frontline healthcare providers to identify constraints and opportunities for strengthening mental health care within each component of the health system. In-depth semi-structured interviews were conducted with 43 participants, including doctors, nurses, community health workers, mental health advocates, mental health specialists, and traditional healers recruited from the Bo, Moyamba and Western Area Urban Districts. Interview transcripts were content-coded in NVivo using both a priori and emergent codes and aggregated into broader themes using the World Health Organization Health Systems Framework. Participants described an extremely limited system of mental health care, with constraints and obstacles within each health system component, and identified potential strategies to help overcome these constraints. Findings reinforce the importance of factors outside the healthcare system that shape the implementation of mental health initiatives, including pervasive stigma towards mental illness, local conceptualizations of mental illness, and an emphasis on traditional treatment approaches. Implications for mental health initiatives in Sierra Leone and other low-income countries include the need for investment in primary care clinics to support integrated mental health services and the importance of engaging communities to promote the utilization of mental health services.


Psychotherapy ◽  
2019 ◽  
Vol 56 (4) ◽  
pp. 459-469 ◽  
Author(s):  
Heather J. Muir ◽  
Alice E. Coyne ◽  
Nicholas R. Morrison ◽  
James F. Boswell ◽  
Michael J. Constantino

Author(s):  
Barbara A. Weiner ◽  
Robert M. Wettstein
