2020
DOI: 10.1080/09638237.2020.1714011
‘AI gone mental’: engagement and ethics in data-driven technology for mental health

Cited by 59 publications (40 citation statements)
References 19 publications
“…Of the scant commentary and research in the field by persons with psychosocial disabilities and service users, commentators have raised concerns about: the potential need for a right to explanation concerning algorithmic decision making for individuals (not only the right of an individual to understand how a decision about them was made but also to query the values that go into a particular algorithmic decision system) [ 196 ]; the risk of discrimination or harm where sensitive personal information is leaked, stolen, sold, or scraped from social media [ 197 ]; and the deployment of data-driven technologies in coercive psychiatric interventions and policing [ 19 , 196 ]. Keyword searches along these lines did not yield any relevant results.…”
Section: Discussion
confidence: 99%
“…Such concerns are spurred by questions about which systems deserve to be built, which problems most need to be addressed, and who is best placed to build and monitor them [ 198 ]. Scholarship on algorithmic and data-driven technologies in mental health services appears to have seldom asked such questions, at least explicitly ([ 196 ]; notable exceptions include [ 17 ] and [ 23 ]). The debate about algorithmic accountability in mental health care is likely to accelerate in the coming years amid broader calls for algorithmic decision systems to be subject to contest, account, and redress to citizens and representatives of the public interest.…”
Section: Discussion
confidence: 99%
“…AI algorithms have several other complex applications, notably, predictive modeling. 58 Broadly, predictive modeling leverages large quantities of personal data to uncover patterns to predict future health outcomes, which could inform treatment selection and treatment personalization. 59 However, this approach fails to recognize the central role of the patients, especially when their personal data will be used for developing such algorithms.…”
Section: Artificial Intelligence and Privacy in Mental Health
confidence: 99%
“… 59 However, this approach fails to recognize the central role of the patients, especially when their personal data will be used for developing such algorithms. 58 Consequently, the mental health patient is not sufficiently mentioned as a central collaborator, or the final beneficiary to whom both clinicians and data scientists are accountable. 60 These challenges related to the use of AI in mental health research and practice demand far greater scrutiny and effort on the part of regulators and policy makers to safeguard the personal data privacy of individuals with mental health conditions.…”
Section: Artificial Intelligence and Privacy in Mental Health
confidence: 99%
“…Gender diversity is important (women are half the world) but it is not the only type of diversity we need to consider; there is also ethnicity, and now that gender is not a binary concept it is harder to identify this author attribute via publication. The JMH has emphasised the inclusion of people who use mental health services as authors (Byrne et al., 2019; Pinfold et al., 2019; Robertson et al., 2019; Robotham et al., 2016; Webber et al., 2014) and we have been successful on our editorial board, in our editorials (Carr, 2020; Sweeney & Taggart, 2018; Wykes et al., 2019) and in the types of research we accept (Farr et al., 2019; Happell et al., 2019b; Happell et al., 2019a; McCabe et al., 2018; Mulfinger et al., 2019). We want to continue to encourage these papers and particularly those led by service users.…”
confidence: 92%