2019
DOI: 10.1177/0004867419864428

The utility of artificial intelligence in suicide risk prediction and the management of suicidal behaviors

Abstract: Objective: Suicide is a growing public health concern with a global prevalence of approximately 800,000 deaths per year. The current process of evaluating suicide risk is highly subjective, which can limit the efficacy and accuracy of prediction efforts. Consequently, suicide detection strategies are shifting toward artificial intelligence platforms that can identify patterns within ‘big data’ to generate risk algorithms that can determine the effects of risk (and protective) factors on suicide outcomes, predi…

Cited by 69 publications (87 citation statements)
References 81 publications (87 reference statements)
“…Police helped people who were at risk of suicide after receiving an alert call from Facebook (Singer, 2018). The use of artificial intelligence and machine learning for suicide prediction has received increased attention in recent years (Fonseka et al., 2019). In addition to social media data, makers of digital assistants such as Apple (Siri), Google Assistant and Amazon (Echo/Alexa) are augmenting suicide prevention services by analysing user input for indications of suicidal thoughts and behaviours (De et al., 2018).…”
Section: Discussion (citation type: mentioning; confidence: 99%)
“…The ability of AI to enhance risk prediction in mental health assessment may extend to mining unstructured data (eg, through text or continuous sensor monitoring), as in a study on sepsis that reported improved algorithm accuracy (deidentified data made available for review) [55]. The implementation of algorithmic prediction systems for suicide risk is currently limited to experimental and feasibility studies (with demonstrated utility) without efficacy evaluation [16,56-58].…”
Section: Digital Mental Health Implementation in a Hybrid Model of Care (citation type: mentioning; confidence: 99%)
“…Figure 1. The structure of the autoencoder, which consists of six 3D convolution layers, three max-pooling layers, and three upsampling layers, compresses each image to a (11, 13, 11, 16) representation for feature extraction.…”
Section: Autoencoder and Supervised Machine Learning Analysis (citation type: mentioning; confidence: 99%)
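The statement above describes the citing study's 3D convolutional autoencoder only in terms of its layer counts and bottleneck size. As a rough illustration only, the Keras sketch below reproduces that layer budget (six Conv3D, three MaxPooling3D, three UpSampling3D) and the (11, 13, 11, 16) bottleneck; the input shape (88, 104, 88, 1), filter counts, kernel sizes and activations are assumptions, not details taken from the cited paper.

```python
from tensorflow.keras import layers, models

def build_autoencoder(input_shape=(88, 104, 88, 1)):
    """3D convolutional autoencoder matching the quoted layer budget.
    The input shape and hyperparameters are illustrative assumptions."""
    inputs = layers.Input(shape=input_shape)

    # Encoder: three Conv3D + MaxPooling3D stages, (88, 104, 88) -> (11, 13, 11)
    x = layers.Conv3D(16, 3, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling3D(2, padding="same")(x)
    x = layers.Conv3D(16, 3, activation="relu", padding="same")(x)
    x = layers.MaxPooling3D(2, padding="same")(x)
    x = layers.Conv3D(16, 3, activation="relu", padding="same")(x)
    encoded = layers.MaxPooling3D(2, padding="same")(x)   # (11, 13, 11, 16)

    # Decoder: three UpSampling3D + Conv3D stages back to the input size
    x = layers.UpSampling3D(2)(encoded)
    x = layers.Conv3D(16, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling3D(2)(x)
    x = layers.Conv3D(16, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling3D(2)(x)
    decoded = layers.Conv3D(1, 3, activation="sigmoid", padding="same")(x)

    autoencoder = models.Model(inputs, decoded)
    encoder = models.Model(inputs, encoded)   # reused for feature extraction
    autoencoder.compile(optimizer="adam", loss="mse")
    return autoencoder, encoder
```

In such a setup the encoder's compressed output, rather than the reconstruction, is what would be flattened and passed to a downstream supervised model, which is the role the next citation statement describes.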
“…Traditional statistics can only differentiate groups with suicidal ideation from those without, rather than detect which individual is at risk. New analytical methods, such as machine learning, have been used in an attempt to develop algorithms that classify individual risk [8-11]. One study used machine learning algorithms based on functional magnetic resonance imaging (fMRI) neural signatures of death- and life-related concepts to detect individuals with suicidal ideation with 91% accuracy [12].…”
Section: Introduction (citation type: mentioning; confidence: 99%)
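For the individual-level classification step mentioned above, a minimal sketch of how learned features could feed a supervised classifier is shown below. It assumes the flattened encoder output from the earlier autoencoder sketch as the feature matrix and a hypothetical binary label per subject; it is not the pipeline of the cited studies, whose methods and reported 91% accuracy are not reproduced here.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def estimate_classification_accuracy(features, labels):
    """features: (n_subjects, 11*13*11*16) array of flattened encoder outputs.
    labels: binary array (1 = suicidal ideation, 0 = control). Both are
    placeholders for whatever data a real study would supply."""
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    # Cross-validated accuracy gives a rough estimate of how well
    # individual-level risk can be classified from the learned features.
    scores = cross_val_score(clf, features, labels, cv=5, scoring="accuracy")
    return scores.mean()
```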