2023
DOI: 10.31234/osf.io/cuzvr
Preprint

Large language models could change the future of behavioral healthcare: A proposal for responsible development and evaluation

Abstract: Large language models (LLMs) built on artificial intelligence (AI) – such as ChatGPT and GPT-4 – hold immense potential to support, augment, or even replace psychotherapy. Enthusiasm about such applications is mounting in the field as well as industry. These developments promise to address insufficient mental healthcare system capacity and scale individual access to personalized treatments. However, clinical psychology is an uncommonly high stakes application domain for AI systems, as responsible and evidence-…

Cited by 12 publications (6 citation statements)
References 45 publications (54 reference statements)
“…6 The ability of large language models (LLMs) to mimic human language provides the opportunity to build digital mental health tools that may feel more natural and engaging. 7,8 Building on recent advances relating to LLMs, our goal was to create and examine the feasibility of a generative AI tool that can complement traditional cognitive behavioral therapies (CBTs) by facilitating a core therapeutic intervention: Socratic dialogue. 9,10 Given current limitations of LLMs, 7 our tool "Socrates 2.0" was designed to ultimately be used in conjunction with a licensed clinician to make CBTs' out-of-session practice of evaluating one's thoughts more engaging compared to traditional worksheets.…”
Section: Method
confidence: 99%
“…We viewed the Socratic process as a good specific use case for an LLM-based tool that guides patients through this process. Given the advanced conversational abilities of LLMs, 7 we hypothesized that a generative AI tool would be able to engage patients in Socratic dialogue at least as well as a newly trained therapist, and that employing LLMs would make out-of-session practice, such as examining one's thoughts, more engaging for patients than worksheets. 21…”
Section: Clinical Background
confidence: 99%
“…At home, social chatbots (Henkel et al, 2020;Pentina et al, 2023) are being increasingly used as personal assistants (e.g., Alexa, Siri, Fireflies, Google Assistant), friendship companions (e.g., Replika, Anima, Kajiwoto, Microsoft XiaoIce) or even relational agents for (mental) healthcare (e.g., Woebot; Wysa; Youper; Stade et al, 2023). At work, intelligent decision support systems are now being used as advisors in healthcare, finance, and the military to name a few (Černevičienė & Kabašinskas, 2022;Macey-Dare, 2023;Shortliffe, & Sepúlveda, 2018;Wasilow & Thorpe, 2019).…”
Section: The Revolution of Artificial Intelligence
confidence: 99%
“…Mental healthcare in particular is an area in which the abilities of LLMs and other forms of AI arguably have the most transformational potential (Hua et al 2024). This is because the degree of cognitive and administrative work required of mental healthcare clinicians, particularly those working with the most complex problems, is among the highest in all areas of healthcare (Stade et al 2023). Previous papers have argued that the use of generative AI in mental health requires particular safety diligence due to the interpersonal nature of mental healthcare (Hider and Wright 2023).…”
Section: Introduction
confidence: 99%