Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022
DOI: 10.18653/v1/2022.naacl-main.174
A Shoulder to Cry on: Towards A Motivational Virtual Assistant for Assuaging Mental Agony

Abstract: Mental health disorders continue to plague humans worldwide. Aggravating this situation is the severe shortage of qualified and competent mental health professionals (MHPs), which underlines the need for developing virtual assistants (VAs) that can assist MHPs. The data and machine learning needed for such automation can come from platforms that allow users to visit and post messages in a peer-to-peer, anonymous manner to share their (frequently stigmatized) experiences and seek support. In this paper, we propose a VA that can act as the fir…

Cited by 12 publications (11 citation statements)
References 34 publications
“…Prior work has illustrated the importance of trust for continued voice assistant use [31,35], as trust is pivotal to user adoption of voice assistants [33,45] and willingness to broaden the scope of voice assistant tasks [27]. It is especially important to support trust-building between users and voice assistants as researchers continue to imagine and develop new capabilities for them, including complex tasks such as supporting healthcare tasks [41,55], giving mental health advice [52,66], and other high stakes decision-making [17].…”
Section: User Expectations and Trust in Voice Assistants
confidence: 99%
“…Within the past five years, advancements in NLP have achieved huge gains in accuracy when tested against standard datasets [8,18,34,56,58,60,63], with state-of-the-art accuracy in natural language processing models as high as 99% for certain tasks [8,34]. This has led many practitioners and researchers alike to imagine a near future where voice assistants can be used in increasingly complex ways, including supporting healthcare tasks [41,55], giving mental health advice [52,66], and high stakes decision-making [17].…”
Section: Introduction
confidence: 99%
“…Most of the papers (43%) we reviewed do not deal with a specific mental health condition but work towards general mental health well-being (Saha et al, 2022a). The methods proposed in such papers are applicable to the symptoms associated with a broad range of mental health issues (e.g.…”
Section: Mental Health Category
confidence: 99%
“…An advantage of the models proposed in these papers is that they could potentially offer support to a broad group of users irrespective of the underlying mental health condition. Papers without a target demographic and a target mental health category focus on proposing methods such as using generative language models for psychotherapy (Das et al, 2022a), or to address specific modules of the CAs such as leveraging reinforcement learning for response generation (Saha et al, 2022b).…”
Section: Target Demographic
confidence: 99%