Interspeech 2017
DOI: 10.21437/interspeech.2017-218

Attention Networks for Modeling Behaviors in Addiction Counseling

Abstract: In psychotherapy interactions, there are several desirable and undesirable behaviors that give insight into the efficacy of the counselor and the progress of the client. It is important to be able to identify when these target behaviors occur and what aspects of the interaction signal their occurrence. Manual observation and annotation of these behaviors is costly and time-intensive. In this paper, we use long short-term memory networks equipped with an attention mechanism to process transcripts of addiction counseling…
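
The abstract describes an utterance-level classifier that runs an LSTM over the words of a transcript turn and uses attention to pool the word representations before predicting a behavior code. Below is a minimal PyTorch sketch of that kind of model, not the authors' implementation; the vocabulary size, embedding and hidden dimensions, and the number of behavior codes are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of an LSTM utterance classifier with an
# attention layer over word embeddings, as described in the abstract.
# vocab_size, embed_dim, hidden_dim, and num_codes are illustrative assumptions.
import torch
import torch.nn as nn

class AttentionLSTMClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_codes=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)    # scores each word's hidden state
        self.out = nn.Linear(2 * hidden_dim, num_codes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices, one padded utterance per row
        h, _ = self.lstm(self.embed(token_ids))           # (batch, seq_len, 2*hidden_dim)
        weights = torch.softmax(self.attn(h), dim=1)      # attention over the words
        context = (weights * h).sum(dim=1)                # weighted sum -> utterance vector
        return self.out(context), weights.squeeze(-1)     # behavior-code logits + weights

model = AttentionLSTMClassifier(vocab_size=5000)
logits, attention = model(torch.randint(1, 5000, (2, 12)))  # two toy utterances, 12 tokens each
```

Returning the attention weights alongside the logits makes it possible to inspect which words drove a prediction, which is the interpretability benefit highlighted in the citation statements below.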

Cited by 20 publications (21 citation statements)
References 16 publications
“…Several computational models have been proposed for predicting MISC behavioral codes at the utterance level [2,3,9]. Researchers have addressed this problem by using a variety of features, such as word n-grams and linguistic features [1] and recurrent neural networks (RNNs) with word embedding features [3,11]. Methods using RNNs have shown superior performance to other models (e.g., MaxEnt) for utterance-level behavioral code prediction [3].…”
Section: Related Work (mentioning)
confidence: 99%
“…Self-attention mechanisms, which enable models to attend to particular words in the input when predicting output classes, have been used widely in natural language processing [12,13,14] and speech processing [15]. Recently, [11] extended the work from [3] by using a self-attention mechanism for predicting utterance-level MISC codes. They show how attention can improve interpretability and help in better understanding the decisions made by the model.…”
Section: Related Work (mentioning)
confidence: 99%
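
The statement above refers to a self-attention mechanism over utterance tokens. As a point of reference, here is a hedged sketch of one common formulation, scaled dot-product self-attention; [11] may use a different variant, and the feature dimension is an illustrative assumption.

```python
# Hedged sketch of scaled dot-product self-attention over utterance tokens;
# one common formulation, not necessarily the exact one used in [11].
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    def __init__(self, dim=256):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)
        self.scale = dim ** 0.5

    def forward(self, x):
        # x: (batch, seq_len, dim) token representations (e.g., LSTM outputs)
        q, k, v = self.query(x), self.key(x), self.value(x)
        scores = torch.softmax(q @ k.transpose(1, 2) / self.scale, dim=-1)  # (batch, seq, seq)
        return scores @ v, scores  # attended token vectors + attention map

attended, attn_map = SelfAttention()(torch.randn(2, 12, 256))
```

The returned attention map can be visualized to see which token pairs the model attended to, which is how such models support interpretation of their decisions.
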
“…Although most of the research efforts focus on the therapist's language and speech, it has been shown that examining the patient's language can be beneficial for specific behavior cues [9]. More recently, text-only approaches to assessing MI have become possible thanks to deep learning models [10][11][12][13].…”
Section: Introduction (mentioning)
confidence: 99%
“…Several methods for automatic assessment of treatment effectiveness have been proposed. Most of them rely on audio and linguistic cues (e.g., empathy and behavioral codes in therapy [28,29]). Other methods explored the use of unsupervised topic modelling techniques as a higher-level measure of content, specifically their relation to mental health outcomes [30,31].…”
Section: Previous Work (mentioning)
confidence: 99%
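
To make the topic-modelling idea in the statement above concrete, the sketch below derives per-transcript topic proportions with scikit-learn's LDA; the two toy transcripts and the topic count are placeholder assumptions, not data from the cited studies [30,31].

```python
# Illustrative sketch of unsupervised topic modelling over session transcripts
# with scikit-learn's LDA; transcripts and topic count are placeholder assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

transcripts = [
    "i really want to cut down on my drinking this year",
    "tell me more about what a typical week of drinking looks like for you",
]
counts = CountVectorizer(stop_words="english").fit_transform(transcripts)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_mix = lda.fit_transform(counts)  # per-transcript topic proportions
print(topic_mix)                       # usable as higher-level content features
```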