2018
DOI: 10.1109/jbhi.2017.2676878

Dynamic Multimodal Measurement of Depression Severity Using Deep Autoencoding

Abstract: Depression is one of the most common psychiatric disorders worldwide, with over 350 million people affected. Current methods to screen for and assess depression depend almost entirely on clinical interviews and self-report scales. While useful, such measures lack objective, systematic, and efficient ways of incorporating behavioral observations that are strong indicators of depression presence and severity. Using dynamics of facial and head movement and vocalization, we trained classifiers to detect three leve…
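As a rough illustration of the abstract's premise (behavioral dynamics as classifier input), the sketch below summarizes hypothetical per-frame facial, head-movement, and vocal measurements into per-session velocity statistics and fits a three-level severity classifier. The feature layout, channel counts, and the logistic-regression model are assumptions for illustration only, not the paper's pipeline.

```python
# Illustrative sketch only: per-frame multimodal measurements are reduced to
# simple dynamic statistics, then used to predict a 3-level severity label.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def dynamic_stats(series):
    """Summarize a (frames x channels) time series by its velocity statistics."""
    velocity = np.diff(series, axis=0)              # frame-to-frame change
    return np.concatenate([velocity.mean(axis=0),   # mean movement speed
                           velocity.std(axis=0)])   # movement variability

# Hypothetical data: 60 interview sessions, each with facial, head-pose, and
# vocal channels sampled over time, plus a severity label in {0, 1, 2}.
sessions, labels = [], rng.integers(0, 3, size=60)
for _ in range(60):
    face = rng.normal(size=(500, 10))    # e.g., facial landmark trajectories
    head = rng.normal(size=(500, 3))     # e.g., head pitch/yaw/roll
    voice = rng.normal(size=(200, 5))    # e.g., prosodic features per frame
    sessions.append(np.concatenate([dynamic_stats(face),
                                    dynamic_stats(head),
                                    dynamic_stats(voice)]))

X = np.vstack(sessions)
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X[:5]))
```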

Cited by 145 publications (111 citation statements)
References: 65 publications
“…Automated emotion recognition from facial expression is an active area of research [26, 29]. In clinical contexts, investigators have detected occurrence of depression, autism, conflict, and PTSD from visual features (i.e., face and body expression or movement) [7, 10, 18, 22, 25, 27]. In the current pilot study, we explored the feasibility of detecting changes in affect in response to time-locked changes in neurophysiological challenge.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…In these datasets, interactions include two interlocutors, who are recorded carrying out both structured and unstructured conversations. There are numerous examples of such datasets, with a wide range of applications, such as speech recognition [15], behavior analysis [50], segmentation, emotion recognition [12] and depression detection [16]. Arguably, one of the most popular datasets of one-to-one interactions is SEMAINE [30].…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
“…In the field of depression assessment from visual cues, however, we could find only two published reports. Dibeklioglu et al [32] employed Stacked Denoising Autoencoders in a multimodal context to perform video classification according to three levels of depressive symptomatology on the Pittsburgh dataset. Moreover, Zhu et al [33] employed Deep Convolutional Neural Networks to achieve the highest performance among the unimodal (visual) approaches addressing the aim of AVEC'13 and AVEC'14 competitions.…”
Section: Deep Learning
Citation type: mentioning (confidence: 99%)
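As context for the stacked denoising autoencoder approach mentioned above, here is a minimal sketch of greedy layer-wise SDAE pre-training followed by a three-class severity head, assuming PyTorch. The layer sizes, Gaussian corruption, and placeholder multimodal features are illustrative assumptions, not the configuration used by Dibeklioglu et al.

```python
# Minimal SDAE sketch: each layer learns to reconstruct clean inputs from
# noise-corrupted ones; stacked codes feed a 3-class severity classifier.
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    """One SDAE layer: corrupt the input, reconstruct the clean version."""
    def __init__(self, in_dim, hidden_dim, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.decoder = nn.Linear(hidden_dim, in_dim)

    def forward(self, x):
        corrupted = x + self.noise_std * torch.randn_like(x)  # additive Gaussian noise
        code = self.encoder(corrupted)
        return self.decoder(code), code

def pretrain_layer(dae, data, epochs=10, lr=1e-3):
    """Greedy layer-wise pre-training: minimize reconstruction error."""
    opt = torch.optim.Adam(dae.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        recon, _ = dae(data)
        loss = loss_fn(recon, data)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return dae

# Hypothetical per-video multimodal feature vectors (e.g., facial-movement,
# head-movement, and vocal dynamics concatenated into 300 dimensions).
features = torch.randn(128, 300)

# Stack two denoising layers, feeding each layer the previous layer's codes.
dae1 = pretrain_layer(DenoisingAutoencoder(300, 128), features)
with torch.no_grad():
    _, codes1 = dae1(features)
dae2 = pretrain_layer(DenoisingAutoencoder(128, 64), codes1)
with torch.no_grad():
    _, codes2 = dae2(codes1)

# A linear head on the learned representation predicts one of three
# depression-severity levels (labels here are random placeholders).
classifier = nn.Linear(64, 3)
labels = torch.randint(0, 3, (128,))
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)
for _ in range(20):
    logits = classifier(codes2)
    loss = nn.functional.cross_entropy(logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```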