2013 Humaine Association Conference on Affective Computing and Intelligent Interaction
DOI: 10.1109/acii.2013.53

Head Pose and Movement Analysis as an Indicator of Depression

Abstract: Depression is a common and disabling mental health disorder, which impacts not only the sufferer but also their families, friends and the economy overall. Our ultimate aim is to develop an automatic, objective affective sensing system that supports clinicians in their diagnosis and monitoring of clinical depression. Here, we analyse the performance of head pose and movement features extracted from face videos using a 3D face model projected on a 2D Active Appearance Model (AAM). In a binary classifi…
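The abstract describes head pose features recovered from face videos via a 3D face model projected onto a 2D AAM. As a minimal sketch only, and not the authors' AAM pipeline, one common way to recover per-frame head pose from a handful of 2D landmarks is a PnP fit against a generic 3D reference shape; the helper name `head_pose_from_landmarks`, the placeholder model coordinates, and the pinhole camera approximation below are all assumptions introduced for illustration.

```python
import cv2
import numpy as np

# Generic 3D reference points (nose tip, chin, eye corners, mouth corners) in
# an arbitrary model coordinate frame. These values are illustrative
# placeholders, not the paper's fitted 3D face model.
MODEL_POINTS_3D = np.array([
    [0.0,    0.0,    0.0],    # nose tip
    [0.0,  -63.6,  -12.5],    # chin
    [-43.3,  32.7, -26.0],    # left eye outer corner
    [43.3,   32.7, -26.0],    # right eye outer corner
    [-28.9, -28.9, -24.1],    # left mouth corner
    [28.9,  -28.9, -24.1],    # right mouth corner
], dtype=np.float64)

def head_pose_from_landmarks(landmarks_2d, frame_size):
    """Estimate head rotation/translation from six 2D landmarks (pixels)."""
    h, w = frame_size
    focal = w  # crude pinhole approximation: focal length ~ image width
    camera_matrix = np.array([[focal, 0, w / 2],
                              [0, focal, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS_3D,
                                  np.asarray(landmarks_2d, dtype=np.float64),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    rot_mat, _ = cv2.Rodrigues(rvec)
    # Decompose the rotation into Euler angles (degrees) for pose features.
    angles = cv2.RQDecomp3x3(rot_mat)[0]
    return {"pitch": angles[0], "yaw": angles[1], "roll": angles[2],
            "translation": tvec.ravel()}
```

A per-frame sequence of such pitch/yaw/roll values is the kind of pose signal from which movement statistics can then be derived.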

Cited by 99 publications (68 citation statements)
References 25 publications
“…Significant related work has been done on the automatic analysis and prediction of depression using speech and prosody [28,21,23,46], gestures, head pose, and facial expressions [2,33,43,19], and a multimodal combination of these cues [49,20,27,26]. As described in the literature, there are a variety of challenges in automatically determining whether a person is depressed, including: 1) each behavioral signal provides only partial information, which must be combined to form a more realistic model for recognizing behaviors indicative of depression; 2) some information that is relevant may not be available or may be inherently hidden; 3) defining baseline behavior can be difficult with limited behavioral data.…”
Section: Introduction (mentioning)
confidence: 99%
“…The latter use signal processing, computer vision, and pattern recognition methodologies. From the computer-science perspective, research has sought to identify depression from vocal utterances [8], [9], [10], [11], [12], [13], facial expression [14], [15], [16], [17], head movements/pose [18], [16], [19], body movements [18], and gaze [20]. While most research is limited to a single modality, there is increasing interest in multimodal approaches to depression detection [21], [22].…”
Section: Introduction (mentioning)
confidence: 99%
“…For face analysis, fiducial point detection is performed using the IntraFace library. From the fiducial points, an affine warp is computed, which yields the aligned faces.…”
Section: Methods (mentioning)
confidence: 99%
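The passage above describes computing an affine warp from detected fiducial points to obtain aligned face crops. A minimal sketch of such landmark-based alignment is shown below; it assumes OpenCV, an illustrative three-point template, and a hypothetical helper `align_face`, and is not the cited paper's exact procedure or the IntraFace API.

```python
import cv2
import numpy as np

# Canonical target positions (pixels in a 128x128 crop) for three stable
# landmarks; these coordinates are illustrative, not from the cited work.
TEMPLATE = np.array([[38.0, 52.0],   # left eye centre
                     [90.0, 52.0],   # right eye centre
                     [64.0, 96.0]],  # mouth centre
                    dtype=np.float32)

def align_face(frame, landmarks):
    """Warp a frame so detected eye/mouth landmarks land on the template.

    `landmarks` is a (3, 2) array of the same three points detected in the
    input image, e.g. derived from fiducial point detection.
    """
    src = np.asarray(landmarks, dtype=np.float32)
    # Partial (similarity) affine transform mapping detections to the template.
    matrix, _ = cv2.estimateAffinePartial2D(src, TEMPLATE)
    if matrix is None:
        return None
    return cv2.warpAffine(frame, matrix, (128, 128), flags=cv2.INTER_LINEAR)
```

Aligning every frame to a common template in this way removes in-plane translation, rotation, and scale before appearance features are extracted.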
“…Recently, [18], [2] also used head movements when analysing depression data. A histogram of head movements was proposed by [18], which was evaluated on the Black Dog Institute depression data.…”
Section: Introduction (mentioning)
confidence: 99%
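The "histogram of head movements" feature from [18] is not fully specified in this excerpt. One plausible reading, offered only as a hedged sketch, is to histogram the frame-to-frame angular speed of the head pose; the function name `head_movement_histogram`, the bin count, and the speed range below are assumptions, not the cited definition.

```python
import numpy as np

def head_movement_histogram(pose_angles, n_bins=10, max_speed=20.0):
    """Summarise a pose time series as a histogram of per-frame movement.

    `pose_angles` is a (T, 3) array of per-frame (pitch, yaw, roll) in
    degrees; the output is a normalised histogram of angular speed
    (degrees per frame), comparable across recordings of different length.
    """
    pose = np.asarray(pose_angles, dtype=np.float64)
    # Angular speed per frame: norm of successive pose differences.
    speed = np.linalg.norm(np.diff(pose, axis=0), axis=1)
    hist, _ = np.histogram(speed, bins=n_bins, range=(0.0, max_speed))
    # Normalise so the histogram sums to one regardless of video duration.
    return hist / max(hist.sum(), 1)
```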