2005
DOI: 10.1007/11573548_1

Gesture-Based Affective Computing on Motion Capture Data

Abstract: This paper presents research using full body skeletal movements captured using video-based sensor technology developed by Vicon Motion Systems, to train a machine to identify different human emotions. The Vicon system uses a series of 6 cameras to capture lightweight markers placed on various points of the body in 3D space, and digitizes movement into x, y, and z displacement data. Gestural data from five subjects was collected depicting four emotions: sadness, joy, anger, and fear. Experimental resu…
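As a rough illustration of the kind of data the abstract describes, the sketch below assumes each digitized gesture can be arranged as a NumPy array of shape (frames, markers, 3) holding x, y, z displacement samples over time; the marker count, frame rate, and helper function are hypothetical stand-ins, not details taken from the paper.

```python
import numpy as np

EMOTIONS = ["sadness", "joy", "anger", "fear"]   # the four classes named in the abstract
NUM_SUBJECTS = 5                                  # five subjects, per the abstract

def total_marker_displacement(gesture: np.ndarray) -> float:
    """Sum the Euclidean distance travelled by every marker across a
    (frames, markers, 3) trajectory of x, y, z coordinates."""
    step = np.diff(gesture, axis=0)               # frame-to-frame change in x, y, z
    return float(np.linalg.norm(step, axis=-1).sum())

# Synthetic stand-in for one digitized gesture: an assumed 2 s at 120 fps
# with an assumed 14 body markers.
rng = np.random.default_rng(0)
gesture = rng.normal(scale=0.01, size=(240, 14, 3))
print(total_marker_displacement(gesture))
```

A per-gesture scalar like this is only a toy summary; the point is the array layout, which the velocity- and acceleration-based features discussed in the excerpts below would build on.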

Cited by 160 publications (80 citation statements)
References 18 publications

“…Previous studies indicate that adult observers can determine an individual's emotion state from bodily motion cues alone (Dittrich et al., 1996; Heberlein et al., 2004; Kapur, Kapur, Virji-Babul et al., 2005). The current results indicate that typically developing children aged 4 to 8 are also quite capable of perceiving emotion from point-light defined actions.…”
Section: Discussion (supporting)
confidence: 52%
“…Previous research suggests that typically developed observers can determine an individual's emotional state from point-light displays of that person's actions (Dittrich et al., 1996; Heberlein et al., 2004). Recent data from our laboratory using the same displays also showed that typical adults are able to recognise emotional states from point-light displays with 93% accuracy (Kapur, Kapur, Virji-Babul et al., 2005). To determine whether children with Down syndrome can identify the emotional cues available in body movements, the following experiment was conducted.…”
Section: Experiment 2: Emotive Human Movement (mentioning)
confidence: 99%
“…Other approaches have exploited the dynamics of gestures, referring to a few psychological studies reporting that temporal dynamics play an important role in interpreting emotional displays [7]. Kapur et al. [8] showed that very simple statistical measures of motion dynamics (e.g., velocity and acceleration) are sufficient for successfully training automatic classifiers (e.g., SVMs and decision tree classifiers). The role of kinematic features has been further established by the study of Bernhardt and Robinson [9].…”
Section: A. From Face to Body (mentioning)
confidence: 99%
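The excerpt above credits the paper with showing that simple statistical measures of velocity and acceleration suffice to train classifiers such as SVMs and decision trees. A minimal sketch of that idea follows, assuming scikit-learn, the same hypothetical (frames, markers, 3) gesture arrays as in the earlier sketch, and synthetic stand-in data; the frame rate, feature set, and classifier settings are illustrative, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def dynamics_features(gesture: np.ndarray, fps: float = 120.0) -> np.ndarray:
    """Summarize a (frames, markers, 3) trajectory with simple statistics
    of marker speed and acceleration magnitude."""
    dt = 1.0 / fps
    vel = np.diff(gesture, axis=0) / dt            # frame-to-frame velocity
    acc = np.diff(vel, axis=0) / dt                # frame-to-frame acceleration
    speed = np.linalg.norm(vel, axis=-1)           # (frames-1, markers)
    accel = np.linalg.norm(acc, axis=-1)           # (frames-2, markers)
    return np.array([speed.mean(), speed.std(), speed.max(),
                     accel.mean(), accel.std(), accel.max()])

# Synthetic stand-in data: 100 gestures with four emotion labels
# (sadness, joy, anger, fear), each 240 frames x 14 markers x 3 axes.
rng = np.random.default_rng(1)
gestures = rng.normal(scale=0.01, size=(100, 240, 14, 3))
labels = rng.integers(0, 4, size=100)
X = np.stack([dynamics_features(g) for g in gestures])

for clf in (SVC(kernel="rbf"), DecisionTreeClassifier(max_depth=5)):
    scores = cross_val_score(clf, X, labels, cv=5)
    print(type(clf).__name__, round(scores.mean(), 3))
```

With random labels the cross-validated accuracy should hover near chance (about 0.25); the point is the shape of the pipeline, in which real motion-capture recordings would replace the synthetic arrays.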
“…Portrayed emotional gestures, where expressions are produced by actors upon instructions. This category consists of explicit affective archetype gestures where the subjects are instructed to perform short actions or adopt postures that explicitly represent a given emotion [10], [23]. ii.)…”
Section: Related Work (mentioning)
confidence: 99%