2010
DOI: 10.1007/978-3-642-15558-1_18

Human Attributes from 3D Pose Tracking

Abstract: We show that, from the output of a simple 3D human pose tracker, one can infer physical attributes (e.g., gender and weight) and aspects of mental state (e.g., happiness or sadness). This task is useful for man-machine communication, and it provides a natural benchmark for evaluating the performance of 3D pose tracking methods (vs. conventional Euclidean joint error metrics). Based on an extensive corpus of motion capture data, with physical and perceptual ground truth, we analyze the inference of sub…
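The abstract does not describe the inference pipeline itself, but a minimal sketch of the general idea (predicting an attribute such as gender from pose-tracker output) might look as follows. The feature choices (mean posture, per-joint motion spread, overall speed) and the use of scikit-learn's logistic regression are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: infer a binary attribute (e.g., gender) from
# 3D pose-tracker output. Features and classifier are assumptions, not
# the pipeline of Sigal et al.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def gait_features(poses):
    """poses: array of shape (T, J, 3) -- T frames, J joints, 3D positions.
    Returns a fixed-length feature vector summarizing posture and motion."""
    mean_pose = poses.mean(axis=0).ravel()           # average posture
    motion_std = poses.std(axis=0).ravel()           # per-joint movement spread
    velocity = np.diff(poses, axis=0)                # frame-to-frame displacement
    speed = np.linalg.norm(velocity, axis=2).mean()  # overall movement speed
    return np.concatenate([mean_pose, motion_std, [speed]])


def attribute_benchmark(sequences, labels):
    """sequences: list of (T_i, J, 3) pose tracks; labels: 0/1 attribute per track.
    Cross-validated accuracy of attribute prediction can serve as a benchmark
    for a tracker, as an alternative to Euclidean joint-error metrics."""
    X = np.stack([gait_features(p) for p in sequences])
    y = np.asarray(labels)
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X, y, cv=5).mean()
```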

Cited by 23 publications (9 citation statements)
References 31 publications
“…Gallagher and Chen have explored inferring gender and age from visual features combined with names [15]. Gender, age and weight attributes have also been successfully extracted from 3D motion capture data [26]. These approaches generally require careful alignment of the data, and most of them apply to frontal faces only.…”
Section: Related Work
mentioning confidence: 99%
“…Troje's main focus was the perception of gait. He has shown that the whole‐body movements of gait contain information that allows human observers or computer classification algorithms to distinguish, for example, between male and female (Troje, ), young and old, happy and sad, and relaxed and nervous walkers (Sigal et al., ). He extracted this information by first separating the whole‐body movements into sets of principal movement directions that he called “eigenpostures” and then linearizing the principal movements by approximating them with sinusoidal functions.…”
mentioning confidence: 99%
“…Our research was motivated by the work of Sigal et al. [18], who demonstrated that, on the basis of 3D articulated pose estimates, it is possible to infer subtle physical attributes of humans, like gender and weight, and even some aspects of mental state, e.g., happiness or sadness. As previously mentioned, model-based approaches are less dependent on the training data than methods based on holistic space-time features or space-time shapes.…”
Section: Background and Related Work
mentioning confidence: 99%