2014
DOI: 10.1007/s10489-014-0535-z
Real-time 3D human pose recovery from a single depth image using principal direction analysis

Cited by 10 publications (21 citation statements) · References 23 publications
“…Figure 1 describes the steps of our process. First, our system used a human T-pose depth silhouette, yielding a body part-labeled map by pixel-wise supervised classification via random forests [15] for initialization. After initialization, each human depth silhouette, represented as a set of 3-D points, was matched to the previous silhouette via point set registration to obtain point correspondences.…”
Section: Related Work on Point Set Registration
confidence: 99%
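
The registration step quoted above can be illustrated with a minimal sketch. This is not the registration algorithm of the citing work; it only shows the nearest-neighbor correspondence search that point set registration between consecutive silhouettes produces. The arrays prev_pts and curr_pts are hypothetical stand-ins for two frames' 3-D point sets, and SciPy's cKDTree is assumed for the neighbor query.

```python
import numpy as np
from scipy.spatial import cKDTree

def correspondences(prev_pts, curr_pts):
    """For each current-frame point, find its nearest neighbor in the
    previous frame's point set (a stand-in for full registration)."""
    dist, idx = cKDTree(prev_pts).query(curr_pts)
    return idx, dist

# Demo: two random 3-D point sets standing in for consecutive depth silhouettes.
rng = np.random.default_rng(1)
prev_pts = rng.normal(size=(500, 3))
curr_pts = prev_pts + 0.01 * rng.normal(size=(500, 3))  # small inter-frame motion
idx, dist = correspondences(prev_pts, curr_pts)
# curr_pts[k] corresponds to prev_pts[idx[k]]; body-part labels can be
# propagated from the previous frame along these correspondences.
```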
“…We asked a subject to make a T-pose, obtained a depth silhouette, and created the body part-labeled map. To label body parts on the human depth silhouette of the T-pose, we used pixel-wise supervised classification via trained random forests [15]: the training needed only a small synthetic T-pose database for labeling body parts. The human depth silhouette and its labeled map with fifteen labeled parts of a T-pose are shown in Fig.…”
Section: Initialization
confidence: 99%
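
The initialization quoted above (pixel-wise random-forest labeling of a T-pose silhouette into fifteen parts) can be sketched as follows. The depth-comparison features shown are an assumption in the spirit of features commonly used for this task, not necessarily those of [15]; scikit-learn's RandomForestClassifier stands in for the trained forests, and all data here are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def depth_difference_features(depth, pixels, offsets):
    """Depth-comparison features per pixel: depth(p + u/d) - depth(p + v/d).

    depth   : (H, W) depth map (foreground depths assumed positive)
    pixels  : (N, 2) integer (row, col) coordinates of silhouette pixels
    offsets : (F, 2, 2) offset pairs (u, v), one pair per feature
    """
    H, W = depth.shape
    d = depth[pixels[:, 0], pixels[:, 1]][:, None]  # (N, 1) pixel depths
    feats = np.empty((len(pixels), len(offsets)))
    for f, (u, v) in enumerate(offsets):
        # depth-normalized probe locations, clipped to the image bounds
        pu = np.clip((pixels + u / d).astype(int), [0, 0], [H - 1, W - 1])
        pv = np.clip((pixels + v / d).astype(int), [0, 0], [H - 1, W - 1])
        feats[:, f] = depth[pu[:, 0], pu[:, 1]] - depth[pv[:, 0], pv[:, 1]]
    return feats

# Demo on synthetic data (real training would use rendered T-pose silhouettes
# labeled into 15 body parts, as the quoted statement describes).
rng = np.random.default_rng(0)
depth = rng.uniform(1.0, 3.0, size=(120, 160))   # fake depth map
pixels = np.argwhere(depth > 0)[::10]            # sample of silhouette pixels
labels = rng.integers(0, 15, size=len(pixels))   # fake per-pixel part labels
offsets = rng.uniform(-60.0, 60.0, size=(32, 2, 2))
X = depth_difference_features(depth, pixels, offsets)
forest = RandomForestClassifier(n_estimators=3, max_depth=20).fit(X, labels)
part_map = forest.predict(X)                     # per-pixel body-part labels
```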