2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019
DOI: 10.1109/iros40897.2019.8968506
Deep Orientation: Fast and Robust Upper Body Orientation Estimation for Mobile Robotic Applications

Cited by 27 publications (10 citation statements)
References 22 publications
“…In comparison to other works, our approach outperforms [5] and it is, under good conditions, outperformed by [7]. Although [7] reports better results, it requires RGBD cameras, which are an order of magnitude more expensive than low-resolution RGB cameras.…”
Section: Discussion
confidence: 65%
See 3 more Smart Citations
“…In comparison to other works, our approach outperforms [5] and it is -under good conditions-outperformed by [7]. Although [7] reports better results, it requires RGBD cameras, which are an order of magnitude more expensive than low-resolution RGB cameras.…”
Section: Discussionmentioning
confidence: 65%
“…Although there is a considerable number of exceptions (e.g., [5]–[7]), orientation and other social cues are usually acquired using a two-stage pipeline: human body parts are detected as a first step and then passed as input to a second-stage algorithm. This second algorithm is frequently implemented using basic trigonometry, considering the coordinates of the shoulders or the hips [1].…”
Section: Introduction
confidence: 99%
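The second stage of the pipeline described above can be sketched with a few lines of trigonometry. The following is a minimal illustration, not the cited authors' implementation: the function name and the convention that the shoulder line is perpendicular to the facing direction are assumptions for the sketch.

```python
import math

def orientation_from_shoulders(left_shoulder, right_shoulder):
    """Estimate upper-body yaw (radians) from 2D shoulder keypoints.

    Hypothetical helper: assumes image-plane (x, y) coordinates and
    that the facing direction is perpendicular to the shoulder line.
    """
    lx, ly = left_shoulder
    rx, ry = right_shoulder
    # Angle of the line from right shoulder to left shoulder,
    # rotated by 90 degrees to point in the facing direction.
    return math.atan2(ly - ry, lx - rx) + math.pi / 2
```

With level shoulders and the left shoulder to the right in image coordinates, this yields an orientation of pi/2, i.e., facing the camera under the assumed convention.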
“…Besides the position, the orientations of the persons being tracked and their postures (standing, sitting, squatting) are important information for making decisions during HRI. Therefore, the detected point cloud segments are further processed by a CNN to classify posture [24] and upper body orientation [27]. The observations of these properties are modeled as discrete distributions with three bins for the posture classes and eight bins for the orientation.…”
Section: Posture and Orientation
confidence: 99%
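The eight-bin orientation discretization mentioned above can be illustrated as follows. This is a minimal sketch under the assumption that bins are centered on multiples of 45 degrees; the function name and bin convention are not from the cited work.

```python
import math

def orientation_bin(theta, n_bins=8):
    """Map a continuous orientation theta (radians) to one of n_bins
    discrete bins, each 2*pi/n_bins wide and centered on multiples
    of the bin width (hypothetical convention for illustration)."""
    width = 2 * math.pi / n_bins
    # Wrap theta into [0, 2*pi) and round to the nearest bin center.
    return int(round((theta % (2 * math.pi)) / width)) % n_bins
```

A full observation model would place a probability mass on each bin (e.g., a softmax over CNN logits) rather than a hard assignment; the hard binning above only shows the discretization itself.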