2014
DOI: 10.1109/tim.2013.2277514
Accurate Human Navigation Using Wearable Monocular Visual and Inertial Sensors

Cited by 54 publications (31 citation statements). References 31 publications.
“…By plugging in (18), (19), and (20) and dropping the term $P(\eta_1, \eta_2, \cdots, \eta_n)$ (as it does not depend on $\theta$), we then obtain the optimal solution for $\theta$:…”
Section: Maximum Posteriori Estimation
Confidence: 99%
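The excerpt above applies the standard maximum a posteriori simplification: the evidence term in the denominator of Bayes' rule does not depend on the parameters, so it can be dropped from the argmax. In generic notation (the symbols below follow the excerpt but are not taken from the citing paper's full derivation):

```latex
\hat{\theta}_{\mathrm{MAP}}
  = \arg\max_{\theta} P(\theta \mid \eta_1, \dots, \eta_n)
  = \arg\max_{\theta} \frac{P(\eta_1, \dots, \eta_n \mid \theta)\, P(\theta)}
                           {P(\eta_1, \dots, \eta_n)}
  = \arg\max_{\theta} P(\eta_1, \dots, \eta_n \mid \theta)\, P(\theta)
```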
“…The inertial-based arm motion tracking of [5] takes advantage of IMU measurements and employs the AGOF filter to estimate body joint orientations. In [18], the system combines an IMU and a monocular camera to estimate orientation using a Kalman filter. However, it relies heavily on inertial measurements, which proves less accurate than our method.…”
Section: Visual-Inertial Motion Tracking
Confidence: 99%
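The fusion scheme described in the excerpt (inertial prediction corrected by camera measurements) can be illustrated with a minimal 1-D complementary filter. This is a simplified stand-in for the Kalman filter mentioned, not the cited system: the scalar state, constant gain, and all numbers below are illustrative assumptions.

```python
def fuse_orientation(theta_prev, gyro_rate, dt, theta_cam, gain=0.02):
    """One visual-inertial fusion step for a 1-D orientation angle (rad).

    theta_prev : previous fused orientation estimate
    gyro_rate  : angular rate from the IMU gyroscope (rad/s)
    dt         : time step (s)
    theta_cam  : absolute orientation observed by the camera (rad)
    gain       : strength of the camera correction (illustrative constant)
    """
    # Predict: integrate the inertial measurement.
    theta_pred = theta_prev + gyro_rate * dt
    # Correct: pull the prediction toward the camera measurement,
    # which bounds the drift that pure integration would accumulate.
    return theta_pred + gain * (theta_cam - theta_pred)


# A biased gyro (constant 0.01 rad/s error) would drift without bound;
# periodic camera corrections keep the estimate bounded near the truth (0).
theta = 0.0
for _ in range(500):
    theta = fuse_orientation(theta, gyro_rate=0.01, dt=0.01, theta_cam=0.0)
```

The excerpt's criticism of over-reliance on inertial data corresponds to using a very small correction gain: as `gain` shrinks, the estimate tracks the integrated (drifting) gyro signal more closely.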