2017
DOI: 10.3390/s17051037
Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion

Abstract: In mobile augmented/virtual reality (AR/VR), real-time 6-Degree-of-Freedom (DoF) motion tracking is essential for the registration between virtual scenes and the real world. However, given the limited computational capacity of today's mobile terminals, the latency between consecutively arriving poses degrades the user experience in mobile AR/VR. This paper therefore proposes a visual-inertial-based real-time motion tracking method for mobile AR/VR. By means of the high-frequency, passive outputs from the inertial s…
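The abstract's core idea, bridging low-rate, delayed visual poses with high-rate inertial outputs, can be illustrated with a minimal sketch. This is an assumption-laden toy (constant blending gain, position-only state, no orientation or bias handling), not the paper's adaptive fusion algorithm; all names are hypothetical.

```python
import numpy as np

# Toy sketch of IMU-bridged tracking: the IMU propagates the pose at
# high rate between camera frames; each (slower, delayed) visual pose
# corrects the accumulated drift. The constant gain and position-only
# state are simplifying assumptions, not the paper's adaptive fusion.

class ImuBridgedTracker:
    def __init__(self):
        self.position = np.zeros(3)  # fused position (m)
        self.velocity = np.zeros(3)  # fused velocity (m/s)

    def imu_predict(self, accel_world, dt):
        """Propagate at IMU rate (e.g., 200 Hz); accel_world is
        gravity-compensated acceleration in the world frame."""
        self.position += self.velocity * dt + 0.5 * accel_world * dt ** 2
        self.velocity += accel_world * dt
        return self.position.copy()

    def visual_update(self, visual_position, gain=0.8):
        """Blend in a visual pose (e.g., 30 Hz) to bound IMU drift."""
        self.position = gain * visual_position + (1 - gain) * self.position
```

Between two visual updates the renderer can query imu_predict at every IMU sample, which is what keeps the apparent pose latency low despite the slow visual pipeline.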

Cited by 50 publications (32 citation statements)
References 40 publications
“…The sensorimotor channels connected to the VR define the degree of immersion; the psychological consequence of immersion on perception is the sense of presence that is felt through being in the VE or, alternatively, the "perceptual illusion of non-mediation" with the VE (Riva, 2008; Bohil et al., 2011). Moreover, mobile applications (e.g., tablet) with tracking systems of the user and/or visors (e.g., Google Cardboard) can be considered mobile VR that allow for different degrees of immersion and interaction with the VE (Fang et al., 2017).…”
Section: Introduction
Citation type: mentioning; confidence: 99%
“…3. A camera projection matrix is a 3 × 4 matrix that describes the mapping of a pinhole camera from 3D points in the world to 2D points in an image [17]. The visualization process uses Eq.…”
Section: Find Transformation Matrix
Citation type: mentioning; confidence: 99%
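The projection matrix the quoted passage describes can be made concrete with a short numerical sketch. The intrinsics and the 3D point below are made-up placeholders, and the world-to-camera rotation is taken as identity for simplicity; only the structure P = K[R | t] follows the standard pinhole model cited above.

```python
import numpy as np

# Build the 3x4 pinhole projection matrix P = K [R | t] and map a 3D
# world point to 2D pixel coordinates. All numeric values here are
# illustrative placeholders, not parameters from the cited paper.

K = np.array([[500.0,   0.0, 320.0],   # fx,  0, cx
              [  0.0, 500.0, 240.0],   #  0, fy, cy
              [  0.0,   0.0,   1.0]])  # intrinsic matrix

R = np.eye(3)                          # world-to-camera rotation
t = np.array([[0.0], [0.0], [2.0]])    # camera 2 m in front of the origin

P = K @ np.hstack((R, t))              # the 3x4 camera projection matrix

X = np.array([0.1, -0.2, 1.0, 1.0])    # homogeneous 3D world point
x = P @ X                              # homogeneous 2D image point
u, v = x[0] / x[2], x[1] / x[2]        # perspective divide
print(f"pixel: ({u:.1f}, {v:.1f})")    # -> pixel: (336.7, 206.7)
```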
“…
Paper | Back-end Approach | Camera | Fusion Type | Application
OKVIS [43][44][45] | optimization-based | monocular | tightly coupled |
SR-ISWF [46] | filtering-based | monocular | tightly coupled | mobile phone
[47] | optimization-based | monocular | tightly coupled |
[48] | optimization-based | stereo | tightly coupled | MAV
[49] | optimization-based | RGB-D | loosely coupled | mobile devices
[50] | filtering-based | monocular | tightly coupled |
ROVIO [51] | filtering-based | monocular | tightly coupled | UAV
[52] | optimization-based | monocular | tightly coupled | autonomous vehicle
[53] | filtering-based | stereo | tightly coupled |
[54] | optimization-based | stereo | tightly coupled |
[55] | optimization-based | monocular | tightly coupled |
[56] | optimization-based | stereo | tightly coupled |
[57] | filtering-based | monocular | loosely coupled | robot
[58] | optimization-based | RGB-D | loosely coupled |
[59] | filtering-based | stereo | loosely coupled |
VIORB [60] | optimization-based | monocular | tightly coupled | MAV
[61] | optimization-based | RGB-D | tightly coupled |
[62] | filtering-based | monocular | loosely coupled | AR/VR
[63] | filtering-based | multi-camera | tightly coupled | MAV
[64] | filtering-based | monocular | tightly coupled | UAV
VINS-mono [16][17][18] | optimization-based | monocular | tightly coupled | MAV, AR
[65] | optimization-based | monocular | tightly coupled | AR
[66] | optimization-based | monocular | tightly coupled |
[67] | filtering-based | monocular | tightly coupled | MAV
VINet [68] | end-to-end (deep learning) | monocular | / |
[69] | optimization-based | event camera | tightly coupled |
S-MSCKF [26] | filtering-based | stereo | tightly coupled | MAV
[70] | optimization-based | monocular | tightly coupled | MAV
[71] | optimization-based | stereo/monocular | tightly coupled |
PIRVS [72] | filtering-based | st…”
Section: Year, Paper, Back-end Approach, Camera Type
Citation type: mentioning; confidence: 99%
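To make the table's taxonomy concrete: in a loosely coupled, filtering-based system the visual pipeline delivers a finished pose that the filter treats as a direct measurement of the IMU-predicted state, whereas tightly coupled systems feed raw image features into the estimator itself. Below is a deliberately minimal one-dimensional sketch of the loosely coupled update; all numbers are placeholder assumptions, not values from any cited system.

```python
# Illustrative 1D Kalman measurement update showing what "loosely
# coupled, filtering-based" means in the table above: the visual
# pose is fused as a direct observation of the IMU-predicted state.

def loosely_coupled_update(x_pred, p_pred, z_visual, r_visual):
    """One Kalman update with the visual pose as the observation (H = 1).

    x_pred, p_pred: IMU-predicted state and its variance.
    z_visual, r_visual: visual pose measurement and its variance.
    """
    k = p_pred / (p_pred + r_visual)       # Kalman gain
    x_new = x_pred + k * (z_visual - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# IMU predicts 1.00 m (var 0.04); visual pose reports 0.90 m (var 0.01).
x, p = loosely_coupled_update(1.00, 0.04, 0.90, 0.01)
print(x, p)  # 0.92 0.008 -> pulled toward the more certain visual measurement
```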