2014 Ubiquitous Positioning Indoor Navigation and Location Based Service (UPINLBS)
DOI: 10.1109/upinlbs.2014.7033733

Infrared local positioning system using phase differences

Cited by 15 publications (13 citation statements)
References 22 publications

“…The distance error is a function of the distance between emitter and sensor, as well as the angle of incidence. We will not repeat the other constant parameters that model the system, which can be found in the works that describe it [44,45]. Finally, Figure 3b shows the evolution of the standard deviation of the distance measurement error versus the distance in the xy-plane.…”
Section: Results (mentioning)
confidence: 99%
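The excerpt does not reproduce the error model itself, only its qualitative behaviour: the distance error grows with the emitter-to-sensor distance and with the angle of incidence. The Python sketch below illustrates a model with that behaviour; the functional form and the constants sigma0, d_ref and alpha are illustrative placeholders, not the parameters reported in [44,45].

```python
import numpy as np

def distance_error_std(d_xy, incidence_angle_rad,
                       sigma0=0.01, d_ref=1.0, alpha=2.0):
    """Illustrative standard deviation of the distance-measurement error.

    The error grows with emitter-sensor separation (received optical power
    falls off with distance) and with the angle of incidence (the effective
    detector area shrinks as cos(theta)).  All constants are placeholders;
    the actual system parameters are given in the cited works [44,45].
    """
    d_xy = np.asarray(d_xy, dtype=float)
    return sigma0 * (d_xy / d_ref) ** alpha / np.cos(incidence_angle_rad)

# Example: error std at 1 m and 3 m in the xy-plane, 30 degrees incidence.
print(distance_error_std([1.0, 3.0], np.deg2rad(30.0)))
```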
“…After the IR and camera processing blocks, two position estimates, $\hat{\boldsymbol{X}}_{IR}$ and $\hat{\boldsymbol{X}}_{c}$, are obtained from the IR and camera sensors, respectively. $\hat{\boldsymbol{X}}_{IR}$ is attained by hyperbolic trilateration from differences of distances [45] and $\hat{\boldsymbol{X}}_{c}$ is obtained by projecting the camera image plane onto the scene plane by means of a homography transformation. We assume both estimates are affected by bi-dimensional zero-mean Gaussian uncertainties, represented by their respective covariance matrices $\boldsymbol{\Sigma}_{IR}$, $\boldsymbol{\Sigma}_{c}$.…”
Section: Methods Description (mentioning)
confidence: 99%
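Under the stated assumption of two independent, bi-dimensional zero-mean Gaussian estimates with covariances $\boldsymbol{\Sigma}_{IR}$ and $\boldsymbol{\Sigma}_{c}$, the standard covariance-weighted (maximum-likelihood) combination can be sketched as follows. This is only an illustration of that assumption; the citing work may fuse the two estimates with a different filter.

```python
import numpy as np

def fuse_gaussian_estimates(x_ir, cov_ir, x_cam, cov_cam):
    """Covariance-weighted fusion of two independent 2-D position estimates.

    Standard maximum-likelihood combination of two Gaussian estimates: the
    fused covariance is the inverse of the summed information matrices, and
    each estimate is weighted by its information (inverse covariance).
    """
    info_ir = np.linalg.inv(cov_ir)
    info_cam = np.linalg.inv(cov_cam)
    cov_fused = np.linalg.inv(info_ir + info_cam)
    x_fused = cov_fused @ (info_ir @ x_ir + info_cam @ x_cam)
    return x_fused, cov_fused

# Example: the IR estimate is noisier along y, the camera estimate along x.
x_ir = np.array([1.02, 2.10])
x_cam = np.array([0.95, 2.04])
cov_ir = np.diag([0.02**2, 0.05**2])
cov_cam = np.diag([0.06**2, 0.02**2])
x_fused, cov_fused = fuse_gaussian_estimates(x_ir, cov_ir, x_cam, cov_cam)
print(x_fused, np.sqrt(np.diag(cov_fused)))
```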
“…After the IR and camera processing blocks, two position estimates, $X_{IR}$ and $X_{c}$, are obtained from the IR and camera sensors, respectively. $X_{IR}$ is attained by hyperbolic trilateration from differences of distances [45] and $X_{c}$ is obtained by projecting the camera image plane onto the scene plane by means of a homography transformation. We assume […] A deep explanation of the IR positioning system (developed in the past) can be read in [17,45].…”
Section: Methods Description (mentioning)
confidence: 99%
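The "hyperbolic trilateration from differences of distances" mentioned in both excerpts can be illustrated with a generic nonlinear least-squares sketch. The anchor layout, the reference-anchor convention and the solver used here are assumptions made for illustration, not the solution actually implemented in [17,45].

```python
import numpy as np
from scipy.optimize import least_squares

def hyperbolic_trilateration(anchors, ddiffs, x0=None):
    """Estimate a 2-D position from differences of distances (TDOA-style).

    anchors : (N, 2) array of known reference positions.
    ddiffs  : (N-1,) array of measured differences d_i - d_0 between the
              distance to anchor i and the distance to anchor 0.
    Solved with a generic nonlinear least-squares step for illustration.
    """
    anchors = np.asarray(anchors, dtype=float)
    ddiffs = np.asarray(ddiffs, dtype=float)
    if x0 is None:
        x0 = anchors.mean(axis=0)      # start from the centroid of the anchors

    def residuals(x):
        d = np.linalg.norm(anchors - x, axis=1)
        return (d[1:] - d[0]) - ddiffs

    return least_squares(residuals, x0).x

# Example with four anchors and noise-free difference-of-distance data.
anchors = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0], [0.0, 3.0]])
target = np.array([1.5, 1.0])
d = np.linalg.norm(anchors - target, axis=1)
print(hyperbolic_trilateration(anchors, d[1:] - d[0]))   # ~ [1.5, 1.0]
```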
“…All features of the test bench are summarized in Table 2, including devices, BLC configuration and test conditions (notation and configuration indexes or labels correspond to those in Figure 7). The IR measurements and positioning system performance had already been developed and shown in the past [45]. The camera data has been collected for fusion purposes, which constitutes the core of the results presented in this paper.…”
Section: Setup (mentioning)
confidence: 99%