2011 International Conference on Indoor Positioning and Indoor Navigation
DOI: 10.1109/ipin.2011.6071928
KL-divergence kernel regression for non-Gaussian fingerprint based localization

Cited by 62 publications (69 citation statements); references 9 publications. Citing publications span 2012 to 2024.
“…when measuring RSSI from J multiple access points), Mirowski et al. (2011) make the assumption of local independence of each AP's marginal distribution and use the chain rule for relative entropy (Cover and Thomas, 2006) to express the KL-divergence of a joint distribution of independent variables. One can indeed argue that the WiFi software most likely queries and receives answers from the APs independently, and that the fluctuations in signal propagation for various APs happen along somewhat different paths.…”
Section: Kullback-Leibler Divergence Kernel Regression
Confidence: 99%
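The chain rule invoked in this citation states that when the joint distributions factorize across the J access points, the KL-divergence of the joint is simply the sum of the per-AP marginal divergences. A minimal sketch of that identity on toy RSSI histograms follows; all numbers below are illustrative and not taken from the paper.

import numpy as np

def kl(p, q, eps=1e-12):
    # KL-divergence between two discrete distributions (histograms),
    # with a small epsilon to avoid log(0) on empty bins.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Toy per-AP RSSI histograms at two locations: 3 access points x 4 bins.
p_marg = np.array([[0.7, 0.2, 0.1, 0.0],
                   [0.1, 0.6, 0.2, 0.1],
                   [0.3, 0.3, 0.3, 0.1]])
q_marg = np.array([[0.5, 0.3, 0.1, 0.1],
                   [0.2, 0.5, 0.2, 0.1],
                   [0.4, 0.2, 0.2, 0.2]])

# Sum of the J marginal KL-divergences...
kl_sum = sum(kl(p, q) for p, q in zip(p_marg, q_marg))

# ...equals the KL-divergence of the factorized joint distributions
# (outer products of the marginals), up to numerical error from eps.
p_joint = np.einsum('i,j,k->ijk', *p_marg).ravel()
q_joint = np.einsum('i,j,k->ijk', *q_marg).ravel()
print(kl_sum, kl(p_joint, q_joint))  # the two numbers agree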
“…One can indeed argue that the WiFi software most likely queries and receives answers from the APs independently, and that the fluctuations in signal propagation for various APs happen along somewhat different paths. Mirowski et al. (2011) propose to combine the KL-divergence with kernel methods and to use kernel-based regression algorithms. Following Moreno et al. (2004), and for a data-dependent range of values α, it is possible to define such positive semi-definite kernels by exponentiating the symmetrized KL-divergence:…”
Section: Kullback-Leibler Divergence Kernel Regression
Confidence: 99%
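The kernel referred to here, following Moreno et al. (2004), is K(p, q) = exp(-α [KL(p‖q) + KL(q‖p)]). Below is a minimal sketch of how such a kernel could drive a regressor mapping RSSI histograms to 2-D positions. The histograms, α, and the ridge term are illustrative assumptions, and kernel ridge regression stands in for whichever kernel regression algorithm the paper actually employs.

import numpy as np

def sym_kl(p, q, eps=1e-12):
    # Symmetrized KL-divergence KL(p||q) + KL(q||p) between histograms,
    # using the identity sum((p - q) * log(p / q)).
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum((p - q) * np.log(p / q)))

def kl_kernel(P, Q, alpha=0.5):
    # K[i, j] = exp(-alpha * symmetrized KL(P[i], Q[j])).
    # alpha must lie in a data-dependent range for K to remain
    # positive semi-definite (Moreno et al., 2004); 0.5 is a toy choice.
    return np.exp(-alpha * np.array([[sym_kl(p, q) for q in Q] for p in P]))

# Training fingerprints: one RSSI histogram per surveyed point (toy data),
# paired with the known 2-D coordinates of that point.
P_train = np.array([[0.7, 0.2, 0.1],
                    [0.1, 0.6, 0.3],
                    [0.3, 0.3, 0.4],
                    [0.5, 0.4, 0.1]])
y_train = np.array([[0.0, 0.0], [5.0, 0.0], [5.0, 5.0], [0.0, 5.0]])

# Kernel ridge regression: solve (K + lam*I) w = y, predict with K_test @ w.
lam = 1e-3
K = kl_kernel(P_train, P_train)
w = np.linalg.solve(K + lam * np.eye(len(K)), y_train)

P_test = np.array([[0.4, 0.4, 0.2]])   # histogram observed at runtime
print(kl_kernel(P_test, P_train) @ w)  # estimated (x, y) position

Because the kernel depends on the fingerprints only through pairwise divergences, any off-the-shelf kernel method (ridge regression, SVR, Gaussian processes) can consume the precomputed Gram matrix unchanged.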