Published: 2017 · DOI: 10.3390/e19120655
Bayesian Nonlinear Filtering via Information Geometric Optimization

Abstract: In this paper, Bayesian nonlinear filtering is considered from the viewpoint of information geometry, and a novel filtering method is proposed based on information geometric optimization. Under the Bayesian filtering framework, we derive a relationship between the nonlinear characteristics of filtering and the metric tensor of the corresponding statistical manifold. Bayesian joint distributions are used to construct the statistical manifold. In this case, nonlinear filtering can be converted to an opti…
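To make the abstract's idea concrete, a filtering update can be posed as natural-gradient (Fisher-metric) optimization of a parametric posterior approximation. Below is a minimal, self-contained sketch under assumed ingredients (a Gaussian approximation N(m, s²), a hypothetical quadratic measurement model, and hand-tuned step sizes); it illustrates the general technique of information geometric optimization, not the paper's specific algorithm.

```python
import numpy as np

# Sketch: natural-gradient ascent on a Gaussian approximation q(x) = N(m, s^2)
# to an unnormalized posterior p(x | y) with a nonlinear measurement model.
# The Fisher metric of N(m, s^2) in coordinates (m, s) is diag(1/s^2, 2/s^2);
# preconditioning the ELBO gradient with its inverse gives the natural gradient.

rng = np.random.default_rng(0)

def log_joint(x, y, prior_mean=0.0, prior_var=4.0, noise_var=0.5):
    # log p(x, y) for the hypothetical measurement y = x**2 / 2 + noise
    lp = -0.5 * (x - prior_mean) ** 2 / prior_var
    ll = -0.5 * (y - x ** 2 / 2.0) ** 2 / noise_var
    return lp + ll

def elbo_grad(m, s, y, n=2000):
    # Reparameterized Monte Carlo estimate of the ELBO gradient wrt (m, s).
    eps = rng.standard_normal(n)
    x = m + s * eps
    h = 1e-4  # central differences for d/dx log p(x, y)
    dlogp = (log_joint(x + h, y) - log_joint(x - h, y)) / (2 * h)
    # entropy of q contributes d/ds log s = 1/s to the s-gradient
    return dlogp.mean(), (dlogp * eps).mean() + 1.0 / s

m, s, y = 1.0, 1.0, 2.0
for _ in range(200):
    gm, gs = elbo_grad(m, s, y)
    # natural gradient: multiply by the inverse Fisher metric diag(s^2, s^2/2)
    m += 0.1 * s ** 2 * gm
    s = max(s + 0.1 * (s ** 2 / 2.0) * gs, 1e-3)

print(f"approximate posterior: N({m:.3f}, {s**2:.3f})")
```

Preconditioning with the inverse Fisher metric makes the steps invariant to how the Gaussian family is parameterized, which is what makes the optimization "information geometric" rather than plain gradient ascent.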

Cited by 10 publications (4 citation statements) · References 35 publications
“…In this light, the present variational Bayesian derivation of the Kalman filter shows as a way to 1) preserve its classical connections to Bayesian inference for the exact (i.e., linear) case [49], 2) understand the Kalman filter as the optimal linear estimator for non-linear systems [17,85] in terms of a fixed-form (variational) Gaussian approximation of arbitrary distributions [73,50] and 3) link the filter to more recent (information) geometric treatments of probabilistic inference [65,72].…”
Section: Discussion
confidence: 99%
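For point 2) of this excerpt, the fixed-form Gaussian variational solution coincides with the Kalman update exactly in the linear-Gaussian case. Below is a minimal numerical check under an assumed one-step model (hypothetical H, R, and y values; not the cited paper's derivation):

```python
import numpy as np

# For x ~ N(m0, P0) and y = H x + v, v ~ N(0, R), the Gaussian q(x) = N(m, P)
# minimizing KL(q || p(x | y)) is the exact posterior, and its mean and
# covariance coincide with the Kalman filter measurement update.

def kalman_update(m0, P0, H, R, y):
    S = H @ P0 @ H.T + R             # innovation covariance
    K = P0 @ H.T @ np.linalg.inv(S)  # Kalman gain
    m = m0 + K @ (y - H @ m0)
    P = (np.eye(len(m0)) - K @ H) @ P0
    return m, P

def gaussian_posterior(m0, P0, H, R, y):
    # Direct Bayes for the linear-Gaussian case (information form).
    P = np.linalg.inv(np.linalg.inv(P0) + H.T @ np.linalg.inv(R) @ H)
    m = P @ (np.linalg.inv(P0) @ m0 + H.T @ np.linalg.inv(R) @ y)
    return m, P

m0, P0 = np.array([0.0, 1.0]), np.eye(2)
H, R, y = np.array([[1.0, 0.0]]), np.array([[0.5]]), np.array([0.4])
mk, Pk = kalman_update(m0, P0, H, R, y)
mb, Pb = gaussian_posterior(m0, P0, H, R, y)
assert np.allclose(mk, mb) and np.allclose(Pk, Pb)
print(mk, Pk, sep="\n")
```

Both routes return the same mean and covariance because, for a linear-Gaussian model, the exact posterior is itself Gaussian, so the fixed-form constraint is not binding; the variational view only departs from the Kalman filter once the model is nonlinear.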
“…2.1.3]. Moreover, a solution of the minimization problem (4) with C(ϕ) ≡ d_KL(f(X; ϕ), f(X)) corresponds to the maximum likelihood estimate of the parameters ϕ [27]. This analogy elucidates the connection between Bayesian inference and information geometry.…”
Section: Preliminaries
confidence: 93%
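The KL-to-maximum-likelihood correspondence invoked in this excerpt is the standard identity. Writing f̂ for the empirical distribution of samples x_1, …, x_n and f(·; ϕ) for the model (argument-order conventions for d_KL vary across papers):

```latex
d_{\mathrm{KL}}\!\left(\hat f,\, f(\cdot;\phi)\right)
  \;=\; -\,H(\hat f)\;-\;\frac{1}{n}\sum_{i=1}^{n}\log f(x_i;\phi),
```

where the entropy term H(f̂) is constant in ϕ, so minimizing the divergence over ϕ is the same problem as maximizing the log-likelihood.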
“…A closure approximation is needed to render the cross-correlation term ⟨w′(t)π′(X, t)⟩ computable. Subtracting (27) from (26), we obtain an equation for the random fluctuations π′(X, t), ∂π′/∂t + μ_a ∂π′/∂X = ∂(⟨s′(X, t)π′(X, t)⟩ − s′π′)/∂X, subject to π′(X, t = 0) = 0.…”
Section: A3 MD For the Langevin Equation With Colored Noise
confidence: 99%
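The subtraction step quoted here follows the usual mean/fluctuation (Reynolds-type) decomposition; a sketch of the pattern with assumed notation, not taken verbatim from the citing paper:

```latex
\pi = \overline{\pi} + \pi', \qquad s = \overline{s} + s', \qquad
\overline{\pi'} = \overline{s'} = 0
\;\Longrightarrow\;
\overline{s\,\pi} = \overline{s}\,\overline{\pi} + \overline{s'\pi'},
```

so the averaged equation contains the second moment ⟨s′π′⟩ (and, with colored noise, ⟨w′π′⟩), which no equation in the hierarchy determines on its own; hence the need for a closure approximation.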
“…The main challenges associated with passive multiple underwater target tracking are that the passively obtained information is highly nonlinear [5][6][7], the targets' range may be unobservable, and the data association uncertainty between passive measurements and targets is complicated.…”
Section: Introduction
confidence: 99%
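As an illustration of the nonlinearity this excerpt refers to, a passive sensor typically observes only a bearing, which is a nonlinear function of the target state. The model below is a generic textbook bearings-only example, not the cited paper's formulation:

```python
import numpy as np

# Hypothetical bearings-only measurement: a passive sensor at `sensor`
# observes only the angle to the target, h(state) = atan2(y - sy, x - sx).
def bearing(state, sensor=np.zeros(2)):
    # state = [x, y, vx, vy]
    return np.arctan2(state[1] - sensor[1], state[0] - sensor[0])

# The Jacobian of h depends on the state itself, so no single linear model
# fits all states; this is the usual motivation for nonlinear filters here.
def bearing_jacobian(state, sensor=np.zeros(2)):
    dx, dy = state[0] - sensor[0], state[1] - sensor[1]
    r2 = dx ** 2 + dy ** 2
    return np.array([-dy / r2, dx / r2, 0.0, 0.0])

print(bearing(np.array([3.0, 4.0, 0.0, 0.0])))           # ~0.927 rad
print(bearing_jacobian(np.array([3.0, 4.0, 0.0, 0.0])))
```

Note also that the bearing alone carries no range information, which is why the excerpt flags range observability as a separate difficulty in passive tracking.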