2019
DOI: 10.1016/j.jprocont.2019.05.011

Constrained ensemble Kalman filter based on Kullback–Leibler divergence

Cited by 14 publications (8 citation statements)
References 24 publications
“…The evaluation indices used to measure the difference between two distributions, such as the Kullback–Leibler divergence (KL-divergence) and the Jensen–Shannon divergence (JS-divergence), are given as follows: $D_{\mathrm{KL}}(P\,\|\,Q)=\sum_{x\in X}P(x)\log\frac{P(x)}{Q(x)}$ and $D_{\mathrm{JS}}(P\,\|\,Q)=\frac{1}{2}D_{\mathrm{KL}}(P\,\|\,M)+\frac{1}{2}D_{\mathrm{KL}}(Q\,\|\,M)$ with $M=\frac{1}{2}(P+Q)$, where P, M, and Q are discrete probability distributions and X is the probability space. Higher values of the evaluation indices indicate that a large difference exists between the two distributions.…”
Section: Proposed Data Enhancement Methodsmentioning
confidence: 99%
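The KL- and JS-divergence definitions quoted above can be sketched numerically. This is a minimal illustration; the function names and example distributions are my own, not from the cited paper:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with P(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence via the mixture M = (P + Q) / 2."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # positive: the distributions differ
print(js_divergence(p, q))  # symmetric in p and q, bounded by log 2
```

Unlike the KL divergence, the JS divergence is symmetric and stays finite even when one distribution assigns zero probability where the other does not, which is why both indices are often reported together.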
“…(There are other methods for dealing with linear inequality constraints in the EnKF, e.g. by using truncated normals [52] or using constrained optimization [5,56]. The point here is not to deal with inequality constraints per se, but to introduce transforms as a means of generalizing the EnKF to non-Gaussian distributions.)…”
Section: Gaussian Anamorphosismentioning
confidence: 99%
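The transform idea described above, mapping the ensemble to a space where a Gaussian-based EnKF update is justified and then mapping back, can be sketched for a positivity-constrained scalar state. All names, numbers, and the scalar perturbed-observation update below are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical positive-valued ensemble (e.g., a concentration state).
ens = rng.lognormal(mean=0.0, sigma=0.5, size=200)

# Gaussian anamorphosis: analyze in log space, where this ensemble is Gaussian.
z = np.log(ens)

# Scalar perturbed-observation Kalman update in transformed space.
obs, obs_std = np.log(1.5), 0.1
z_var = z.var(ddof=1)
gain = z_var / (z_var + obs_std**2)
z_analysis = z + gain * (obs + rng.normal(0.0, obs_std, z.size) - z)

# Mapping back through exp guarantees positivity of every analysis member.
ens_analysis = np.exp(z_analysis)
```

The back-transform enforces the constraint by construction, which is the sense in which anamorphosis generalizes the EnKF beyond Gaussian, unconstrained states.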
“…Substituting (23) into (24) and using the model properties (4), after simple mathematical manipulations one obtains the covariance matrices of the prediction error as given by (8) for i ∈ {1, 2, . .…
Section: A: the Prediction Stepmentioning
confidence: 99%
“…Its main idea was to fuse the constraints with the auxiliary dynamics to obtain a constrained dynamical model, to which the linear minimum mean square error estimator of the LEC system was applied [22]. For inequality-constrained nonlinear systems, algorithms based on the particle filter [23], the EnKF [24], and the interior point method [25] were developed. Other examples of constrained state estimators can be found in [26]- [29].…”
Section: Introduction a Literature Reviewmentioning
confidence: 99%