2021
DOI: 10.1088/1361-6420/abd29b
Adaptive regularisation for ensemble Kalman inversion

Abstract: We propose a new regularisation strategy for the classical ensemble Kalman inversion (EKI) framework. The strategy consists of: (i) an adaptive choice for the regularisation parameter in the update formula in EKI, and (ii) criteria for the early stopping of the scheme. In contrast to existing approaches, our parameter choice does not rely on additional tuning parameters which often have severe effects on the efficiency of EKI. We motivate our approach using the interpretation of EKI as a Gaussian approximation…
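The abstract's update formula is not reproduced in full here, but the generic stochastic EKI step that it builds on can be sketched as follows. This is a minimal sketch, assuming the common form in which the regularisation parameter α inflates the noise covariance both in the Kalman gain and in the data perturbation; the paper's specific adaptive rule for choosing α and its early-stopping criteria are not reproduced, and all function names are illustrative.

```python
import numpy as np

def eki_update(ensemble, y, forward, gamma, alpha, rng):
    """One regularised, stochastic EKI step.

    ensemble : (J, d_u) array of particles u_j
    y        : (d_y,) observed data
    forward  : map u -> G(u) returning a (d_y,) array
    gamma    : (d_y, d_y) observational noise covariance
    alpha    : regularisation parameter scaling gamma in this step
    """
    G = np.array([forward(u) for u in ensemble])          # (J, d_y) forward evaluations
    du = ensemble - ensemble.mean(axis=0)                 # parameter anomalies
    dg = G - G.mean(axis=0)                               # prediction anomalies
    J = len(ensemble)
    Cup = du.T @ dg / (J - 1)                             # cross-covariance C^{up}
    Cpp = dg.T @ dg / (J - 1)                             # prediction covariance C^{pp}
    # Perturb the data with inflated noise alpha * gamma, then apply the gain
    # C^{up} (C^{pp} + alpha * gamma)^{-1} to each innovation.
    perturbed = y + rng.multivariate_normal(np.zeros(len(y)), alpha * gamma, size=J)
    innov = np.linalg.solve(Cpp + alpha * gamma, (perturbed - G).T).T   # (J, d_y)
    return ensemble + innov @ Cup.T
```

Iterating this step with a fixed α recovers a standard EKI sweep; the paper's contribution, per the abstract, is choosing α adaptively at each iteration and deciding when to stop.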

Cited by 30 publications (26 citation statements)
References 56 publications (262 reference statements)
“…However, we have observed in a variety of numerical examples (not reported here) that in nonlinear problems it is often necessary to run EKI up to times larger than one to obtain an adequate approximation of the posterior mean and covariance. Providing a suitable stopping criterion for EKI is a topic of current research [56, 32] beyond the scope of our work.…”
mentioning
confidence: 99%
“…• An important methodological question, still largely unresolved, is the development of adaptive, easily implementable line search schemes and stopping criteria for ensemble-based optimization schemes. An important work in this direction is [32]. • Another avenue for future methodological research is the development of iterative ensemble Kalman methods that are sparsity-promoting, considering alternative regularizations beyond the least-squares objectives discussed in our paper [38, 40, 57].…”
mentioning
confidence: 99%
“…The effective sample size (over true importance weights) can be viewed as a measure of the χ²-divergence between the tempered posteriors at λ_{n−1} and λ_n [4]. In [11] this idea was extended to instead use an effective sample size based on the symmetric KL-divergence.…”
Section: Stepsize Selection (mentioning)
confidence: 99%
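The stepsize selection described in the statement above can be sketched as follows: given the particles' data misfits Φ(u_j), one bisects for the largest tempering increment Δλ whose implied importance weights retain a target effective sample size. This is a generic sketch of the idea, not the cited papers' exact schemes (in particular, the symmetric-KL variant of [11] is not implemented); the function names and the target fraction are illustrative.

```python
import numpy as np

def ess(misfits, dlam):
    """Effective sample size of importance weights w_j ∝ exp(-dlam * Phi_j)."""
    logw = -dlam * misfits
    logw = logw - logw.max()        # stabilise before exponentiating
    w = np.exp(logw)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

def choose_stepsize(misfits, target_frac=0.5, tol=1e-6):
    """Bisect for the largest dlam in (0, 1] with ESS(dlam) >= target_frac * J.

    misfits : (J,) data-misfit values Phi(u_j) of the current particles
    """
    J = len(misfits)
    lo, hi = 0.0, 1.0
    if ess(misfits, hi) >= target_frac * J:   # a full step is already acceptable
        return hi
    while hi - lo > tol:                      # ESS decreases monotonically in dlam
        mid = 0.5 * (lo + hi)
        if ess(misfits, mid) >= target_frac * J:
            lo = mid
        else:
            hi = mid
    return lo
```

At Δλ = 0 all weights are equal and the ESS equals the ensemble size J; as Δλ grows the weights degenerate, so the bisection finds the crossing with the target fraction.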
“…There are many recent works that carefully describe the mathematics of EKI [5, 10], that suggest improvements [11, 12, 13, 14], and that explain how EKI can be used in machine learning [15]. An extension of EKI in which the ensemble is distributed according to the Bayesian posterior, called the ensemble Kalman sampler, is discussed in [16].…”
Section: Introduction (mentioning)
confidence: 99%