2019
DOI: 10.1007/978-3-030-22747-0_16

Tuning Covariance Localization Using Machine Learning

Cited by 10 publications (6 citation statements)
References 29 publications
“…While covariance localization has been useful for ameliorating spurious correlations over long distances, it requires exhaustive tuning of the localization radius to determine the point at which spurious correlations should be cut off [15,16]. More recently, to avoid empirical tuning of the localization parameters, several adaptive localization schemes have been proposed and validated for ensemble data assimilation, depending on their specific implementations [17][18][19][20][21][22]: using an ensemble of ensembles to mitigate spurious correlations, e.g., via sampling error correction and reduced correlation sampling [23][24][25][26]; assuming the localization function can be taken as a power of the background error correlation function [27][28][29]; applying the theory of optimal linear filtering to adaptive localization [30]; a correlation-based adaptive localization method [31][32][33]; and machine learning algorithms that learn the adaptive localization parameters [34][35][36]. Adaptive localization techniques have been developed in detail, but some of these methods make strong assumptions or do not always work in ensemble data assimilation (EDA)-like frameworks.…”
Section: Introduction
confidence: 99%
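To make the role of the localization radius concrete, here is a minimal NumPy sketch of distance-based covariance localization: the noisy sample covariance is tapered by a Schur (element-wise) product with the compactly supported Gaspari-Cohn function. The 1-D grid, ensemble size, and radius of 5 grid points are illustrative choices, not values from the cited works.

```python
import numpy as np

def gaspari_cohn(r):
    """Gaspari-Cohn fifth-order piecewise taper.

    r = distance / localization radius; returns 1 at r = 0 and 0 for r >= 2.
    """
    r = np.abs(r)
    taper = np.zeros_like(r, dtype=float)
    inner = r <= 1.0
    outer = (r > 1.0) & (r < 2.0)
    x = r[inner]
    taper[inner] = -0.25 * x**5 + 0.5 * x**4 + 0.625 * x**3 - (5 / 3) * x**2 + 1.0
    x = r[outer]
    taper[outer] = ((1 / 12) * x**5 - 0.5 * x**4 + 0.625 * x**3
                    + (5 / 3) * x**2 - 5.0 * x + 4.0 - (2 / 3) / x)
    return taper

rng = np.random.default_rng(0)
n, m = 40, 10                      # state dimension, ensemble size (m << n)
X = rng.standard_normal((n, m))    # synthetic ensemble perturbations
P = X @ X.T / (m - 1)              # raw sample covariance: rank-deficient, noisy
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
L = gaspari_cohn(dist / 5.0)       # localization radius = 5 grid points (a tunable choice)
P_loc = L * P                      # Schur product: distant spurious correlations damped to zero
```

Because the Gaspari-Cohn taper matrix is positive semidefinite, the Schur product keeps `P_loc` positive semidefinite while zeroing correlations beyond twice the radius; that radius is exactly the parameter the tuning and adaptive schemes cited above try to set.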
“…A data-driven approach is thus similar in spirit to grid search, because a set of observations is required and processed during training. Similarly, one can also use machine learning to find appropriate localizations (Moosavi et al., 2019). Empirical localization functions (Lei et al., 2015) can also be used; they are constructed, empirically, from a training set of observations.…”
Section: Localization and Inflation
confidence: 99%
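The grid-search idea this statement alludes to can be sketched as follows: score each candidate localization radius by how well the localized sample covariance matches a reference, and keep the best. The cosine taper, the synthetic "true" covariance, and the candidate radii are all hypothetical illustrations; the cited works tune against observations rather than a known covariance.

```python
import numpy as np

def cos_taper(r):
    # Simple compactly supported taper (not the full Gaspari-Cohn polynomial):
    # a cosine roll-off that is 1 at r = 0 and 0 for r >= 1.
    r = np.minimum(np.abs(r), 1.0)
    return 0.5 * (1.0 + np.cos(np.pi * r))

rng = np.random.default_rng(2)
n, m = 30, 8
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
P_true = np.exp(-((dist / 4.0) ** 2))          # smooth synthetic "true" covariance
C = np.linalg.cholesky(P_true + 1e-10 * np.eye(n))
X = C @ rng.standard_normal((n, m))            # small ensemble drawn from P_true
X = X - X.mean(axis=1, keepdims=True)
P_hat = X @ X.T / (m - 1)                      # noisy sample covariance

# Grid search: pick the radius whose localized covariance best matches the reference.
radii = [2.0, 4.0, 8.0, 16.0, 32.0]
errors = [np.linalg.norm(cos_taper(dist / c) * P_hat - P_true) for c in radii]
best_radius = radii[int(np.argmin(errors))]
```

A learned (machine-learning) tuner replaces this exhaustive loop with a model that predicts a good radius from features of the ensemble, which is what makes it attractive when the search space or state dimension is large.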
“…The ensemble size, i.e., the number of physics-based model runs, is typically the main factor that limits the efficiency of EnKF. To improve the quality of the results when ensembles are small, heuristic correction methods such as covariance shrinkage [10][11][12] and localization [13][14][15] have been developed. Because some form of heuristic correction is required in operational implementations of the ensemble Kalman filter, reducing the need for such corrections is an important and active area of research.…”
Section: Introduction
confidence: 99%
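Covariance shrinkage, mentioned alongside localization as a small-ensemble correction, can be sketched as a linear blend of the sample covariance with a scaled-identity target. The fixed weight `gamma` below is an illustrative choice; practical schemes estimate it from the data (e.g., Ledoit-Wolf-style estimators).

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 40, 10
X = rng.standard_normal((n, m))
P = X @ X.T / (m - 1)              # noisy, rank-deficient sample covariance (rank <= m)

# Linear shrinkage toward a scaled identity target:
#   P_shrunk = (1 - gamma) * P + gamma * mu * I,  mu = trace(P) / n.
# gamma = 0.3 is a fixed illustrative weight; data-driven estimators are used in practice.
gamma = 0.3
mu = np.trace(P) / n
P_shrunk = (1 - gamma) * P + gamma * mu * np.eye(n)
```

Choosing the target as `mu * I` preserves the total variance (`trace(P_shrunk) == trace(P)`) while lifting the zero eigenvalues of the rank-deficient sample covariance, so the result is well-conditioned even when the ensemble size m is much smaller than the state dimension n.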