Data Assimilation for Atmospheric, Oceanic and Hydrologic Applications (Vol. IV) 2022
DOI: 10.1007/978-3-030-77722-7_3

Filtering with One-Step-Ahead Smoothing for Efficient Data Assimilation

Cited by 3 publications (5 citation statements, 2022-2023) · References: 81 publications
“…This is consistent with the results of Raboudi et al. (2018), who argued that the two‐stage update step constrains the ensemble sampling with information from “future” data, which helps to mitigate issues related to undersampling and strong nonlinearities (Ait‐El‐Fquih and Hoteit, 2022a). On average, OSA smoothing improves the filtering results by about 20% in both scenarios, with the corresponding state estimates clearly being less sensitive to the choice of inflation and localization values.…”
Section: Results (mentioning)
confidence: 99%
“…Filtering with one‐step‐ahead (OSA) smoothing is an alternative filtering strategy (Ait‐El‐Fquih and Hoteit, 2022a). It involves an extra OSA smoothing step with the “future” observation between two successive analyses, within a Bayesian framework (Desbouvries et al., 2011).…”
Section: Introduction (mentioning)
confidence: 99%
“…In light of the conditional independence properties of the augmented system of Equation (i.e., Equations – ), this system is a hidden Markov‐chain model endowed with a transition pdf and a likelihood, respectively given as (Ait‐El‐Fquih et al., 2016; Ait‐El‐Fquih and Hoteit, 2022)

$$ p(\mathbf{z}_n \mid \mathbf{z}_{n-1}, \boldsymbol{\vartheta}, q) = p(\mathbf{x}_n \mid \mathbf{x}_{n-1}, \boldsymbol{\epsilon}_{n-1}, \boldsymbol{\theta}, \phi)\, p(\boldsymbol{\epsilon}_n \mid \boldsymbol{\epsilon}_{n-1}, \phi, q), $$

$$ p(\mathbf{y}_n \mid \mathbf{z}_n) = p(\mathbf{y}_n \mid \mathbf{x}_n). $$

These are key ingredients that guarantee the recursivity of the proposed filtering algorithms (Ait‐El‐Fquih and Hoteit, 2022).…”
Section: Problem Formulation (mentioning)
confidence: 99%
“…Although the OSAS formulation of the state‐parameter filtering problem uses the observation three times during one assimilation cycle, this does not in any way affect the Bayesian character of the resulting algorithms, as this (same) observation is not used to update the same quantity (the current state), but three different quantities: the previous state, the parameter vector, and the current state. This has already been discussed in detail in Raboudi et al. (2018) (see also Desbouvries et al., 2011; Ait‐El‐Fquih and Hoteit, 2022) in the context of state estimation, emphasizing in particular that, in linear Gaussian state‐space systems, KF and OSAS‐based KF provide exactly the same (theoretical) forecast and analysis estimates.…”
Section: Introduction (mentioning)
confidence: 99%
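For orientation only, the three distinct conditionings mentioned in this excerpt can be written, in generic notation that is illustrative rather than verbatim, as three successive Bayesian updates within one assimilation cycle, all using the same observation y_n:

$$ p(\mathbf{x}_{n-1} \mid \boldsymbol{\theta}, \mathbf{y}_{0:n}) \propto p(\mathbf{y}_n \mid \mathbf{x}_{n-1}, \boldsymbol{\theta})\, p(\mathbf{x}_{n-1} \mid \boldsymbol{\theta}, \mathbf{y}_{0:n-1}) \quad \text{(smoothing of the previous state),} $$

$$ p(\boldsymbol{\theta} \mid \mathbf{y}_{0:n}) \propto p(\mathbf{y}_n \mid \boldsymbol{\theta}, \mathbf{y}_{0:n-1})\, p(\boldsymbol{\theta} \mid \mathbf{y}_{0:n-1}) \quad \text{(update of the parameters),} $$

$$ p(\mathbf{x}_n \mid \mathbf{y}_{0:n}) = \int p(\mathbf{x}_n \mid \mathbf{x}_{n-1}, \boldsymbol{\theta}, \mathbf{y}_n)\, p(\mathbf{x}_{n-1}, \boldsymbol{\theta} \mid \mathbf{y}_{0:n})\, d\mathbf{x}_{n-1}\, d\boldsymbol{\theta} \quad \text{(analysis of the current state).} $$

Each line conditions a different quantity on y_n, which is the sense in which reusing the observation does not compromise the Bayesian character of the cycle.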