2018
DOI: 10.1137/17m114073x
Asymptotic Forecast Uncertainty and the Unstable Subspace in the Presence of Additive Model Error

Abstract: It is well understood that dynamic instability is among the primary drivers of forecast uncertainty in chaotic, physical systems. Data assimilation techniques have been designed to exploit this phenomenon, reducing the effective dimension of the data assimilation problem to the directions of rapidly growing errors. Recent mathematical work has, moreover, provided formal proofs of the central hypothesis of the Assimilation in the Unstable Subspace methodology of Anna Trevisan and her collaborators: for filters a…
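The abstract's core idea — that assimilation can be confined to the directions of rapidly growing errors — rests on knowing the dimension of the unstable subspace. A minimal sketch of how that dimension is typically estimated, via repeated QR factorization of a propagated frame (a toy fixed linear map of our own construction, not the paper's model):

```python
import numpy as np

# Toy illustration: estimate Lyapunov exponents of a fixed linear map M
# by repeated QR of the propagated frame. The number of positive exponents
# gives the dimension of the unstable subspace that AUS-style schemes
# confine the assimilation to.
M = np.diag([1.5, 1.0, 0.5])   # one growing, one neutral, one decaying direction
Q = np.eye(3)
n_steps = 200
log_growth = np.zeros(3)
for _ in range(n_steps):
    Q, R = np.linalg.qr(M @ Q)
    log_growth += np.log(np.abs(np.diag(R)))
lyap = log_growth / n_steps          # exponents: approx [log 1.5, 0, log 0.5]
unstable_dim = int(np.sum(lyap > 1e-8))
print(lyap.round(3), unstable_dim)   # one unstable direction
```

For a genuine nonlinear model the map M would be replaced by the tangent-linear propagator along a trajectory, but the QR bookkeeping is identical.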

Cited by 23 publications (52 citation statements). References 54 publications.
“…Although asymptotically neutral or weakly stable, these directions may display high variance in the local error growth rate, and thus often be intermittently unstable. As rigorously proved by [23,24], this situation is known to drive the error to upwell from the unfiltered to the filtered subspace, eventually leading to divergence. It thus seems paramount that the EnKF ensemble subspace encompasses at least all of these near-neutral directions, and preferably also the asymptotically weakly stable ones.…”
Section: Results
confidence: 91%
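The statement above hinges on the distinction between the asymptotic (mean) growth rate and its local variance. A toy sketch of our own construction (not from the cited works) showing a direction that is asymptotically neutral yet intermittently unstable:

```python
import numpy as np

rng = np.random.default_rng(0)
# A direction whose one-step log growth rate alternates in sign has a
# near-zero mean exponent (asymptotically neutral) but high variance:
# it is intermittently unstable, which is what drives error upwelling
# into the filtered subspace.
rates = rng.choice([0.4, -0.4], size=1000)   # local log growth rates
mean_rate = rates.mean()                     # close to 0: neutral on average
var_rate = rates.var()                       # large: intermittent instability
print(round(mean_rate, 3), round(var_rate, 3))
```

The mean exponent alone would suggest this direction is safe to exclude from the ensemble subspace; the variance shows why it is not.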
“…Recall from Remark 3.2 that this is as good as can be hoped for linear systems in the presence of noisy observations. We stress that the observational noise introduces an additional nonlinear term pQQ⊺H⊺η in the error equation (28); hence, large p > 0 not only makes the discretized equation stiff but, importantly, amplifies the noise! On the other hand, Q⊺ acts as a projection onto the range of H⊺HQ, and thus Q⊺H⊺η in fact represents the projection of η onto the range of HQ.…”
Section: L96
confidence: 99%
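The trade-off the quote points at — a large gain p suppresses the deterministic error but the same gain multiplies the observation noise — can be seen already in a scalar analogue. A hypothetical one-dimensional sketch of our own (not equation (28) of the paper), where the nudging term −p·e and the noise term p·η share the gain:

```python
import numpy as np

rng = np.random.default_rng(1)

def stationary_error_std(p, dt=1e-3, n=200_000, sigma=1.0):
    """Euler-discretize e' = -p*e + p*eta and return the stationary std.

    The gain p pulls the error toward zero, but it also multiplies the
    observation noise eta, so a larger p yields a larger noise-driven error.
    """
    e = 0.0
    samples = np.empty(n)
    for k in range(n):
        eta = rng.normal(0.0, sigma)
        e = (1.0 - p * dt) * e + p * dt * eta
        samples[k] = e
    return samples[n // 2:].std()   # discard the transient

small, large = stationary_error_std(10.0), stationary_error_std(1000.0)
print(small, large)   # the larger gain produces the larger stationary error
```

For the AR(1) recursion above the stationary variance is (p·dt)²σ²/(1 − (1 − p·dt)²) ≈ p·dt·σ²/2, which grows linearly in p — the scalar counterpart of the noise amplification the quote warns about.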
“…Now, define g(τ; s, t) := ξ⊺(t)∇f_i(t, τsξ(t) + x(t)). We get that f_i(t, ξ(t) + x(t)) − f_i(t, x(t)) − ξ⊺(t)∇f_i(t, x(t)) = …, and so the estimation error ξ solves (28). Now, S1 follows¹⁰ from [7, p.27, Thm.…”
Section: Burgers Equation
confidence: 99%