2020
DOI: 10.1002/qj.3740
Enhanced parallelization of the incremental 4D‐Var data assimilation algorithm using the Randomized Incremental Optimal Technique

Abstract: Incremental 4D-Var is a data assimilation algorithm used routinely at operational numerical weather prediction (NWP) centres worldwide. The algorithm solves a series of quadratic minimization problems (inner-loops) obtained from linear approximations of the forward model around nonlinear trajectories (outer-loops). Since most of the computational burden is associated with the inner-loops, many studies have focused on developing computationally efficient algorithms to solve the least-square quadratic minimizati…


Cited by 7 publications (9 citation statements)
References 42 publications
“…A different approach has been explored by Bousserez and Henze (2018) and Bousserez et al. (2020), who presented and tested a randomised solution algorithm called the Randomized Incremental Optimal Technique (RIOT) in data assimilation. RIOT is designed to be used instead of PCG and employs a randomised eigenvalue decomposition of the Hessian (using a different method from the ones presented in this article) to construct directly the solution x in Equation (), which approximates the solution given by PCG.…”
Section: Discussion
confidence: 99%
“…In this work we apply randomised methods to generate a preconditioner, which is then used to accelerate the solution of the exact inner loop problem (11) with the PCG method (as discussed in Section 4). A different approach has been explored by Bousserez and Henze (2018) and Bousserez et al. (2020), who presented and tested a randomised solution algorithm called the Randomized Incremental Optimal Technique (RIOT) in data assimilation. RIOT is designed to be used instead of PCG and employs a randomised eigenvalue decomposition of the Hessian (using a different method than the ones presented in this paper) to directly construct the solution x in (11), which approximates the solution given by PCG.…”
Section: Accepted Article
confidence: 99%
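A minimal sketch of this RIOT-style direct construction, assuming the prior-preconditioned Hessian has the form H = I + R with R symmetric positive semi-definite and effectively low-rank (the structure the randomised eigenvalue decomposition exploits). The function names and the NumPy realisation are illustrative, not the paper's actual implementation:

```python
import numpy as np

def randomized_eig(A_mv, n, k, p=5, rng=None):
    """Randomised eigendecomposition of a symmetric PSD operator,
    accessed only through products with a block of vectors; the block
    products are what makes the method easy to parallelise."""
    rng = rng or np.random.default_rng(0)
    Omega = rng.standard_normal((n, k + p))   # Gaussian test block
    Q, _ = np.linalg.qr(A_mv(Omega))          # orthonormal range basis
    T = Q.T @ A_mv(Q)                         # small projected matrix
    lam, S = np.linalg.eigh(T)                # ascending eigenvalues
    V = Q @ S
    return lam[-k:], V[:, -k:]                # leading k eigenpairs

def riot_like_solve(H_mv, b, k):
    """Build the inner-loop solution x = H^{-1} b directly from the
    leading eigenpairs of R = H - I, using that H acts (approximately)
    as the identity on the remaining directions."""
    lam, V = randomized_eig(lambda X: H_mv(X) - X, b.size, k)
    c = V.T @ b
    return V @ (c / (1.0 + lam)) + (b - V @ c)
```

Because the solution is assembled in one shot from a block of Hessian products, no sequential PCG iterations are needed; accuracy depends on how well the leading eigenpairs are captured.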
“…Randomised methods for low‐rank matrix approximations have attracted a lot of interest in recent years because they require matrix products with blocks of vectors that can be easily parallelised, and it has been shown that good approximations for matrices with rapidly decaying singular values can be obtained with high probability (e.g., Halko et al., 2011; Martinsson and Tropp, 2020). These methods have been explored in data assimilation when designing solvers for strong constraint 4D‐Var (Bousserez et al., 2020) and preconditioning for the forcing formulation of the incremental weak constraint 4D‐Var (Daužickaitė et al., 2021).…”
Section: Randomised Preconditioning
confidence: 99%
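One common way to turn such a low-rank approximation into a preconditioner is the spectral limited-memory form P = I + Σᵢ (1/λᵢ − 1) vᵢ vᵢᵀ built from approximate leading eigenpairs of the Hessian. The sketch below is illustrative only: it computes the eigenpairs exactly with `numpy.linalg.eigh`, where a randomised method would supply them in practice, and the function name is hypothetical.

```python
import numpy as np

def spectral_lmp(H, k):
    """Spectral limited-memory preconditioner from the k leading
    eigenpairs (lam_i, v_i) of the SPD matrix H:
        P = I + sum_i (1/lam_i - 1) v_i v_i^T
    so that P @ H maps each leading eigendirection to itself."""
    lam, V = np.linalg.eigh(H)           # eigenvalues in ascending order
    lam_k, V_k = lam[-k:], V[:, -k:]     # k largest eigenpairs
    return np.eye(H.shape[0]) + V_k @ np.diag(1.0 / lam_k - 1.0) @ V_k.T
```

Deflating the leading eigenvalues to 1 clusters the spectrum of the preconditioned Hessian, which is what accelerates conjugate-gradient convergence on the inner-loop problem.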
“…In the light of the growing popularity of randomised methods and examples of their use in data assimilation (Bousserez et al., 2020; Daužickaitė et al., 2021), we propose using a randomised singular value decomposition (RSVD; Halko et al., 2011) to approximate the tangent linear model. RSVD is a block method that is easy to parallelise in the sense that it requires calculating matrix products with blocks of vectors.…”
Section: Introduction
confidence: 99%