2013
DOI: 10.1002/qj.2150
B‐preconditioned minimization algorithms for variational data assimilation with the dual formulation

Abstract: Variational data assimilation problems arising in meteorology and oceanography require the solution of a regularized nonlinear least-squares problem. Practical solution algorithms are based on the incremental (Truncated Gauss-Newton) approach, which involves the iterative solution of a sequence of linear least-squares (quadratic minimization) sub-problems. Each sub-problem can be solved using a primal approach, where the minimization is performed in a space spanned by vectors of the size of the model control v…
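The incremental (Truncated Gauss-Newton) approach described in the abstract can be sketched in a few lines: outer loops relinearize the observation operator, and inner loops solve each quadratic sub-problem, here with plain CG. The toy observation operator, its Jacobian, and all dimensions below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def truncated_gauss_newton(h, jac, y, Rinv, Binv, xb, n_outer=3, n_inner=20):
    """Incremental (Truncated Gauss-Newton) sketch: each outer loop
    relinearizes h at the current estimate; each inner loop minimizes the
    resulting quadratic sub-problem with a few CG iterations."""
    x = xb.copy()
    for _ in range(n_outer):
        H = jac(x)                        # linearize observation operator at x
        d = y - h(x)                      # innovation vector
        A = Binv + H.T @ Rinv @ H         # Hessian of the quadratic sub-problem
        b = H.T @ Rinv @ d - Binv @ (x - xb)
        dx = np.zeros_like(x)
        r = b - A @ dx
        p = r.copy()
        rr = r @ r
        for _ in range(n_inner):          # inner loop: plain conjugate gradient
            Ap = A @ p
            alpha = rr / (p @ Ap)
            dx += alpha * p
            r -= alpha * Ap
            rr_new = r @ r
            if np.sqrt(rr_new) < 1e-12:
                break
            p = r + (rr_new / rr) * p
            rr = rr_new
        x = x + dx                        # outer-loop update of the estimate
    return x

# Toy mildly nonlinear observation operator (hypothetical, componentwise)
n = 5
xb = np.zeros(n)                          # background state
x_true = np.array([0.5, -0.3, 0.2, 0.1, -0.4])
h = lambda x: x + 0.1 * x**2
jac = lambda x: np.diag(1.0 + 0.2 * x)
y = h(x_true)                             # perfect observations of every component
xa = truncated_gauss_newton(h, jac, y, np.eye(n) * 1e4, np.eye(n), xb)
```

With accurate observations (large R⁻¹), the analysis `xa` should land close to `x_true` after a handful of outer loops, illustrating the nesting of inner quadratic minimizations inside outer relinearizations.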

Cited by 65 publications (56 citation statements)
References 61 publications (70 reference statements)
“…(9), using such a preconditioning increases the size of the minimisation vector. It is therefore preferable in this case to precondition the cost function using the full background error covariance matrix, as suggested in earlier studies by Derber and Rosati (1989) and more recently by Gratton and Tshimanga (2009) and Gürol et al. (2014). Both methods of preconditioning are equivalent and will lead to the same solution.…”
Section: Discretisation
confidence: 99%
“…The minimization proceeds via a Lanczos formulation of the restricted B-preconditioned conjugate gradient (CG) method [24]. Each sequence of linear minimizations of Eqn (1) proceeds via so-called inner-loops. When the δz_k that minimizes J_k has been identified, the estimate z_k about which H is linearized is updated (a so-called outer-loop), and minimization of Eqn (1) proceeds again.…”
Section: ROMS 4D-Var
confidence: 99%
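The Lanczos formulation mentioned in this quote builds, alongside the CG iterates, an orthonormal basis Q and a tridiagonal matrix T with QᵀAQ = T. A minimal unpreconditioned sketch (matrix sizes and the full-reorthogonalization choice are illustrative assumptions, not the restricted B-preconditioned variant of the paper):

```python
import numpy as np

def lanczos(A, v0, k):
    """Plain Lanczos iteration on SPD A: returns an orthonormal basis Q of the
    k-dimensional Krylov space and the tridiagonal projection T = Q^T A Q.
    The CG solution can be recovered from T (not shown here)."""
    n = v0.size
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k - 1)
    q = v0 / np.linalg.norm(v0)
    q_prev = np.zeros(n)
    b = 0.0
    for j in range(k):
        Q[:, j] = q
        w = A @ q - b * q_prev
        alpha[j] = q @ w
        w -= alpha[j] * q
        # Full reorthogonalization against all previous vectors, for stability
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            q_prev, q, b = q, w / beta[j], beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return Q, T

rng = np.random.default_rng(2)
n, k = 30, 10
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # SPD test matrix (hypothetical)
Q, T = lanczos(A, rng.standard_normal(n), k)
```

After k steps, the columns of Q are orthonormal and T is the projection of A onto the Krylov space, which is exactly what the Lanczos-CG equivalence exploits.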
“…Still another formulation of 4DEnVar was suggested by Desroziers et al. (), based on the minimization of the following cost function in observation space:
$$ J(\delta \bar{y}) = \tfrac{1}{2}\, \delta \bar{y}^{\mathrm T} \left( \bar{H} \bar{B}_{\mathrm e} \bar{H}^{\mathrm T} + \bar{R} \right) \delta \bar{y} - \delta \bar{y}^{\mathrm T} \bar{d} $$
and $\delta \bar{x} = \bar{B}_{\mathrm e} \bar{H}^{\mathrm T} \delta \bar{y}$. In this case, a reduced-B conjugate gradient (Gürol et al. ) can be used, which is similar to a doubly preconditioned conjugate gradient but with the size of the control variable now equal to the number of observations $P$, and a transformation of the successive gradients, of the same size $P$, by the matrix $\bar{H}\, \bar{B}_{\mathrm e} \bar{H}^{\mathrm T}$:
$$ \bar{h} = \bar{H} \bar{B}_{\mathrm e} \bar{H}^{\mathrm T} \bar{g} = \bar{H} \dots $$
Section: 4DEnVar Formalism
confidence: 99%
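The duality this quote relies on, that the observation-space increment $\delta \bar{y}$ mapped back through $\bar{B}_{\mathrm e} \bar{H}^{\mathrm T}$ reproduces the primal increment, can be checked numerically. The direct solves below stand in for the reduced-B CG iteration, and all matrices are random stand-ins, not the ensemble covariances of the quoted work.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 50, 8                        # state size n much larger than observation count m
L = rng.standard_normal((n, n)) * 0.1
B = np.eye(n) + L @ L.T             # background error covariance (SPD)
H = rng.standard_normal((m, n))     # linearized observation operator
R = np.eye(m)                       # observation error covariance
d = rng.standard_normal(m)          # innovation vector

# Primal increment: minimize in state space (control vector of size n)
dx_primal = np.linalg.solve(
    np.linalg.inv(B) + H.T @ np.linalg.solve(R, H),
    H.T @ np.linalg.solve(R, d),
)

# Dual increment: minimize J(dy) = 0.5 dy^T (H B H^T + R) dy - dy^T d in
# observation space (control vector of size m), then map back to state space
dy = np.linalg.solve(H @ B @ H.T + R, d)
dx_dual = B @ H.T @ dy
```

By the Sherman-Morrison-Woodbury identity, (B⁻¹ + HᵀR⁻¹H)⁻¹HᵀR⁻¹ = BHᵀ(HBHᵀ + R)⁻¹, so the two increments coincide while the dual system is only m × m.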