2013
DOI: 10.1007/s10596-013-9351-5

Levenberg–Marquardt forms of the iterative ensemble smoother for efficient history matching and uncertainty quantification

Cited by 293 publications (262 citation statements)
References 17 publications

“…Gratton et al (2013) discuss application of GN in a trust region framework, which has the limitation that a portion of the computationally expensive outer loop increments will be rejected. Some authors have successfully applied the Levenberg-Marquardt algorithm in EnKF (e.g., Chen and Oliver, 2013; Mandel et al, 2016) by adding a regularization term to the cost function. That method requires one to perform the inner loop approximation of [H_{δv}]^{-1} multiple times, once for each value of a scalar regularization parameter.…”
Section: Nonlinear Optimization
Mentioning (confidence: 99%)
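The statement above notes that the regularized method must repeat the inner-loop solve for every trial value of the scalar regularization parameter. A minimal sketch of why, in generic incremental notation (the symbols J, R, B, x, and λ are illustrative assumptions here, not the cited papers' notation): the Gauss-Newton inner system becomes λ-dependent once the Levenberg-Marquardt term is added,

\[
\bigl(\mathbf{B}^{-1} + \mathbf{J}^{\mathsf T}\mathbf{R}^{-1}\mathbf{J}\bigr)\,\delta\mathbf{x} = -\nabla\mathcal{J}
\qquad\longrightarrow\qquad
\bigl(\mathbf{B}^{-1} + \mathbf{J}^{\mathsf T}\mathbf{R}^{-1}\mathbf{J} + \lambda\,\mathbf{I}\bigr)\,\delta\mathbf{x} = -\nabla\mathcal{J},
\]

so each candidate value of λ changes the system matrix, and the (approximate) inverse has to be recomputed rather than reused.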
“…One such study was performed by [75], who developed forms of the iterative ensemble smoother using the Levenberg-Marquardt method to increase computational efficiency and improve the quality of the history match. They applied this technique to the Brugge benchmark case and found that compared to the standard ensemble smoother, ensemble-based Gauss-Newton formulation, and the multiple data assimilation method, the Levenberg-Marquardt method gave the best results.…”
Section: History Matching As Precursor To Production Optimization
Mentioning (confidence: 99%)
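For readers unfamiliar with what a Levenberg-Marquardt form of the iterative ensemble smoother looks like in practice, here is a minimal, hypothetical NumPy sketch of one damped ensemble update of the approximate kind described in the surrounding statements. The function and variable names (lm_ies_update, C_d, lam, etc.) are illustrative assumptions, the noise handling is simplified, and this is not the paper's implementation.

import numpy as np

def lm_ies_update(M, D, d_obs, C_d, lam, rng=None):
    """One damped ensemble-smoother step (hypothetical sketch, not the paper's code).

    M     : (n_param, n_ens) ensemble of model parameters
    D     : (n_data, n_ens)  ensemble of simulated data, D[:, j] = g(M[:, j])
    d_obs : (n_data,)        observed data
    C_d   : (n_data, n_data) observation-error covariance
    lam   : scalar Levenberg-Marquardt damping parameter
    """
    rng = np.random.default_rng() if rng is None else rng
    n_ens = M.shape[1]

    # Ensemble anomalies, scaled so that dD @ dD.T approximates G C_M G^T
    dM = (M - M.mean(axis=1, keepdims=True)) / np.sqrt(n_ens - 1)
    dD = (D - D.mean(axis=1, keepdims=True)) / np.sqrt(n_ens - 1)

    # Perturbed observations: one noise realization per ensemble member
    E = rng.multivariate_normal(np.zeros_like(d_obs), C_d, size=n_ens).T
    innovation = d_obs[:, None] + E - D

    # The damping inflates the data-error term by (1 + lam): large lam gives
    # small, cautious updates; lam -> 0 recovers the undamped
    # (Gauss-Newton-like) ensemble-smoother step.
    S = dD @ dD.T + (1.0 + lam) * C_d
    K = dM @ (dD.T @ np.linalg.solve(S, innovation))
    return M + K

In an outer loop, lam would typically be reduced after an iteration that decreases the data mismatch and increased (with the step recomputed) otherwise; localization and truncated-SVD solves, which production implementations rely on, are omitted here.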
“…Gu and Oliver (2007) use an iterated ensemble Kalman filter (with randomization) in the state space, with a linearization of the observation operator obtained by a regression on the increments given by the ensemble. This approach was extended by Chen and Oliver (2013) to a Levenberg-Marquardt method, with the regularization done by a multiplicative inflation of the covariance in the linearized problem rather than adding a Tikhonov regularization term. Liu et al (2008, 2009) and Liu and Xiao (2013) minimize the (strong-constraint) 4DVAR objective function over linear combinations of the ensemble by computations in the observation space.…”
Section: Introduction
Mentioning (confidence: 99%)
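A hedged sketch of the contrast drawn above, in generic notation (G, C_M, C_D, δm, λ, and r are illustrative symbols, not the cited papers' exact notation): a Tikhonov-style step adds a damping term to the Hessian, while the multiplicative form rescales a covariance factor in the linearized system,

\[
\bigl(\mathbf{G}^{\mathsf T}\mathbf{C}_D^{-1}\mathbf{G} + \mathbf{C}_M^{-1} + \lambda\,\mathbf{I}\bigr)\,\delta\mathbf{m} = \mathbf{r}
\qquad\text{vs.}\qquad
\bigl(\mathbf{G}^{\mathsf T}\mathbf{C}_D^{-1}\mathbf{G} + (1+\lambda)\,\mathbf{C}_M^{-1}\bigr)\,\delta\mathbf{m} = \mathbf{r},
\]

where r denotes the negative gradient of the objective. In the approximate form that keeps only the data-mismatch part of the gradient, the second system is roughly equivalent to inflating C_D to (1+λ)C_D inside the gain, which is how the damping enters the ensemble sketch shown earlier.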
“…Bocquet and Sakov (2012) combined the IEnKF method of Sakov et al (2012) with an inflation-free approach to obtain a 4-D ensemble variational method, and with the Levenberg-Marquardt method by adding a diagonal regularization to the Hessian. Bocquet and Sakov (2012) and Chen and Oliver (2013) used Levenberg-Marquardt for faster convergence, as an adaptive method between the steepest descent and the Gauss-Newton method rather than to overcome divergence. Bocquet and Sakov (2012) also considered scaling the ensemble to approximate the tangent operators ("bundle variant") as in Sakov et al (2012).…”
Section: Introduction
Mentioning (confidence: 99%)
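To make the "adaptive between steepest descent and Gauss-Newton" remark concrete, a minimal sketch in generic notation (H is a Gauss-Newton Hessian approximation, g the gradient, λ the damping; the exact form of the diagonal regularization is assumed here, not taken from the cited papers):

\[
\bigl(\mathbf{H} + \lambda\,\mathbf{I}\bigr)\,\delta\mathbf{x} = -\mathbf{g}.
\]

As λ → 0 this recovers the Gauss-Newton step, while for large λ the step tends to a short steepest-descent step, δx ≈ -g/λ; decreasing λ after successful iterations and increasing it otherwise gives the adaptive behavior referred to above. (The classic Marquardt variant uses λ diag(H) in place of λ I, which scales the limit by the diagonal of H.)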