2009
DOI: 10.2118/118952-pa

History Matching With Parameterization Based on the Singular Value Decomposition of a Dimensionless Sensitivity Matrix

Abstract: In gradient-based automatic history matching, calculation of the derivatives (sensitivities) of all production data with respect to gridblock rock properties and other model parameters is not feasible for large-scale problems. Thus, the Gauss-Newton (GN) method and Levenberg-Marquardt (LM) algorithm, which require calculation of all sensitivities to form the Hessian, are seldom viable. For such problems, the quasi-Newton and nonlinear conjugate gradient algorithms present reasonable alternatives because these …
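The technique named in the title can be made concrete with a minimal numerical sketch: parameterize the estimation problem by the leading right singular vectors of a dimensionless sensitivity matrix, so that history matching updates only a few SVD coefficients instead of every gridblock property. The matrix shapes, the toy random stand-in, and the retained-energy cutoff below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only: a toy stand-in for a dimensionless sensitivity
# matrix G_D (rows = production data, columns = gridblock parameters),
# assumed already scaled by data-error and prior-model covariances.
rng = np.random.default_rng(0)
G_D = rng.standard_normal((200, 5000))

# Truncated SVD: the leading right singular vectors span the directions in
# parameter space to which the data are most sensitive.
U, s, Vt = np.linalg.svd(G_D, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.99)) + 1   # keep ~99% of the energy (assumed cutoff)
V_k = Vt[:k].T                               # (n_params, k) reduced basis

# History matching then estimates only the k coefficients alpha;
# the full-dimensional model is recovered as m = m_prior + V_k @ alpha.
m_prior = np.zeros(G_D.shape[1])
alpha = np.zeros(k)
m = m_prior + V_k @ alpha
print(f"reduced {G_D.shape[1]} parameters to {k} SVD coefficients")
```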

Cited by 46 publications (64 citation statements). References 35 publications (39 reference statements).
“…It can be seen that the initial inflation factor needs to be very high to satisfy the regularization conditions, but it decreases to a much smaller value after a few iterations. This result is consistent with previous works, including Tavakoli and Reynolds (2010), which show that controlling the change in parameters during the first few iterations is critical to avoid over-correction of model parameters and unrealistically rough property fields. Fig.…”
Section: Results (supporting)
confidence: 92%
“…One important lesson from previous work on Gauss-Newton methods is that excessive modification of the model parameters at each iteration is undesirable and often makes the model susceptible to overshooting (Li et al, 2003; Tavakoli and Reynolds, 2010). One way to alleviate this issue is to increase the inflation factor so that the change in model parameters at each iteration stays below a reasonable threshold.…”
Section: ES-MDA-RS (mentioning)
confidence: 99%
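The mechanism this citation describes can be sketched as a Levenberg-Marquardt-style safeguard: keep inflating the damping (inflation) factor until the proposed parameter change falls below a chosen threshold. This is my illustration of the idea, not code from the cited papers; the function name, the infinity-norm step measure, and the growth factor are assumptions.

```python
import numpy as np

def damped_update(G, r, C_m_inv, max_step, lam=1.0, growth=10.0):
    """Return a damped Gauss-Newton update dm with ||dm||_inf <= max_step.

    G       : sensitivity matrix (n_data x n_params)
    r       : data-residual vector (n_data,)
    C_m_inv : inverse prior covariance (n_params x n_params)
    """
    while True:
        # LM-style normal equations: (G^T G + lam * C_m_inv) dm = G^T r
        H = G.T @ G + lam * C_m_inv
        dm = np.linalg.solve(H, G.T @ r)
        if np.max(np.abs(dm)) <= max_step:
            return dm, lam
        lam *= growth  # step too aggressive: inflate damping and retry

# Toy usage with assumed shapes:
G = np.random.default_rng(1).standard_normal((50, 20))
dm, lam = damped_update(G, np.ones(50), np.eye(20), max_step=0.1)
```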
“…Even though these parameters are spatially correlated, there are typically many more unknowns in the parameter vector than can be resolved from the data d. Regularization in the form of prior geological knowledge is therefore essential (Gavalas et al 1976). Alternatively, various techniques to reduce the size of the parameter-estimation problem have been proposed using, e.g., zonation (Gavalas et al 1976), pilot points (Bissell et al 1997), wavelets (Sahni and Horne, 2005), eigenvalue decomposition of the parameter covariance matrix (Gavalas et al 1976) and its nonlinear version, kernel principal component analysis (Sarma et al 2007), the discrete cosine transform (Jafarpour and McLaughlin, 2009), compressed sensing (Jafarpour et al 2010), or sensitivity matrix decomposition (Tavakoli and Reynolds, 2010).…”
Section: Application Case Reservoir Engineering (mentioning)
confidence: 99%
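To make one of these reductions concrete, the following toy sketch (my own example, not code from the cited works) applies a discrete-cosine-transform parameterization in the spirit of Jafarpour and McLaughlin (2009): a 2D property field is represented by a small block of low-frequency DCT coefficients, and only those coefficients would be estimated during history matching. The field size and the 8x8 retained block are arbitrary assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(2)
perm = rng.standard_normal((64, 64))   # toy log-permeability field

coeffs = dctn(perm, norm="ortho")
mask = np.zeros_like(coeffs, dtype=bool)
mask[:8, :8] = True                    # retain 64 of 4096 coefficients
approx = idctn(np.where(mask, coeffs, 0.0), norm="ortho")

# History matching would estimate only the retained coefficients.
print("compression ratio:", perm.size // mask.sum())
```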
“…In this context, several studies have indicated significant scope for reservoir-model-based life-cycle optimization of ultimate recovery or NPV, especially when combined with computer-assisted history matching, leading to a closed-loop reservoir management (CLRM) approach; see e.g.… On the other hand, Tavakoli et al (2010) and Van Doren et al (2011) have used parameterizations of geological structures with sensitivity analyses indicating that only a reduced set of parameters can be validated from data. To properly include the effect of uncertainty, it is important to perform both the history matching and the life-cycle optimization using ensembles of multiple realizations.…”
Section: Introduction (mentioning)
confidence: 99%