2018
DOI: 10.2118/191305-pa

Correlation-Based Adaptive Localization for Ensemble-Based History Matching: Applied To the Norne Field Case Study

Abstract: Ensemble-based methods are among the state-of-the-art history-matching algorithms. However, in practice, they often suffer from ensemble collapse, a phenomenon that deteriorates history-matching performance. It is customary to equip an ensemble history-matching algorithm with a localization scheme to prevent ensemble collapse. Conventional localization methods use distances between the physical locations of model variables and observations to modify the degree of the observations' influence…

Cited by 29 publications (23 citation statements); references 6 publications.
“…In our SHM framework, however, the effective observations (e.g., wavelet coefficients) resulting from sparse data representation do not have associated physical locations, and this makes it challenging to apply distance-based localization. As a result, we adopt a correlation-based adaptive localization scheme instead, which helps overcome the aforementioned problem while achieving a few additional benefits (Luo et al., 2018b, 2019). More details of the implementation of correlation-based adaptive localization will also be provided in the case-study part later.…”
Section: Model Updates (mentioning)
confidence: 99%
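The excerpt above only names correlation-based adaptive localization; the sketch below illustrates the core idea rather than the authors' implementation. All function and variable names are ours, and the threshold of a few times 1/sqrt(Ne) (the sampling-noise level of a correlation estimated from Ne ensemble members) is one common choice, not necessarily the one used in Luo et al. (2018b, 2019).

```python
import numpy as np

def correlation_based_taper(M, D, noise_factor=3.0):
    """Hard-threshold taper for correlation-based adaptive localization (sketch).

    M : (Nm, Ne) ensemble of model parameters (one row per parameter)
    D : (Nd, Ne) ensemble of simulated observations (one row per observation)
    Returns a (Nm, Nd) 0/1 matrix applied elementwise to the Kalman gain.
    """
    Ne = M.shape[1]
    # Centered anomalies, normalized to unit length, so Am @ Ad.T is the
    # matrix of sample correlation coefficients.
    Am = M - M.mean(axis=1, keepdims=True)
    Ad = D - D.mean(axis=1, keepdims=True)
    Am /= np.linalg.norm(Am, axis=1, keepdims=True) + 1e-12
    Ad /= np.linalg.norm(Ad, axis=1, keepdims=True) + 1e-12
    corr = Am @ Ad.T                                  # (Nm, Nd) correlations
    # For truly uncorrelated pairs the sample correlation has a standard
    # deviation of roughly 1/sqrt(Ne); correlations below a few times that
    # level are treated as spurious and screened out.
    threshold = noise_factor / np.sqrt(Ne)
    return (np.abs(corr) > threshold).astype(float)

# Usage (illustrative): K_localized = correlation_based_taper(M, D) * K
```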
“…In this work, the TSVD is conducted following the previous work of Luo et al. (2019). On the other hand, we adopt correlation-based adaptive localization, since in practice it tends to be more flexible than distance-based localization, as elaborated in Luo et al. (2019).…”
Section: Model Updates Without Accounting for Model Errors (mentioning)
confidence: 99%
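As a rough illustration of the TSVD step mentioned above, the sketch below is a generic NumPy truncation of the scaled data-anomaly matrix, not the code of Luo et al. (2019); the 99% energy-retention criterion and all names are assumptions for the example.

```python
import numpy as np

def tsvd(A, energy=0.99):
    """Truncated SVD of a (scaled) data-anomaly matrix A (generic sketch).

    Keeps the leading singular values whose cumulative squared sum reaches
    the requested energy fraction, and returns the truncated factors.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    frac = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(frac, energy)) + 1        # number of modes kept
    return U[:, :r], s[:r], Vt[:r, :]

# Example: a pseudo-inverse built from the truncated factors, of the kind
# used when forming an ensemble-smoother update in a reduced subspace.
# Ur, sr, Vrt = tsvd(Delta_d / np.sqrt(Ne - 1))
# A_pinv = Vrt.T @ np.diag(1.0 / sr) @ Ur.T
```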
“…Iterative forms of the EnKF and ES, usually denoted IEnKF (Gu and Oliver, 2007; Sakov et al., 2012) and IES (Chen and Oliver, 2013; Emerick and Reynolds, 2013; Luo et al., 2015; Chang et al., 2017; Li et al., 2018), have been developed to improve assimilation performance in scenarios characterized by strongly nonlinear behaviors. A variety of studies investigate challenges linked to such (ensemble) data assimilation algorithms, including, e.g., the possibility of coping with non-Gaussian model parameter distributions (Zhou et al., 2011; Li et al., 2018), unphysical results stemming from the estimation workflow (Wen and Chen, 2006; Song et al., 2014), or spurious correlations (Panzeri et al., 2013; Bauser et al., 2018; Luo et al., 2019; Soares et al., 2019). All of these works contribute to improving the robustness of these algorithms for parameter estimation in complex environmental systems.…”
Section: Introduction (mentioning)
confidence: 99%
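Of the iterative schemes listed in this excerpt, ES-MDA (Emerick and Reynolds, 2013) is perhaps the simplest to sketch: the same ensemble-smoother update is repeated Na times with the observation-error covariance inflated by coefficients alpha_i satisfying sum(1/alpha_i) = 1. The sketch below uses our own notation, assumes perturbed observations with a diagonal error covariance, and illustrates the general scheme rather than any cited implementation.

```python
import numpy as np

def esmda_update(M, D, d_obs, Cd_diag, alpha, rng):
    """One ES-MDA iteration with inflated observation noise (sketch).

    M       : (Nm, Ne) parameter ensemble     D       : (Nd, Ne) predicted data
    d_obs   : (Nd,)    observed data          Cd_diag : (Nd,) obs-error variances
    alpha   : inflation coefficient for this iteration (sum of 1/alpha_i = 1)
    """
    Ne = M.shape[1]
    Am = M - M.mean(axis=1, keepdims=True)
    Ad = D - D.mean(axis=1, keepdims=True)
    Cmd = Am @ Ad.T / (Ne - 1)                        # parameter-data cross-covariance
    Cdd = Ad @ Ad.T / (Ne - 1)                        # predicted-data covariance
    # Perturb the observations with the inflated error covariance.
    E = rng.normal(size=(len(d_obs), Ne)) * np.sqrt(alpha * Cd_diag)[:, None]
    Dobs = d_obs[:, None] + E
    K = Cmd @ np.linalg.inv(Cdd + alpha * np.diag(Cd_diag))
    return M + K @ (Dobs - D)

# Outer loop (illustrative): with alphas = [4, 4, 4, 4], call esmda_update once
# per alpha and rerun the forward model after each call to refresh D.
```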
“…There are also schemes that update the model parameters selectively in the EnKF. These are referred to as localization schemes, and they can be divided into distance-based (Chen and Oliver, 2010; Jung et al., 2017b), attribute-based (e.g., based on streamlines or temperature distributions) (Arroyo-Negrete et al., 2008; Huang and Zeng, 2016), and correlation-based approaches (Luo et al., 2018, 2019).…”
Section: Introduction (mentioning)
confidence: 99%
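For contrast with the correlation-based taper sketched earlier, distance-based localization weights each parameter-observation pair by a compactly supported function of their physical separation; the Gaspari-Cohn fifth-order polynomial is a standard choice. The sketch below is generic and not tied to any of the cited implementations; the localization radius is a user-supplied assumption.

```python
import numpy as np

def gaspari_cohn(dist, radius):
    """Gaspari-Cohn compactly supported taper for distance-based localization.

    dist   : array of physical distances between parameters and observations
    radius : localization half-width c; the taper vanishes beyond 2 * c
    """
    z = np.asarray(dist, dtype=float) / radius
    taper = np.zeros_like(z)
    inner = z <= 1.0
    outer = (z > 1.0) & (z <= 2.0)
    zi, zo = z[inner], z[outer]
    taper[inner] = (-0.25 * zi**5 + 0.5 * zi**4 + 0.625 * zi**3
                    - 5.0 / 3.0 * zi**2 + 1.0)
    taper[outer] = (zo**5 / 12.0 - 0.5 * zo**4 + 0.625 * zo**3
                    + 5.0 / 3.0 * zo**2 - 5.0 * zo + 4.0 - 2.0 / (3.0 * zo))
    return taper

# The (Nm, Nd) matrix gaspari_cohn(pairwise_dist, c) is applied elementwise to
# the Kalman gain, in the same place the correlation-based taper above would be.
```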