2017
DOI: 10.1002/2016wr020299
Large‐scale inverse model analyses employing fast randomized data reduction

Abstract: When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computati…
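The abstract describes compressing a very large observation set before running the geostatistical inversion. A minimal sketch of that idea, using a Gaussian random projection on a synthetic linear inverse problem (the matrix sizes, noise level, and least-squares solver here are illustrative assumptions, not details taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 10_000, 50, 200  # many observations, few parameters, sketch size

H = rng.standard_normal((m, n))                  # forward/sensitivity matrix
x_true = rng.standard_normal(n)
y = H @ x_true + 0.01 * rng.standard_normal(m)   # noisy observations

# Gaussian sketch: compress the m-dimensional observation space to k dimensions,
# then solve the (much smaller) sketched least-squares problem.
S = rng.standard_normal((k, m)) / np.sqrt(k)
x_full, *_ = np.linalg.lstsq(H, y, rcond=None)
x_sketch, *_ = np.linalg.lstsq(S @ H, S @ y, rcond=None)

# The sketched solution should stay close to the full-data solution.
rel_err = np.linalg.norm(x_sketch - x_full) / np.linalg.norm(x_full)
print(rel_err)
```

The point of the sketch is that only the k-row system `S @ H` ever needs to be formed and solved, so the cost of the update no longer scales with the raw observation count m.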

Cited by 25 publications (18 citation statements). References 62 publications.
“…Data worth analyses effectively allocate a sampling budget by identifying the most valuable data for measuring the outcome of interest, given the current level of understanding. Data worth analyses include formal techniques that use quantitative metrics to identify the most valuable data and informal methods that rely on intuition or trial-and-error (Çamdevýren et al., 2005; Fernández-Gálvez et al., 2006; James & Gorelick, 1994; Jolliffe, 2011; Kikuchi, 2017; Lin et al., 2017; Mogheir et al., 2003; Strebelle, 2002; Sun et al., 2013; Tiedeman et al., 2003; Vereecken et al., 2008). Informal and formal methods are complementary, with the joint goal of identifying field measurements that are both informative and efficient.…”
Section: Focus and Scope
confidence: 99%
“…Simulating transport in hydrologic systems is critical for numerous applications including groundwater contamination (Cvetkovic et al., 2004; National Research Council, 1996; Neuman, 2005; O'Malley & Vesselinov, 2014a), hydrocarbon extraction (Hyman, Jiménez-Martínez, et al., 2016; Karra et al., 2015), and detection of underground nuclear explosions (Jordan et al., 2014). These applications often demand the use of large-scale models, making the simulation of transport computationally demanding (Lichtner et al., 2015), so the use of computationally efficient graph-based models can provide significant benefits to techniques that require a large number of model runs, such as model calibration (Lin et al., 2017) or uncertainty quantification (O'Malley & Vesselinov, 2014b).…”
Section: Reduced Order Models of DFNs: Graph Flow and Transport Prediction
confidence: 99%
“…Lee et al. () pointed out that GA can be improved further by using randomized low‐rank matrix approximations to precondition the linear GA update equations. Lin et al. () demonstrated, similarly, that the scalability of GA can be substantially improved by using randomized sketching to reduce the effective size of the observation space.…”
Section: Introduction
confidence: 98%
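The randomized low‐rank matrix approximation mentioned in this last statement can be illustrated with a basic randomized range finder in the style of Halko, Martinsson, and Tropp; the matrix below is a synthetic low-rank stand-in (e.g., for a prior covariance with a rapidly decaying spectrum), not a matrix from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 2000, 1500, 20

# Build an exactly rank-r matrix with a rapidly decaying spectrum.
U = np.linalg.qr(rng.standard_normal((m, r)))[0]
V = np.linalg.qr(rng.standard_normal((n, r)))[0]
A = U @ np.diag(np.logspace(0, -4, r)) @ V.T

# Randomized range finder: probe the range of A with a random test matrix,
# orthonormalize, then compute an SVD of the small projected matrix.
k = r + 10                                  # target rank plus oversampling
Omega = rng.standard_normal((n, k))
Q = np.linalg.qr(A @ Omega)[0]              # orthonormal basis for the sampled range
B = Q.T @ A                                 # small k x n matrix
Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
A_approx = (Q @ Ub) @ np.diag(s) @ Vt

rel_err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
print(rel_err)
```

Because only k matrix-vector products with A and one SVD of a k-by-n matrix are needed, this kind of sketch is what makes low-rank preconditioning of the GA update equations affordable at scale.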