2016
DOI: 10.1002/nme.5342

A Stochastic Simplex Approximate Gradient (StoSAG) for optimization under uncertainty

Abstract: SUMMARY: We consider a technique to estimate an approximate gradient using an ensemble of randomly chosen control vectors, known as Ensemble Optimization (EnOpt) in the oil and gas reservoir simulation community. In particular, we address how to obtain accurate approximate gradients when the underlying numerical models contain uncertain parameters because of geological uncertainties. In that case, 'robust optimization' is performed by optimizing the expected value of the objective function over an ensemble of ge…
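As a rough illustration of the ensemble idea summarized above (not the authors' exact StoSAG formulation), the following is a minimal sketch. It assumes a black-box objective(u, m) that evaluates a control vector u on one geological realization m, builds a per-realization least-squares (simplex) gradient from random control perturbations, and averages over realizations for the robust gradient. The function names, the perturbation scheme, and the pairing of perturbations with realizations are illustrative assumptions only.

```python
import numpy as np

def ensemble_gradient(objective, u, realizations, n_perturb=10, sigma=0.1, rng=None):
    """Sketch of an ensemble (StoSAG-style) approximate gradient.

    objective(u, m) -> scalar objective of control vector u on geological
    realization m; the simulator is treated as a black box.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_u = u.size
    grad = np.zeros(n_u)
    for m in realizations:                      # robust objective: average over realizations
        J0 = objective(u, m)
        dU = np.empty((n_perturb, n_u))         # control perturbations
        dJ = np.empty(n_perturb)                # corresponding objective changes
        for i in range(n_perturb):
            du = sigma * rng.standard_normal(n_u)
            dU[i] = du
            dJ[i] = objective(u + du, m) - J0
        # least-squares (simplex) gradient for this realization: dJ ~ dU @ g
        g_m, *_ = np.linalg.lstsq(dU, dJ, rcond=None)
        grad += g_m
    return grad / len(realizations)
```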

Cited by 146 publications (61 citation statements)
References 43 publications
“…For our study, it was a stochastic search direction. The StoSAG algorithm proposed by Fonseca et al. [33] with the stochastic search direction, described as follows, was used to maximize the augmented Lagrangian objective function L_a(u, λ, µ):…”
Section: StoSAG Gradient Computation
confidence: 99%
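As a rough sketch of how such a stochastic search direction might drive the maximization of L_a(u, λ, µ), the loop below performs normalized steepest ascent with simple bound clipping. The grad_estimate callable, the step size, and the bound handling are assumptions for illustration (grad_estimate could be an ensemble gradient such as the sketch above); the updates of the multipliers λ and penalty µ in the augmented Lagrangian outer loop are not shown and are not taken from the cited work.

```python
import numpy as np

def maximize_L_a(grad_estimate, u0, step=0.05, n_iter=100, u_min=0.0, u_max=1.0):
    """Sketch of steepest ascent on an augmented Lagrangian L_a(u, lambda, mu)
    using a stochastic (ensemble-based) search direction."""
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(n_iter):
        d = grad_estimate(u)                       # approximate gradient of L_a at u
        d = d / (np.linalg.norm(d) + 1e-12)        # normalized search direction
        u = np.clip(u + step * d, u_min, u_max)    # simple bound constraints on controls
        # (lambda and mu would be updated in an outer loop; not shown here)
    return u
```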
“…The results indicated that CRM models have high potential to serve as a cogent proxy model for waterflooding-related decision-making and obtain robust results that lead to a near-optimal solution. Recently, a novel ensemble-based technique, the stochastic simplex approximate gradient (StoSAG), was developed by Fonseca et al. [33,34]. StoSAG treats the reservoir simulator as a black box and approximates the gradient from the inputs and outputs of all the ensemble runs.…”
Section: Introduction
confidence: 99%
“…At each iteration of the so-called inner loop, the newly developed StoSAG optimization algorithm (Fonseca et al., 2016) is applied to solve the above-mentioned augmented Lagrangian function by considering hybrid nonlinear constraints.…”
Section: P1
confidence: 99%
“…For the use in robust optimization, see van Essen et al. (2009); for a general overview, see Jansen (2011). Alternative, less code-intrusive, robust methods use approximate gradient and/or stochastic methods (Chen et al. 2009; Chen and Oliver 2010; Li et al. 2013; Fonseca et al. 2015, 2016) or 'non-classical' methods such as streamline methods (Alhutali et al. 2008), evolutionary strategies (Pajonk et al. 2011), or polynomial chaos expansions in combination with response surfaces (Babaei et al. 2015), with further references given in Echeverría Ciaurri et al. (2011).…”
Section: Application Case Reservoir Engineering - Long-Term Reservoir …
confidence: 99%