The research presented in this paper was partly funded by the Integrated Systems Approach to Petroleum Production (ISAPP) and Recovery Factory (RF) projects. The 'Egg Model' is a synthetic reservoir model consisting of an ensemble of 101 relatively small three-dimensional realizations of a channelized oil reservoir produced under water-flooding conditions with eight water injectors and four oil producers. It has been used in numerous publications to demonstrate a variety of aspects of computer-assisted flooding optimization and history matching. Unfortunately, in several of these publications the parameter settings are not identical and not fully documented. We present a 'standard version' of the Egg Model, which is meant to serve as a test case in future publications, together with a dataset of 100 permeability realizations in addition to the permeability field used for the standard model. We implemented and tested the model in four reservoir simulators: Dynamo/Mores (Shell), Eclipse (Schlumberger), AD-GPRS (Stanford University) and MRST (Sintef), which produced near-identical output. This article describes the input parameters of the standard model. Together with the input files for the various simulators, it has been uploaded to the 3TU.Datacentrum repository with free access to external users.
SUMMARY
We consider a technique to estimate an approximate gradient using an ensemble of randomly chosen control vectors, known in the oil and gas reservoir simulation community as Ensemble Optimization (EnOpt). In particular, we address how to obtain accurate approximate gradients when the underlying numerical models contain uncertain parameters because of geological uncertainties. In that case, 'robust optimization' is performed by optimizing the expected value of the objective function over an ensemble of geological models. In earlier publications, based on the pioneering work of Chen et al. (2009), it has been suggested that a straightforward one-to-one combination of random control vectors and random geological models is capable of generating sufficiently accurate approximate gradients. However, this form of EnOpt does not always yield satisfactory results. A recent article formulates a modified EnOpt algorithm, referred to here as the Stochastic Simplex Approximate Gradient (StoSAG; referred to in earlier publications as 'modified robust EnOpt'), and shows, via computational experiments, that StoSAG generally yields significantly better gradient approximations than the standard EnOpt algorithm. Here, we provide theoretical arguments to show why StoSAG is superior to EnOpt.
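The difference between the two gradient formulations can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation: the objective functions, ensemble, and geological-model objects are hypothetical, and covariance scaling factors are omitted since only the gradient direction matters for the comparison.

```python
import numpy as np

def enopt_gradient(J, U):
    """Standard EnOpt: cross-covariance between perturbed control vectors
    and objective values, using ensemble means as the common baseline."""
    dU = U - U.mean(axis=0)               # control deviations from ensemble mean
    j = np.array([J(u) for u in U])
    dj = j - j.mean()                     # objective deviations from ensemble mean
    return dU.T @ dj / len(U)             # (unscaled) approximate gradient

def stosag_gradient(J_per_model, u_mean, U, models):
    """StoSAG sketch (robust case): each perturbed control u_i is paired with
    geological model m_i, but the baseline J(u_mean, m_i) is evaluated on the
    SAME model, removing the model-to-model variability that pollutes the
    ensemble-mean baseline used by standard EnOpt."""
    g = np.zeros_like(u_mean)
    for u_i, m_i in zip(U, models):
        g += (J_per_model(u_i, m_i) - J_per_model(u_mean, m_i)) * (u_i - u_mean)
    return g / len(U)
```

The key design difference is the baseline: standard EnOpt subtracts one ensemble-averaged objective value, so differences caused by varying geology contaminate the control-to-objective correlation, whereas StoSAG differences each perturbed control against the unperturbed control on the same realization.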
SUMMARY
We consider robust ensemble-based multi-objective optimization using a hierarchical switching algorithm for combined long-term and short-term water-flooding optimization. We apply a modified formulation of the ensemble gradient, which results in improved performance compared to earlier formulations. We also apply multi-dimensional scaling to visualize projections of the high-dimensional search space, to aid in understanding the complex nature of the objective-function surface and the performance of the optimization algorithm. This provides insight into the quality of the gradient and confirms the presence of ridges in the objective-function surface that can be exploited for multi-objective optimization. We used an 18,553-gridblock model of a channelized reservoir with 4 producers and 8 injectors. The controls were the flow rates in the injectors, and the long-term and short-term objective functions were undiscounted net present value (NPV) and highly discounted (25%) NPV, respectively. We achieved an increase of 15.2% in the secondary objective for a decrease of 0.5% in the primary objective, averaged over 100 geological realizations. The total number of reservoir simulations was around 20,000, which indicates the potential of the ensemble optimization method for robust multi-objective optimization of medium-sized reservoir models.
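The two objectives described above differ only in the discount rate applied to the cash flows. A minimal sketch of the NPV computation (with hypothetical cash flows on yearly time points; real reservoir models discount per simulation time step):

```python
import numpy as np

def npv(cashflows, times_years, discount_rate):
    """Net present value of a series of cash flows.
    discount_rate = 0 gives the undiscounted (long-term) objective;
    a high rate such as 0.25 weights early production heavily and
    therefore acts as the short-term objective."""
    t = np.asarray(times_years, dtype=float)
    c = np.asarray(cashflows, dtype=float)
    return float(np.sum(c / (1.0 + discount_rate) ** t))
```

For example, three yearly cash flows of 100 give an undiscounted NPV of 300, but only 195.2 at a 25% discount rate, which is why the discounted objective rewards shifting production to earlier years.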
Although ensemble optimization (EnOpt) is increasingly applied to production optimization, the theoretical understanding of the quality of the ensemble gradient has received little attention. An important factor that influences the quality of the gradient estimate is the number of samples. In this study we use principles from statistical hypothesis testing to quantify the number of samples needed to estimate an ensemble gradient that is comparable in quality to an accurate adjoint gradient. We develop a methodology to estimate the ensemble size necessary to obtain an approximate gradient that is within a predefined angle of the adjoint gradient, with a predefined statistical confidence. The method is first applied to the Rosenbrock function (a standard optimization test problem) for a single realization, and subsequently for a case with uncertainty, represented by multiple realizations (robust optimization). The maximum allowed error in both experiments is a 10° angle between the directions of the EnOpt gradient and the exact gradient. For the single-realization case we need, depending on the perturbation size, 900, 5, or 3 samples to estimate a 'good' gradient with 95% confidence at 50 points in the optimization space for 50 different random sequences. For the robust case, the conventional EnOpt approach is to couple one model realization with one control sample, which leads to a computationally efficient technique to estimate a mean gradient. However, our results show that the original one-to-one pairing of model realizations and control samples is not sufficient to achieve 95% confidence. Achieving the required confidence with the original formulation requires a ratio of 1:1100, i.e. each model realization must be paired with 1100 control samples. With a modified formulation, however, a ratio of 1:10 suffices to stay within the maximum allowed error for 95% of the points, and a 1:1 ratio is sufficient for 85% of the points.
We also tested our methodology on a reservoir model, for both the deterministic and the robust case, and observed similar trends in the results. Our results provide insight into the number of samples required for EnOpt, in particular for robust optimization, to achieve a gradient comparable to an adjoint gradient.
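The core measurement in such a test, the angle between an ensemble gradient and the exact gradient, can be sketched for the Rosenbrock function as follows. This is a simplified illustration; the sample count and perturbation size below are chosen arbitrarily, not taken from the study.

```python
import numpy as np

def rosenbrock(x):
    """Standard 2-D Rosenbrock test function, minimum at (1, 1)."""
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    """Exact (analytic) gradient, standing in for an adjoint gradient."""
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                     200 * (x[1] - x[0] ** 2)])

def ensemble_gradient(f, x, n_samples, sigma, rng):
    """Simplex-style approximate gradient: average of function differences
    along random Gaussian perturbations, as in EnOpt-type methods."""
    g = np.zeros_like(x, dtype=float)
    for _ in range(n_samples):
        du = sigma * rng.standard_normal(x.shape)
        g += (f(x + du) - f(x)) * du
    return g / (n_samples * sigma ** 2)

def angle_deg(a, b):
    """Angle between two gradient directions, in degrees."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
```

Repeating this measurement over many points and random sequences, and counting how often the angle stays below the 10° threshold, gives the empirical confidence level used to size the ensemble.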
In an earlier study, two hierarchical multi-objective methods were suggested to include short-term targets in life-cycle production optimization. However, this earlier study has two limitations: 1) the adjoint formulation is used to obtain gradient information, which requires access to the simulator source code and an extensive implementation effort, and 2) one of the two proposed methods relies on the Hessian matrix, which is obtained by a computationally expensive method. To overcome the first of these limitations, we used ensemble-based optimization (EnOpt). EnOpt does not require source code access and is relatively easy to implement. To address the second limitation, we used the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm to obtain an approximation of the Hessian matrix. We performed experiments in which a water flood was optimized in a geologically realistic multi-layer sector model. The controls were inflow-control-valve settings at pre-defined time intervals. Undiscounted net present value (NPV) and highly discounted NPV were the long-term and short-term objective functions, respectively. We obtained an increase of approximately 14% in the secondary objective for a decrease of only 0.2-0.5% in the primary objective. The study demonstrates that ensemble-based hierarchical multi-objective optimization can achieve results of practical value in a computationally efficient manner.
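The BFGS approximation referred to above builds up curvature information from successive gradient evaluations instead of computing the Hessian directly. A minimal sketch of one update of the inverse-Hessian approximation (the standard BFGS formula, not the authors' code):

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H, given a
    control step s = u_new - u_old and the corresponding gradient change
    y = g_new - g_old. This avoids forming the Hessian explicitly, which
    the abstract notes is computationally expensive.
    Assumes the curvature condition y . s > 0 holds."""
    rho = 1.0 / float(y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T
    return V @ H @ V.T + rho * np.outer(s, s)
```

By construction the updated matrix satisfies the secant condition H_new @ y = s, so curvature along the most recent step is reproduced exactly while the rest of H is perturbed as little as possible.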