The experimental design method is an alternative to traditional sensitivity analysis. The basic idea behind this methodology is to vary multiple parameters at the same time so that maximum inference can be attained at minimum cost. Once the appropriate design is established and the corresponding experiments (simulations) are performed, the results can be investigated by fitting them to a response surface. This surface is usually an analytical or a simple numerical function that is cheap to sample; therefore it can be used as a proxy for reservoir simulation to quantify the uncertainties. Designing an efficient sensitivity study poses two main issues: (1) designing a parameter-space sampling strategy and carrying out the experiments, and (2) analyzing the results of the experiments (response surface generation). In this paper we investigate these steps by testing various experimental designs and response surface methodologies on synthetic and real reservoir models. We compared conventional designs, such as Plackett-Burman, central composite and D-optimal designs, with a space-filling design technique that aims at optimizing the coverage of the parameter space. We analyzed these experiments using linear and second-order polynomials as well as more complex response surfaces such as kriging, splines and neural networks. We compared these response surfaces in terms of their capability to estimate the statistics of the uncertainty (i.e., the P10, P50 and P90 values), their estimation accuracy and their capability to identify the influential parameters (heavy-hitters). Comparison with our exhaustive simulations showed that experiments generated by the space-filling design and analyzed with kriging, splines and quadratic polynomials gave the greatest accuracy, while traditional designs and the associated response surfaces performed poorly for some of the cases we studied.
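The design-and-analysis loop described above can be sketched in a few lines of Python: a space-filling (Latin hypercube) design, a full quadratic response surface fitted by least squares, and a Monte Carlo pass over the cheap proxy to extract P10/P50/P90. The `simulator` function, the two-parameter space and the uniform parameter distributions are all hypothetical stand-ins for a real reservoir simulator and its inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims, rng):
    """Space-filling design: one stratified sample per bin in every dimension."""
    perms = np.argsort(rng.random((n_samples, n_dims)), axis=0)
    return (perms + rng.random((n_samples, n_dims))) / n_samples

def quadratic_features(x):
    """Full second-order polynomial basis: 1, x_i, x_i * x_j (i <= j)."""
    n, d = x.shape
    cols = [np.ones(n)] + [x[:, i] for i in range(d)]
    cols += [x[:, i] * x[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

# Hypothetical "expensive simulator": a smooth nonlinear response that
# stands in for a reservoir simulation run.
def simulator(x):
    return 3.0 * x[:, 0] ** 2 + 2.0 * x[:, 0] * x[:, 1] - x[:, 1] + 5.0

# 1. Design the experiments and "run" them.
X = latin_hypercube(40, 2, rng)
y = simulator(X)

# 2. Fit the quadratic response surface by least squares.
beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)

# 3. Monte Carlo on the cheap proxy to estimate P10/P50/P90 (uniform
#    parameter distributions are assumed here).
mc = rng.random((100_000, 2))
proxy = quadratic_features(mc) @ beta
p10, p50, p90 = np.percentile(proxy, [10, 50, 90])
```

Because the stand-in response is itself quadratic, the proxy reproduces it essentially exactly; with a real simulator, the fit quality would have to be checked against hold-out simulation runs.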
We also found good agreement between the polynomials and the complex response surfaces in terms of estimating the effect of each parameter on the response.

Introduction

Reservoir simulators are capable of integrating detailed static geological information with dynamic engineering data to represent the complex fluid flow in porous media. Therefore, they have been used extensively for planning and evaluating field development projects. Usually, economic parameters such as net present value (NPV) or recovery estimates such as cumulative oil production are used to assess the value of the different alternatives in a development study. Since most of the inputs to the simulation studies are usually uncertain and uncontrollable (like static reservoir properties), many sensitivity studies have to be performed, which might be prohibitive due to costly simulations. Experimental design methodology not only offers an efficient way of assessing uncertainties by providing inference with a minimum number of simulations, but can also identify the key parameters governing uncertainty in economic and production forecasts, which might guide the data acquisition strategy during the early phases of a field development project.[1] The commonly used workflow for this purpose is as follows:

1. Define a large set of potential key parameters and their probability distributions.
2. Perform a low-level experimental design study, such as Plackett-Burman, which combines the high and low values of the key parameters.
3. Perform simulations corresponding to each of the experiments.
4. Fit the economic or recovery estimates obtained from the simulations to a simple response surface, which is usually a line.
5. Using the probability distributions attached to the parameters, perform a Monte Carlo simulation on the response surface.
6. Generate a tornado diagram to rank the effect of each parameter on the economic or recovery estimates.
7. Screen the heavy-hitters from the tornado diagram.
8. Perform a more detailed design, such as full/fractional factorial, D-optimal, Box-Behnken, or central composite, with the heavy-hitters.
9. Repeat steps 3 and 4.
10. Perform a Monte Carlo simulation on the new response surface to get the probability density function (pdf) of the economic or recovery estimates.
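The screening half of this workflow can be sketched with a small two-level design and a main-effects fit. A full factorial stands in here for a Plackett-Burman design, and the parameter names, the response function and the screening threshold are all hypothetical.

```python
import itertools
import numpy as np

# Hypothetical response: cumulative oil as a function of three coded
# parameters (-1 = low value, +1 = high value of each uncertainty).
def response(perm, poro, aquifer):
    return 100.0 + 30.0 * perm + 12.0 * poro + 2.0 * aquifer

# Two-level design over the three parameters (a full factorial here,
# standing in for a Plackett-Burman screening design).
design = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))
y = np.array([response(*row) for row in design])

# Fit the linear (main-effects) response surface; with coded +/-1
# levels, each main effect is twice the fitted coefficient.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Tornado ranking: sort parameters by absolute effect size and screen
# the heavy-hitters above an (arbitrary) threshold.
names = ["perm", "poro", "aquifer"]
tornado = sorted(zip(names, 2.0 * coef[1:]), key=lambda t: -abs(t[1]))
heavy_hitters = [name for name, effect in tornado if abs(effect) > 10.0]
```

The sorted effect sizes are exactly what a tornado diagram plots; the heavy-hitters would then be carried into the more detailed second-stage design.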
History matching is the process of updating a petroleum reservoir model using production data. It is a required step before a reservoir model is accepted for forecasting production. The process is normally carried out by flow simulation, which is very time-consuming. As a result, only a small number of simulation runs are conducted and the history matching results are normally unsatisfactory. In this work, we introduce a methodology that uses genetic programming (GP) to construct a proxy for the reservoir simulator. Acting as a surrogate for the computer simulator, the "cheap" GP proxy can evaluate a large number (millions) of reservoir models within a very short time frame. Collectively, the identified good-matching reservoir models provide us with comprehensive information about the reservoir. Moreover, we can use these models to forecast future production, which is closer to reality than forecasts derived from a small number of computer simulation runs. We have applied the proposed technique to a West African oil field that has complex geology. The results show that GP is able to deliver high-quality proxies. Meanwhile, important information about the reservoir was revealed by the study. Overall, the project successfully achieved the goal of improving the quality of the history matching results without increasing the number of reservoir simulation runs. This result suggests this novel history matching approach might be effective for other reservoirs with complex geology or a significant amount of production data.
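A minimal sketch of the genetic-programming idea, assuming a toy symbolic-regression setup rather than the authors' actual implementation: expression trees over the model parameters are evolved until they reproduce a handful of "expensive" simulator outputs, after which the best tree can be evaluated millions of times at negligible cost. The training response, the two parameters and all GP settings here are invented for illustration.

```python
import math
import operator
import random

random.seed(1)

OPS = [(operator.add, "+"), (operator.sub, "-"), (operator.mul, "*")]

def random_tree(depth):
    """Grow a random expression tree over variables x0, x1 and constants."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x0", "x1", random.uniform(-2.0, 2.0)])
    op, sym = random.choice(OPS)
    return (op, sym, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x0, x1):
    """Recursively evaluate an expression tree at a point."""
    if tree == "x0":
        return x0
    if tree == "x1":
        return x1
    if isinstance(tree, float):
        return tree
    op, _, left, right = tree
    return op(evaluate(left, x0, x1), evaluate(right, x0, x1))

# A handful of "expensive simulator" runs used as training data; the GP
# proxy is evolved to reproduce these simulated responses.
train = [(a / 4.0, b / 4.0) for a in range(5) for b in range(5)]
target = [x0 * x1 + x0 for x0, x1 in train]     # hypothetical response

def fitness(tree):
    err = sum((evaluate(tree, x0, x1) - t) ** 2
              for (x0, x1), t in zip(train, target))
    return err if math.isfinite(err) else float("inf")

def mutate(tree):
    """Subtree mutation: occasionally replace a node with a fresh subtree."""
    if random.random() < 0.2:
        return random_tree(2)
    if isinstance(tree, tuple):
        op, sym, left, right = tree
        return (op, sym, mutate(left), mutate(right))
    return tree

# Evolve with elitism: the best trees always survive, so the proxy's
# training error can only improve over the generations.
pop = [random_tree(3) for _ in range(300)]
init_best = min(fitness(t) for t in pop)
for gen in range(30):
    pop.sort(key=fitness)
    survivors = pop[:100]
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(200)]
best = min(pop, key=fitness)
```

Once evolved, `best` is an ordinary nested Python expression, so screening millions of candidate models against it costs microseconds per model rather than a full simulation run.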
The Britannia Field, located 210 km northeast of Aberdeen, is the largest Lower Cretaceous condensate-gas accumulation in the North Sea. The field spans four UK blocks (250 km2) and is a combination structural/stratigraphic trap. Assessing the remaining uncertainty in a field the size of Britannia is crucial to future development planning scenarios. After five years of field life, it has proved vital to take the 'reservoir pulse' and to evaluate the key reservoir parameters that impact the future development of the field. The Britannia reservoir consists of deep-water mass-flow sandstones, ranging from high-density 'clean' turbidites to non-reservoir facies consisting of 'mixed-slurry' and debris flows. Post-depositional remobilization and slumping have modified the original depositional fabric; however, a wealth of core data (over 20 000 ft), along with 63 logged wells and dynamic production data, have allowed refining of the flow unit definition within the reservoir. A rigorous re-correlation study based on the integration of biostratigraphy, chemostratigraphy and pressure data has defined an overall sheet-like reservoir architecture, with some units exhibiting a more complex channel-like geometry (Zones 30/40). The Britannia reservoir team has capitalized on recent advances in reservoir modelling technology, including a more rigorous approach to uncertainty analysis in both the static and dynamic realm. The results from a dynamic uncertainty study have provided key information on the range and magnitude of key petrophysical parameters and their impact on reservoir performance and ultimate recovery. Weighing the impact of a deterministic versus stochastic approach has also been a critical factor in the facies distribution. The major uncertainties in the dynamic reservoir model that impact reserves and recovery factor are (1) original gas in place, (2) fault and/or stratigraphic transmissibility 'baffles', (3) effective permeability, and (4) condensate banking. In the static geological model, Jacta, a Gocad uncertainty module, has led to over 300 realizations of the full field model.
In the dynamic realm, experimental design was used to monitor reservoir performance by capturing the variables having the largest impact on dynamic flow behaviour and gas recovery. DVD: Core display E4 is relevant to this chapter and can be viewed on the accompanying DVD. This study addresses the remaining uncertainty in Britannia Field after five years of production history, and the use of uncertainty modelling tools in both the static and dynamic realms, to address better reservoir prediction in a complex turbidite reservoir. This paper is an update of, and complement to, the previously published Britannia reservoir modelling study (Jones et al. 1999), which relied solely on the static pre-production well information. The integration of five years of production history has allowed testing of the key reservoir parameters in the dynamic simulation model, so that uncertainty ranges can be estimated and these same parameters can be defined more accurately in the static geological model. The Early Cretaceous Britannia reservoir has been the subject of numerous publications. The most significant is a thematic set on the Lower Cretaceous ...
Abstract

Understanding the impact of subsurface uncertainties on production responses is an integral part of the decision-making process. A more accurate quantification of the uncertainty band around production forecasts contributes to better business decisions. Traditional experimental design workflows, where a limited set of models represents the key uncertainties in subsurface parameters, might be well suited for new field developments. However, when a field has been produced for several years, all models have to be conditioned to the available production data in order to obtain meaningful predictions. Data integration and uncertainty assessment of the future performance of the reservoir are indivisible processes that cannot generally be addressed by simple techniques. In this paper we present a method to tackle such complex inverse problems where highly non-linear responses are involved. The goal is to minimize an objective function that stands for the goodness-of-fit of the history match. The key idea is to use high-quality proxies of the objective function to accelerate the search for solutions. An efficient experimental design stage allows for the selection of key parameters, while an optimization routine involving genetic algorithms (GA) is used to determine the best combinations of parameters. The models that reasonably honor the historical data are selected and provide an estimate of future production. The final distribution of the prediction variables defines the range of uncertainty conditioned to the production history. The practicality of the methodology is demonstrated with a study of an offshore field in West Africa that has several years of complex production history.
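The proxy-plus-GA search described above can be sketched as follows. Here `proxy_misfit`, the two normalized parameters and all GA settings are hypothetical; the analytic misfit stands in for a trained proxy of the history-match objective function.

```python
import random

random.seed(42)

# A cheap analytic stand-in for the history-match objective (misfit
# between simulated and observed production); hypothetical, with its
# minimum at perm = 0.7, poro = 0.3 in normalized units.
def proxy_misfit(params):
    perm, poro = params
    return (perm - 0.7) ** 2 + (poro - 0.3) ** 2

def crossover(a, b):
    """Blend crossover: child is a random convex combination of parents."""
    w = random.random()
    return [w * x + (1.0 - w) * y for x, y in zip(a, b)]

def mutate(p, sigma=0.05):
    """Gaussian mutation, clipped to the normalized parameter range."""
    return [min(1.0, max(0.0, x + random.gauss(0.0, sigma))) for x in p]

# Elitist GA: the 10 best parameter combinations survive each generation
# and breed 30 children, so the best misfit can only improve.
pop = [[random.random(), random.random()] for _ in range(40)]
init_best = min(proxy_misfit(p) for p in pop)
for gen in range(60):
    pop.sort(key=proxy_misfit)
    parents = pop[:10]
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(30)]
    pop = parents + children

best = min(pop, key=proxy_misfit)
# Models whose proxy misfit falls below a tolerance are kept as
# candidate history matches and used to bound the forecast uncertainty.
good_matches = [p for p in pop if proxy_misfit(p) < 0.01]
```

In the full workflow, the `good_matches` would be verified with actual simulation runs, and their forecasts would define the production-conditioned uncertainty range.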