Subsurface geology is highly uncertain, and it is necessary to account for this uncertainty when optimizing the location of new wells. This can be accomplished by evaluating reservoir performance for a particular well configuration over multiple realizations of the reservoir and then optimizing based, for example, on expected net present value (NPV) or expected cumulative oil production. A direct procedure for such an optimization would entail the simulation of all realizations at each iteration of the optimization algorithm. This could be prohibitively expensive when a large number of realizations is needed to capture geological uncertainty. In this work, we apply a procedure that is new within the context of reservoir management, retrospective optimization (RO), to address this problem. RO solves a sequence of optimization subproblems that contain increasing numbers of realizations. We introduce the use of k-means clustering for selecting these realizations. Three example cases are presented that demonstrate the performance of the RO procedure. These examples use particle swarm optimization (PSO) and simplex linear interpolation (SLI)-based line search as the core optimizers (the RO framework can be used with any underlying optimization algorithm, either stochastic or deterministic). In the first example, we achieve essentially the same optimum using RO as we do using a direct optimization approach, but RO requires an order of magnitude fewer simulations. The results demonstrate the advantages of cluster-based sampling over random sampling for the examples considered. Taken together, our findings indicate that RO using cluster sampling represents a promising approach for optimizing well locations under geological uncertainty.
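The cluster-based selection step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: each realization is assumed to be summarized by a small feature vector (e.g., permeability statistics), and the realization nearest each k-means centroid is taken as that cluster's representative. All data and names here are hypothetical.

```python
import numpy as np

def kmeans_select(features, k, n_iter=50, seed=0):
    """Cluster realizations with k-means and pick, for each cluster,
    the realization closest to the centroid as its representative."""
    rng = np.random.default_rng(seed)
    x = np.asarray(features, dtype=float)
    # Initialize centroids from k distinct realizations.
    centroids = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign every realization to its nearest centroid.
        d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids as cluster means (keep the old one if empty).
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = x[labels == j].mean(axis=0)
    # Representative = realization nearest each final centroid.
    d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return sorted(set(int(i) for i in d.argmin(axis=0)))

# 100 synthetic realizations summarized by 3 features; pick 5 representatives.
rng = np.random.default_rng(1)
reps = kmeans_select(rng.normal(size=(100, 3)), k=5)
print(reps)
```

The selected indices would then be the realizations simulated in the early, cheap subproblems of the RO sequence.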
This study presents a method based on the Gauss-Newton optimization technique for continuous reservoir model updating with respect to production history and time-lapse seismic data in the form of zero-offset amplitudes and amplitude-versus-offset (AVO) gradients. The main objective of the study is to test the feasibility of using these integrated data as input to reservoir parameter estimation problems. Using only production data or zero-offset time-lapse seismic amplitudes as observation data in the parameter estimation process cannot properly limit the solution space. The emphasis of this work is to use the integrated data, combined with empirical knowledge about rock types from laboratory measurements, to further constrain the inversion process. The algorithm written for this study consists of three parts: the reservoir simulator, the rock physics petro-elastic model, and the optimization algorithm. The Gauss-Newton inversion is tested on a 2D semi-synthetic model inspired by real field data from offshore Norway. The algorithm reduces the misfit between the observed and simulated data, which makes it possible to estimate porosity and permeability distributions. Gauss-Newton is an efficient parameter-estimation technique; however, the numerical estimation of the gradient is time consuming and can be prohibitive for practical applications. The method is well suited to distributed computing, which considerably reduces the total optimization time; the amount of reduction depends mainly on the number of available processors.
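The core Gauss-Newton update, with the numerically estimated gradient the abstract identifies as the expensive step, can be sketched on a toy problem. The forward model below is a stand-in exponential, not a reservoir simulator; the finite-difference Jacobian requires one extra forward run per parameter, and it is these independent runs that can be distributed across processors.

```python
import numpy as np

def gauss_newton(forward, m0, d_obs, n_iter=30, eps=1e-6):
    """Iteratively update parameters m to fit observed data d_obs."""
    m = np.asarray(m0, dtype=float)
    for _ in range(n_iter):
        r = forward(m) - d_obs  # residual between simulated and observed data
        # Finite-difference Jacobian: one forward run per parameter.
        # These runs are independent, hence trivially parallelizable.
        J = np.column_stack([
            (forward(m + eps * np.eye(len(m))[i]) - forward(m)) / eps
            for i in range(len(m))
        ])
        # Gauss-Newton step: solve the normal equations J^T J dm = -J^T r.
        dm = np.linalg.solve(J.T @ J, -J.T @ r)
        m = m + dm
    return m

# Toy example: recover the parameters of y = a * exp(b * t) from clean data.
t = np.linspace(0.0, 1.0, 20)
true_m = np.array([2.0, -1.5])
forward = lambda m: m[0] * np.exp(m[1] * t)
m_est = gauss_newton(forward, [1.0, -0.5], forward(true_m))
print(m_est)
```

In the study's setting, `forward` would be the chain of reservoir simulator and petro-elastic model, and `m` the porosity/permeability field, typically with regularization added to the normal equations.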
Decision making under uncertainty can be quite challenging, especially when complex numerical simulations are considered in the workflow and the decision has to be made relatively fast (e.g., in hours). This is the case when one needs to rank a given field portfolio within a limited budget and with acquisition constraints. If the ranking measure associated with each field is properly and rapidly evaluated, new prospect opportunities, which may lead to a favorable strategic position, can be readily identified. In this paper, we propose an efficient methodology for computing a "production-potential" measure that can be used to rank greenfield portfolios in the presence of geological uncertainty, quantifying both uncertainty and risk propagation. Next, we briefly describe the basics of the proposed method. First, uncertainty in sedimentary variability and flow behavior has to be characterized by a number of representative geological realizations. Sampling techniques are used to significantly reduce the number of realizations while preserving accuracy in the description and uncertainty propagation. Thereafter, multiple and varied field-development plans, based on primary/secondary-recovery mechanisms, are automatically generated while accounting for key parameters related to the number, drilling locations, and drilling sequence of wells. In these plans the reservoir is clustered by areas with similar production/injection potential, and the well locations and drilling schedules are obtained accordingly. The well controls are determined through estimations of the field-recovery factor. By means of experimental-design techniques a relatively small number of field-development plans are selected to capture the most significant production profiles. Each of these development plans is simulated for the realizations sampled previously, and the production-potential measure [e.g., average net present value (NPV) over all sampled realizations] is computed for all the plans.
The highest of these measures (i.e., the best development plan) can be used for ranking the greenfield in the portfolio. Response-surface procedures are considered to perform additional analysis computations within iterative optimization procedures. It is important to note that other statistics related to the exploitation potential (e.g., standard deviation of the NPV) can also be used to complement the ranking, thereby accommodating the decision makers' risk tolerance. The methodology has been tested on the Brugge Field benchmark, which presents 104 realizations of multiple geological parameters. The benchmark has been modified to simulate a greenfield scenario. The ranking measure is the (discounted) NPV averaged over the 104 realizations. The proposed workflow yields a ranking measure of USD 5.43 billion, and the computational cost is approximately 1,900 simulations (performed in a parallel-computing environment). This NPV is somewhat higher than those found for the Brugge benchmark (with similar modified settings) by other researchers. To validate the result...
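The ranking measure described above, a discounted NPV averaged over sampled realizations and maximized over candidate development plans, reduces to a simple computation once the per-realization cash flows are available. The sketch below uses invented cash flows and a hypothetical 10% discount rate purely for illustration.

```python
def discounted_npv(cash_flows, rate=0.10):
    """Discount a list of yearly cash flows (year 0 first) at `rate`."""
    return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

def ranking_measure(plans_cash_flows, rate=0.10):
    """plans_cash_flows[plan][realization] -> yearly cash flows.
    Returns (index of best plan, its realization-averaged NPV)."""
    averages = []
    for flows_per_realization in plans_cash_flows:
        npvs = [discounted_npv(cf, rate) for cf in flows_per_realization]
        averages.append(sum(npvs) / len(npvs))
    best = max(range(len(averages)), key=averages.__getitem__)
    return best, averages[best]

# Two toy development plans, each evaluated over three realizations ($MM).
plans = [
    [[-100, 60, 50, 40], [-100, 55, 45, 35], [-100, 70, 60, 50]],
    [[-150, 80, 60, 40], [-150, 60, 50, 30], [-150, 90, 70, 50]],
]
best, value = ranking_measure(plans)
print(best, round(value, 2))
```

In the paper's workflow the cash flows per plan and realization come from the reservoir simulations selected by experimental design, and the best plan's average NPV is the greenfield's ranking measure.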
In this paper a decision-making approach that can be applied to problems relevant to the oil and gas industry is presented. The methodology is supported by state-of-the-art mathematical optimization algorithms, and is based on the formal integration of the decisions in question with well-studied optimization procedures. The integration of the methodology with the application adds to its robustness. Two different types of problems are formulated and solved. The first kind is based on deciding which wells have to be shut in during a given production interval while simultaneously optimizing the controls for each selected well. The second category involves deciding, for a group of wells, which ones have to be injectors or producers, while at the same time searching for optimal well locations. In all the results obtained, the set of decisions proposed by the integrated approach systematically yields substantial improvements in field production. For example, in the first class of problems studied, the oil production target is satisfied, and up to 50 percent of produced water is saved with respect to the reference case. The huge amount of information available, for example, in Intelligent/Smart Fields or Closed-Loop Reservoir Management can be utilized for rigorously making solid decisions. In this work we put an emphasis on the integration of real-life decisions with a realistic simulation-based mathematical optimization framework. This framework can also be useful for establishing a common language between decision makers and researchers within a given organization, and as a consequence endowing the decision-making process with agility and robustness. It should be stressed that ultimately it is human interpretation and intuition that drives the making of crucial decisions. Automated tools should be understood as an additional (and hopefully valuable) source of information for making these important decisions.
A control system that distributes fluids from an injection well to a production well at an adjustable rate has attracted considerable interest in recent years. Several optimization algorithms have been developed for such systems, and these techniques have proved beneficial in reservoir development. In this study, we propose a discrete optimization approach to increase oil production from thin oil-rim reservoirs using smart horizontal wells under waterflooding. The smart well is equipped with several on-off control valves that can be adjusted to reach optimum oil production by maximizing sweep efficiency and delaying water breakthrough. This leads to an optimization problem over discrete choices, with the valve settings as decision variables. We apply Binary Integer Programming (BIP) to decide the valve settings in the injection and production wells. BIP is a class of linear programming in which the variables are required to be 0 or 1; these variables correspond to the on-off control valves. Preliminary results obtained with this methodology show a significant improvement in the oil recovery factor, and the water saturation at breakthrough is observed to be more uniformly distributed across the reservoir when compared with the reference non-optimized case. In the second stage of this study we add uncertainty in the geological description of the reservoir (permeability distribution) and perform robust optimization. To this end, we consider statistics of the Net Present Value (NPV) in the optimization objective function: we maximize an average of the NPV and control the risk attitude by means of a penalty term that involves its standard deviation.

Introduction

Oil was formed by geological processes millions of years ago and is typically found in underground reservoirs of dramatically different sizes, at varying depths, and with widely varying characteristics.
The largest oil reservoirs are called "Super Giants", many of which were discovered in the Middle East. Because of their size and other characteristics, Super Giant reservoirs are generally the easiest to find, the most economic to develop, and the longest lived. The last Super Giant oil reservoirs discovered worldwide were found in 1967 and 1968. Nowadays, large oil fields are already at a mature stage, and the number of new significant oil fields found per year decreases gradually. Smaller fields are still regularly found, but at the current oil price it is often not economical to exploit them. As a direct result, it becomes more and more difficult to maintain economic reserves at a desirable level. In the past, a variety of secondary oil recovery methods have been developed and applied to mature and depleted oil reservoirs. These methods help to improve oil recovery compared to primary depletion. The oldest secondary recovery method is waterflooding, since water is usually readily available and inexpensive. Fundamentally, waterflooding involves pumping water through a well (injector) into the reservoir. The water is forced through the pore spaces and sweeps the oil towards the producing wells (producers). It is becoming increasingly necessary to produce these fields as efficiently as possible in order to meet the global increase in demand for oil and gas. Production optimization problems involving reservoir modeling over time were first addressed by Lee and Aronofsky [Lee 1958]. The purpose of their study was to apply a linear programming procedure to oil production scheduling problems. Jansen [Jansen 02] found that for the Smart Stinger Completion (SSC) in thin oil rims, the optimum valve settings changed over time, due to a drop in reservoir pressure caused by production. The SSC was effective both in delaying water and gas breakthrough and in coning control for the post-water-breakthrough stage.
For optimal design of the SSC, reasonable knowledge of the permeability distribution along the well is required. Brouwer and Jansen [Brouwer 2004] studied the optimization of waterflooding with fully penetrating, smart horizontal wells in two-dimensional reservoirs with simple, large-scale heterogeneities. They used optimal control theory as an optimization algorithm for valve settings in smart wells.
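The two optimization stages described in the abstract above, a binary search over on-off valve settings combined with a robust mean-minus-penalized-standard-deviation NPV objective over permeability realizations, can be sketched as follows. The "simulator" here is a stand-in linear proxy with invented coefficients, and small valve counts are enumerated exhaustively; a real study would couple a reservoir simulator to an integer-programming solver.

```python
import itertools
import statistics

N_VALVES = 6
LAM = 0.5  # risk-aversion weight on the NPV standard deviation (assumed)

# Toy per-realization NPV contribution of each open valve:
# rows = permeability realizations, columns = valves. Negative entries
# mimic valves that accelerate water breakthrough in that realization.
CONTRIB = [
    [3.0, 1.5, -1.0, 2.0, 0.5, -0.5],
    [2.5, -0.5, 1.0, 1.5, 1.0, 0.0],
    [3.5, 1.0, 0.5, -2.0, 0.5, 1.0],
]

def npv(settings, realization):
    """Linear NPV proxy for one realization given binary valve settings."""
    return sum(c * s for c, s in zip(realization, settings))

def robust_objective(settings):
    """Mean NPV over realizations, penalized by its standard deviation."""
    npvs = [npv(settings, r) for r in CONTRIB]
    return statistics.mean(npvs) - LAM * statistics.pstdev(npvs)

# Enumerate all 2^6 on-off combinations (feasible only for few valves).
best = max(itertools.product((0, 1), repeat=N_VALVES), key=robust_objective)
print(best, round(robust_objective(best), 3))
```

Setting `LAM = 0` recovers the risk-neutral average-NPV objective; larger values trade expected value for lower spread across the permeability realizations.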