The application of a genetic reservoir characterisation concept to the calculation of petrophysical properties requires the prediction of lithofacies followed by the assignment of petrophysical properties, according to the specific lithofacies predicted. Common classification methods which fulfil this task include discriminant analysis and backpropagation neural networks. While discriminant analysis is a well‐established statistical classification method, backpropagation neural networks are relatively new, and their performance in predicting lithofacies, porosity and permeability, when compared to discriminant analysis, has not been widely studied. This work compares the performance of these two methods in prediction of reservoir properties by considering log and core data from a shaly glauconitic reservoir. The neural network approach, while subject to a degree of trial and error as regards the selection of the optimum configuration of middle nodes, is shown to be capable of excellent performance. In the example problem considered, the neural network approach provided estimates superior to those based on a discriminant analysis approach. Further studies, on different formations, will be required to test the generality of this conclusion, and to refine the selection of neural network parameters.
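The comparison the abstract describes can be illustrated with a minimal sketch: a linear discriminant classifier next to a small backpropagation network on synthetic log data. The feature names (GR, RHOB, NPHI), the three-facies structure, and all parameter values are illustrative assumptions, not the paper's actual glauconitic dataset.

```python
# Hedged sketch: discriminant analysis vs. a backpropagation neural network
# for lithofacies classification on synthetic well-log data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Three synthetic lithofacies, each a Gaussian cluster in (GR, RHOB, NPHI) space.
means = np.array([[40.0, 2.65, 0.10],    # clean sand (hypothetical)
                  [80.0, 2.55, 0.20],    # glauconitic sand (hypothetical)
                  [120.0, 2.45, 0.30]])  # shale (hypothetical)
X = np.vstack([rng.normal(m, [10.0, 0.05, 0.03], size=(200, 3)) for m in means])
y = np.repeat([0, 1, 2], 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
# Middle-layer size is chosen by trial and error, as the abstract notes.
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                  random_state=0)).fit(X_tr, y_tr)

print("LDA accuracy:", lda.score(X_te, y_te))
print("MLP accuracy:", mlp.score(X_te, y_te))
```

On cleanly separated clusters like these the two methods perform similarly; the abstract's point is that on real, overlapping facies the network's flexible decision boundaries can pull ahead, at the cost of tuning the hidden-layer configuration.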
These days "estimating uncertainty" is the mantra. As we do this, we ask ourselves which is better: an array of geologically simple, rapidly history-matched models, or a single geologically comprehensive, carefully history-matched model? After all, uncertainty, which is normally characterized by a range of forecasts from techniques such as Experimental Design, is difficult to quantify using just one model, however comprehensive it may be. Yet if forecasts are obtained from a series of simple models, how good are they? Choosing one over the other also has significant implications for the time required for modeling and for reservoir management. Specific questions that directly affect the cost of modeling come to mind: What is the optimal level of geological detail, especially if the uncertainty management plan includes history-matching and simulating a series of models? Can the oil trapped behind the flood front be estimated with a series of simple lateral barriers/baffles (shales), or do we always need an extensive sequence stratigraphic framework? A detailed model may not be as amenable to uncertainty estimation by virtue of its size. Is field-scale history matching adequate, or is well-by-well history matching a must? Perhaps the brute-force approach of probabilistic forward modeling provides the panacea; after all, the proof of the pudding lies only in the model's ability to accurately predict field performance. Finally, we also ask: should the modeling strategy, i.e., comprehensive vs. simple, depend on the response variable of interest, e.g., ultimate recovery factor vs. infill drilling locations? If ultimate recovery is the objective, a simple model may suffice. To answer questions like these, we revisit current reservoir modeling paradigms. As a datum, we use a comprehensively modeled waterflood from Western Africa. This reservoir was modeled using extensive sequence stratigraphic techniques.
The model was scaled up from about 14 million cells to about 280,000 cells using a flow-based scale-up algorithm, carefully preserving all the mappable mudstones above flooding surfaces. History matching for a 30-year period was conducted systematically with a team of field engineers and simulation specialists. The whole process took about a year to complete. Against this datum, we compare a series of rapidly built geological models that still honor all the data and the overall depositional architecture, yet differ significantly from the datum geological model by virtue of the modeling strategies implemented. These strategies vary in complexity from changes in variogram lengths and directions to simple tank models with stochastic sandstones and mudstones conditioned by well data. Various geostatistical algorithms were also investigated for facies modeling and for populating petrophysical properties within facies. The new models were history-matched using the conventional manual method and two separate assisted history-matching methods that use sensitivity coefficients. The question being addressed was: does constraining the geological models to the same dynamic data always create an imprint over the underlying geological variation and result in similar predictions? Preliminary results indicate that the history-matching overprint tends to mask some of the dramatic geological variations. This can have significant ramifications for modeling strategies, especially when assessing uncertainty in the presence of a substantial production history.

Introduction

Reservoir characterization and modeling have evolved significantly over the past decade or so, with an ever-growing emphasis on the "shared earth model" concept. This shared earth model needs to honor all log and seismic data while maintaining a consistent picture of the depositional and diagenetic character of the reservoir.
On the other hand, where flow is of paramount importance, identification of appropriate flow paths, which may or may not depend purely on the geological character, takes precedence. It is obvious that these two cannot be mutually exclusive, and yet, despite advances in reservoir characterization and geostatistics, identification of unique flow paths remains elusive.
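The Experimental Design approach invoked above characterizes uncertainty as a range of forecasts across a designed set of runs. A minimal sketch of the idea follows, using a two-level full-factorial design over three hypothetical geological factors and a toy linear proxy in place of a real flow simulator; the factor names, ranges, and proxy coefficients are all assumptions for illustration.

```python
# Hedged sketch: two-level full-factorial Experimental Design over three
# illustrative geological uncertainty factors.
import itertools

# Low/high levels for each uncertainty factor (hypothetical ranges).
factors = {
    "variogram_range_m": (500.0, 3000.0),
    "shale_fraction":    (0.10, 0.40),
    "kv_kh_ratio":       (0.01, 0.10),
}

def proxy_recovery(variogram_range_m, shale_fraction, kv_kh_ratio):
    """Toy linear proxy for ultimate recovery factor (illustrative only):
    each factor is centered and scaled to [-1, 1], then weighted."""
    return (0.35
            + 0.02 * (variogram_range_m - 1750.0) / 1250.0
            - 0.05 * (shale_fraction - 0.25) / 0.15
            + 0.03 * (kv_kh_ratio - 0.055) / 0.045)

# Enumerate all 2^3 corner cases and collect the forecast spread.
runs = list(itertools.product(*factors.values()))
forecasts = [proxy_recovery(*run) for run in runs]
print(f"{len(runs)} runs, recovery factor range: "
      f"{min(forecasts):.3f} - {max(forecasts):.3f}")
```

In practice each "run" would be a history-matched simulation model rather than a one-line proxy, which is exactly why the abstract asks how much geological detail each run can afford to carry.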
With the growth of pipeline networks, corrosion leakage accidents happen frequently, so nondestructive testing technology is important for ensuring the safe operation of pipelines and energy extraction. In this paper, the structure and principle of a magnetic flux leakage (MFL) in-line inspection system are introduced first. A mathematical model of the system is then built according to the Ampère circuit rule, the flux continuity theorem, and a cylindrical coordinate transform, and the magnetic flux density at every point in space is calculated using finite element analysis. We then analyze and design the arrangement of measurement-section probes and sensors, combining three-axis MFL in-line inspection with multi-sensor fusion technology. Its advantage is that the three-axis changes in the magnetic flux leakage field are measured by multiple probes at the same time, so various defects can be characterized accurately. Finally, finite element analysis is used to build a simulation model, and the relationship between defects and MFL inspection signals is studied. Simulation and experimental results verify that the method not only enhances the ability to detect different types of defects but also improves the precision and reliability of the inspection system.
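Before committing to a full finite element model, the leakage field over a defect is often approximated analytically. A minimal sketch using the classical 2-D magnetic dipole model of a rectangular surface defect (two oppositely charged faces) is shown below; the defect geometry, lift-off, and charge density are illustrative assumptions, not parameters of the paper's inspection system.

```python
# Hedged sketch: classical 2-D magnetic dipole approximation of the MFL
# field over a rectangular surface defect, a common analytic first check.
import math

def mfl_field(x, y, half_width_b, depth_h, sigma=1.0):
    """Axial (Hx) and radial (Hy) leakage components at axial offset x
    and lift-off y above a defect of half-width b and depth h."""
    xp, xm = x + half_width_b, x - half_width_b
    hx = (sigma / (2 * math.pi)) * (
        math.atan2(depth_h * xp, xp**2 + y * (y + depth_h))
        - math.atan2(depth_h * xm, xm**2 + y * (y + depth_h)))
    hy = (sigma / (4 * math.pi)) * math.log(
        ((xp**2 + (y + depth_h)**2) * (xm**2 + y**2))
        / ((xp**2 + y**2) * (xm**2 + (y + depth_h)**2)))
    return hx, hy

# Scan the sensor along the pipe axis at fixed lift-off (units arbitrary).
b, h, lift = 1.0, 2.0, 0.5   # defect half-width, depth, sensor lift-off
for x in (-2.0, 0.0, 2.0):
    hx, hy = mfl_field(x, lift, b, h)
    print(f"x={x:+.1f}  Hx={hx:+.4f}  Hy={hy:+.4f}")
```

The model reproduces the familiar MFL signatures: the axial component Hx is an even function of x with its peak over the defect center, while the radial component Hy is odd and crosses zero there, which is the basis for locating defects from three-axis probe signals.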
Integration of lithofacies, inferred depositional environments and organic petrology of a lacustrine delta sequence in the Permo‐Triassic Gunnedah Basin of eastern Australia has provided a logical explanation for the patterns of distribution of the dispersed organic matter. The data show that: (i) sedimentary processes, such as size sorting and transportation, control the association of maceral assemblages in the dispersed organic matter with depositional environments; (ii) the extent of oxidation of organic matter increases downstream, from flood‐plain lake to interdistributary bay, delta front, prodelta and off‐shore lake deposits; (iii) coarse, woody material, mostly oxidised to semifusinite, is concentrated in the delta‐front environment; (iv) leaf material is transported preferentially to the prodelta, via the delta front. On the basis of organic matter type, the prodelta was probably the most favourable environment in this depositional system for the accumulation of oil‐prone source beds.
About fifty reservoir characterizations were built using seven different modeling methods and by varying a combination of reservoir characterization parameters using Experimental Design. The methods include: (1) simple sequential Gaussian simulation (SGS); (2) SGS with stratification; (3) SGS with a reservoir quality map trend and no stratification; (4) SGS with a reservoir quality map trend and stratification; (5) sequential screening simulation with a 3D trend; (6) truncated Gaussian simulation with trend; and (7) a general marked point process. These fifty models were all subjected to fluid flow simulation with an active aquifer and limited peripheral waterflooding. The results show that oil recovery at 95% water cut ranges from 29% to 34% in these fifty models despite the large apparent visual differences between them. However, there are large differences, some greater than 200%, in water breakthrough time, water cut and cumulative water production. Based on fluid flow performance, the fifty models fall into three groups. Group 1 includes models constructed using short variogram ranges in Methods 1 to 4, or medium variogram ranges without stratification in Methods 1 and 3. They behaved more homogeneously and produced flow characterized by later water breakthrough and lower water cut. Group 2 comprises models constructed using long variogram ranges in Methods 1 to 4, or medium variogram ranges with stratification in Methods 2 and 4. They behaved more heterogeneously and produced early water breakthrough and high overall water cuts. Even though the mean permeability in the oil zone was similar for the two groups of models, the continuity imposed by the variogram and stratification controlled this disparate behavior. Models of Methods 5 to 7 comprise Group 3, which produced the earliest water breakthrough and the highest overall water cuts.
We interpret the difference in water breakthrough time and water cut between Groups 2 and 3 to be a result of the higher average permeability (by 100–200 mD) in the oil zone of the Group 3 models. Do more complex, geologically realistic models provide the best frameworks for predicting flow performance in sandy shoreface reservoirs with good well control? Some of the simplest models we constructed provided flow simulation results similar to those of the most complex, soft-data-conditioned models. What we do observe, however, is that models built with more soft conditioning data and more complex geologic modeling tend to give a restricted range of full-field flow performance.

Introduction

Investment decisions for field development are commonly based on future field performance predicted from static and dynamic reservoir simulations; the investment decision therefore relies on the predictive capability of the reservoir models. The challenges for reservoir geology are to identify which types of reservoir heterogeneity are most relevant to fluid flow, so that the right types and amounts of data can be acquired, and to build robust reservoir models from limited subsurface information, typically in a short time frame. Overall in the oil industry, and as described below, there is a feeling that more soft conditioning data and more complex modeling procedures produce reservoir characterization models that provide better frameworks for predicting flow performance.
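The variogram-range effect that separates Group 1 from Group 2 can be seen in isolation with a small sketch: two unconditional 1-D Gaussian fields with an exponential covariance, drawn via Cholesky factorization. This is a simplified stand-in for full sequential Gaussian simulation, and all grid and range values are illustrative.

```python
# Hedged sketch: variogram range controls spatial continuity in a Gaussian
# field -- the mechanism behind the Group 1 vs Group 2 flow behavior.
import numpy as np

def gaussian_field(n_cells, cell_size, vario_range, seed=0):
    """One realization of a standard Gaussian field with an exponential
    covariance exp(-3|h|/range), drawn via Cholesky factorization."""
    x = np.arange(n_cells) * cell_size
    cov = np.exp(-3.0 * np.abs(x[:, None] - x[None, :]) / vario_range)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_cells))  # jitter for stability
    return L @ np.random.default_rng(seed).standard_normal(n_cells)

short = gaussian_field(200, 10.0, vario_range=50.0)    # short range: patchy
long_ = gaussian_field(200, 10.0, vario_range=1000.0)  # long range: continuous

def lag1(z):
    """Lag-1 correlation as a crude continuity measure."""
    return float(np.corrcoef(z[:-1], z[1:])[0, 1])

print("lag-1 correlation, short range:", round(lag1(short), 3))
print("lag-1 correlation, long range: ", round(lag1(long_), 3))
```

Long-range realizations are far more correlated cell to cell; mapped onto permeability, that continuity creates the connected high-permeability pathways that drive the early water breakthrough seen in the Group 2 models.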