Distinguished Author Series articles are general, descriptive representations that summarize the state of the art in an area of technology by describing recent developments for readers who are not specialists in the topics discussed. Written by individuals recognized to be experts in the area, these articles provide key references to more definitive work and present specific details only to illustrate the technology. Purpose: to inform the general readership of recent advances in various areas of petroleum engineering. Summary Parts 1 and 2 of this series of articles presented a general overview of artificial neural networks and evolutionary computing, respectively, and their applications in the oil and gas industry.1,2 The focus of this article is fuzzy logic. The article provides an overview of the subject and its potential application in solving petroleum-engineering-related problems. As the previous articles mentioned, the most successful applications of intelligent systems, especially when solving engineering problems, have been achieved by use of different intelligent tools in concert and as a hybrid system. This article reviews the application of fuzzy logic for restimulation-candidate selection in a tight-gas formation in the Rocky Mountains. We chose this particular application because it uses fuzzy logic in a hybrid manner integrated with neural networks and genetic algorithms. Background The science of today is based on Aristotle's crisp logic formed more than 2,000 years ago. Aristotelian logic looks at the world in a bivalent manner, such as black and white, yes and no, and 0 and 1. The set theory developed in the late 19th Century by German mathematician Cantor was based on Aristotle's bivalent logic and made this logic accessible to modern science. Subsequent superimposition of probability theory made the bivalent logic reasonable and workable. Cantor's theory defines sets as collections of definite, distinguishable objects. Fig. 
1 is a simple example of Cantor's set theory and its most common operations, such as complement, intersection, and union. The first work on vagueness dates back to the first decade of the 20th Century, when American philosopher Peirce noted that "vagueness is no more to be done away with in the world of logic than friction in mechanics."3 In the early 1920's, Polish mathematician and logician Lukasiewicz4 developed three-valued logic and talked about many-valued, or multivalued, logic. In 1937, quantum philosopher Black5 published a paper on vague sets. These scientists built the foundation on which fuzzy logic was later developed. Zadeh,6 known as the father of fuzzy logic, published his landmark paper "Fuzzy Sets" in 1965. He developed many key concepts, including membership values, and provided a comprehensive framework to apply the theory to engineering and scientific problems. This framework included the classical operations for fuzzy sets, which comprise all the mathematical tools necessary to apply fuzzy-set theory to real-world problems. Zadeh was the first to use the term "fuzzy," which provoked much opposition. A tireless spokesperson for the field, he was often harshly criticized. At a 1972 conference, Kalman stated that "Fuzzification is a kind of scientific permissiveness; it tends to result in socially appealing slogans unaccompanied by the discipline of hard scientific work."7 (Note that Kalman is a former student of Zadeh's and the inventor of the famous Kalman filter, a major statistical tool in electrical engineering. The Kalman filter is the technology behind the Patriot missiles used in the Gulf War. Claims have been made that it has been proved that use of fuzzy logic can significantly increase the accuracy of these missiles.8,9) Despite all its adversaries, fuzzy logic has continued to flourish and has become a major force behind many advances in intelligent systems. 
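To make Zadeh's classical fuzzy-set operations concrete, the sketch below shows a triangular membership function and the standard complement, intersection (min), and union (max) operators. The membership functions, the temperature example, and all numeric values are illustrative assumptions, not taken from the article.

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 at a and c, peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical example: degree to which a reservoir temperature of
# 165 degrees F belongs to the fuzzy sets "warm" and "hot".
warm = triangular(165, 100, 150, 200)   # membership in "warm"
hot = triangular(165, 150, 200, 250)    # membership in "hot"

# Zadeh's classical operations on fuzzy sets:
complement_warm = 1.0 - warm            # NOT warm
intersection = min(warm, hot)           # warm AND hot
union = max(warm, hot)                  # warm OR hot
```

Unlike Cantor's crisp sets, where an element is either in a set or not, here the same temperature belongs partially to both "warm" and "hot" at the same time, which is exactly the coexistence of A and Not-A discussed below.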
The word "fuzzy" carries a negative connotation in Western culture, and "fuzzy logic" seems to misdirect attention and to celebrate mental fog.10 On the other hand, Eastern culture embraces the concept of the coexistence of contradictions, as it appears in the yin/yang symbol (Fig. 2). While Aristotelian logic preaches A or Not-A, Buddhism is all about A and Not-A. Many believe that the tolerance of Eastern culture for such ideas is the main reason behind the success of fuzzy logic in Japan. While fuzzy logic was being attacked in the U.S., Japanese industries were busy building a multibillion-dollar industry around it. Today, the Japanese hold more than 2,000 fuzzy-related patents. They have used fuzzy technology to build intelligent household appliances, such as washing machines and vacuum cleaners (Matsushita and Hitachi), rice cookers (Matsushita and Sanyo), air conditioners (Mitsubishi), and microwave ovens (Sharp, Sanyo, and Toshiba), to name a few. Matsushita used fuzzy technology to develop its digital image stabilizer for camcorders. Adaptive fuzzy systems (a hybrid with neural networks) can be found in many Japanese cars. Nissan patented a fuzzy automatic transmission that is now very popular with many other manufacturers, such as Mitsubishi and Honda.10
Today's drilling applications require proper identification of operations where a cost reduction is possible. Many indicators are present when one tries to optimize drilling operations, such as casing size and mud properties. On the other hand, the selection of the optimum bit requires information from a variety of sources. The parameters affecting bit performance are complex, and their relationships are not easily recognized. The general trend is to evaluate the performance of the bit from an offset well. A new methodology was developed to model the rate of penetration and bit wear under various formation types and operating parameters. This method introduces a new approach with improved bit-wear prediction. A simulator was used to generate drilling data to eliminate errors inherent in field measurements. The generated data were used to establish the relationships among complex parameters such as weight on bit, rotary speed, pump rates, formation hardness, and bit type. The method was tested using data from runs conducted with a rig floor simulator. The validity of the proposed method was also demonstrated with data from an existing field. Introduction The success, and hence the economics, of a drilling operation depends on the condition of the bit. With bits performing at high penetration rates, well drilling costs can be lowered. Thus, the selection of a proper bit type and operating parameters are important challenges one faces during drilling operations. Work performed by several investigators has shown that many bit and fluid components affect penetration rates. Different methods can be utilized in the optimization of drilling. Researchers have proposed the use of empirical correlations and predictive techniques. In these approaches, either laboratory data were used to derive the empirical correlations or offset well data were used to fine-tune the predictive method. Neural Networks. 
Neural networks have been successfully used in different fields because of their capability to identify complex relationships when sufficient data exist. Recently, they have been successfully applied to different areas of petroleum engineering, such as multiphase pipe flow, reservoir characterization, production, and drill bit diagnostics. The neural network developed to diagnose the drill bit used six input parameters: lithology (or formation type), torque, rate of penetration, weight on bit, rotational speed, and hydraulic horsepower per square inch of nozzle. The network was trained to predict the bit wear as output. The use of formation type or lithology introduces errors for conditions where the predicted formation types and depths differ from those actually encountered. Although the drill bit diagnosis network was successful, it was based on laboratory data and did not cover all formation hardness and bit grade levels, thus limiting its applicability. In this study, we introduce a new approach to predict a drilling parameter, such as the rate of penetration, by designing a new neural network. Approach A new methodology is introduced to predict ROP values during drilling. This approach uses measured data to determine the relationship between several parameters, such as bit type, weight on bit, depth, and rotary speed, recorded during drilling operations. Two different data sets were used in this study. The first data set consisted of approximately 8,000 measurements taken at selected wellbore conditions. The rig floor simulator available in the departmental facilities was employed for this purpose. The use of simulated data provided additional insight into parameters such as formation abrasiveness, bit tooth wear, and bit bearing wear as a function of drilling time, which are commonly not possible to measure in the field. The second data set consisted of approximately 500 measurements from several wells in the United States. Simulated Data. 
Runs were conducted using a rig floor simulator, and data were continuously recorded until the bit failed, either because of bearing wear or tooth wear. The data set contained approximately 8,000 measurements taken at predesigned wellbore conditions. The simulated data were chosen in this study to eliminate errors inherent in data acquired in the field. Table 1 shows the recorded data types and their ranges.
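The six-input, one-output network structure described above can be sketched as a single forward pass through a small feed-forward network. This is a minimal illustration of the architecture only: the layer sizes, weight values, and normalized input values are hypothetical, not the trained network from the study.

```python
import math

def forward(inputs, w_hidden, w_out):
    """One forward pass: six drilling inputs -> hidden layer -> bit-wear estimate."""
    # Hidden layer with tanh activations.
    hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs)))
              for row in w_hidden]
    # Sigmoid output keeps the wear estimate in (0, 1), i.e. 0-100% worn.
    z = sum(w * h for w, h in zip(w_out, hidden))
    return 1.0 / (1.0 + math.exp(-z))

# Inputs (hypothetically normalized to 0-1): formation type, torque,
# rate of penetration, weight on bit, rotational speed, and hydraulic
# horsepower per square inch of nozzle.
x = [0.4, 0.6, 0.3, 0.7, 0.5, 0.2]

# Hypothetical weights: three hidden units, each with six input weights.
w_hidden = [[0.5, -0.2, 0.1, 0.4, -0.3, 0.2],
            [-0.1, 0.3, 0.6, -0.4, 0.2, 0.1],
            [0.2, 0.2, -0.5, 0.3, 0.4, -0.2]]
w_out = [0.8, -0.6, 0.4]

wear = forward(x, w_hidden, w_out)
```

In the actual study the weights would be fitted to the roughly 8,000 simulator measurements rather than chosen by hand; the sketch only shows how the six recorded quantities map to a single wear prediction.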
Reservoir simulation models are the major tools for studying fluid flow behavior in hydrocarbon reservoirs. These models are constructed from geological models, which are developed by integrating data from geology, geophysics, and petrophysics. As the complexity of a reservoir simulation model increases, so does the computation time. Therefore, any comprehensive study that involves thousands of simulation runs requires a very long period of time. Several efforts have been made to develop proxy models that can be used as substitutes for complex reservoir simulation models. These proxy models aim at generating the outputs of the numerical fluid-flow models in a very short period of time. This research is focused on developing a proxy fluid-flow model using artificial intelligence and machine-learning techniques. In this work, the proxy model is developed for a real-case CO2 sequestration project in which the objective is to evaluate the dynamic reservoir parameters (pressure, saturation, and CO2 mole fraction) under various CO2 injection scenarios. The data-driven model is able to generate pressure, saturation, and CO2 mole fraction throughout the reservoir with significantly less computational effort and in a considerably shorter period of time than the numerical reservoir simulation model.
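The proxy-model idea above can be sketched in a few lines: sample a slow simulator at a set of scenarios once, then answer new queries from the stored samples instead of running the simulator again. The "simulator" here is a hypothetical stand-in function with made-up pressure behavior, and the nearest-neighbour surrogate is a deliberately crude illustration of the concept, not the machine-learning model developed in the research.

```python
import random

def slow_simulator(injection_rate, time):
    """Hypothetical stand-in for a numerical reservoir run:
    pressure (psi) grows with injection rate and elapsed time."""
    return 2000.0 + 12.0 * injection_rate + 3.5 * time

# Offline phase: sample the "simulator" at 200 injection scenarios.
random.seed(0)
samples = [(random.uniform(0.0, 100.0), random.uniform(0.0, 50.0))
           for _ in range(200)]
targets = [slow_simulator(q, t) for q, t in samples]

def proxy_predict(q, t):
    """Online phase: nearest-neighbour surrogate -- return the output
    of the closest previously simulated scenario, with no new run."""
    best = min(range(len(samples)),
               key=lambda i: (samples[i][0] - q) ** 2
                           + (samples[i][1] - t) ** 2)
    return targets[best]
```

A trained neural network would replace `proxy_predict` in practice, but the economics are the same: the expensive simulations are paid for once, and every subsequent scenario evaluation is nearly free.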
In this paper, a new series of type curves that can be used in the characterization of coalbed reservoirs is presented. The proposed type curves take into consideration the presence of water in the coal seam and its coproduction with gas. The developed type curves are capable of handling the nonlinear desorption phenomenon, which plays a major role in the flow dynamics of methane in coal seams. A previously developed numerical model has been instrumental in the construction of the new type curves. During the development phase, the gas and water equations were collapsed into a single expression. This consolidated transport expression was then put into a dimensionless form, and the necessary dimensionless groups were identified. The terms that relate to the desorption process were included within the total system compressibility. This was made possible through a newly developed sorption functional group. The proposed type curves assume radial flow geometry around a degasification well that fully penetrates the formation. These type curves are generated for constant-pressure specifications at the inner boundary. The coal seam is initially assumed to be in a saturated state. The proposed type curves were tested extensively using a wide range of coal properties, including desorption characteristics, and were identified with unique characteristics. The uniqueness of the type curves is preserved throughout the different flow regimes. The proposed type-curve solution for coalbed reservoirs undergoing two-phase flow can be instrumental in analyzing constant-pressure drawdown test data from a degasification well.
Abstract In 1996, the Gas Research Institute (GRI) performed a scoping study to investigate the potential for natural gas production enhancement via restimulation in the United States (lower-48 onshore). The results indicated that the potential was substantial (over a Tcf in five years), particularly in tight sand formations of the Rocky Mountain, Mid-Continent, and South Texas regions. However, it was also determined that industry's current experience with restimulation is mixed and that considerable effort is required in candidate selection, problem diagnosis, and treatment selection/design/implementation for a restimulation program to be successful. Given a general lack of both specialized (restimulation) technology and "spare" engineering manpower to focus on restimulation, GRI initiated a subsequent R&D project in 1998 with several objectives. Those objectives are to 1) develop efficient, cost-effective, reliable methodologies to identify wells with high restimulation potential, 2) identify and classify various mechanisms leading to well underperformance, 3) develop and test non-fracturing restimulation techniques tailored to selected causes of well underperformance, and 4) demonstrate that, with improved technologies in these key areas, restimulation is a viable and attractive approach to improve well recoveries and economics. The approach adopted for the R&D program is a combination of candidate selection methodology development, conceptual well underperformance/problem classification, laboratory studies, and actual field experiments and demonstrations of restimulation treatments. At this time, a multi-process candidate selection methodology has been developed, consisting of production comparisons, engineering-based performance assessments, and pattern recognition technology. 
Also incorporated into the overall methodology are individual well reviews, economic analysis, and a new short-term field test for candidate verification. Laboratory studies have also identified new procedures for effective clean-up of unbroken gel in propped and natural fractures. In total, twenty actual restimulation treatments are planned at four separate test sites. Currently active sites are in the Rocky Mountain and Mid-Continent regions. One site is located in the Big Piney/LaBarge Producing Complex in the northern Moxa Arch area of the Green River Basin. As of this writing, three restimulation treatments have been performed at this location. The second site is the combined Rulison, Parachute, and Grand Valley fields in the Piceance Basin. Candidate selection has been completed, and actual field-testing and restimulation activities are expected to begin in July 1999. The third site is the Carthage field in East Texas. Candidate selections are complete at this site also, with field activities also scheduled to begin in July. The fourth test site, not yet active, is in the Wilcox Lobo Trend of South Texas. This paper is the first comprehensive publication of results from this recent GRI init...