Summary

A simple model suitable for hand calculations is presented to predict temperature profiles in two-phase flowing wells. The model, developed with measured temperature data from 392 wells, assumes that heat transfer within the wellbore is steady-state. Comparisons between the model's predictions and field data indicate that the model is highly accurate within its range of application.

Introduction

Predicting accurate temperature profiles in flowing wells can greatly improve the design of production facilities in petroleum engineering. Temperature profiles support accurate two-phase-flow pressure-drop predictions, which in turn can improve an artificial-lift system design. Gas-lift design can be enhanced by more accurate prediction of temperature at valve depth; in this way, the valve's dome pressure can be set more accurately, thereby improving the predictability of valve throughput. Existing temperature correlations are often inaccurate because they do not consider the effects of different fluids in the annulus and the cooling and heating of the fluid resulting from phase change. Rigorous theoretical models are often complex and inconvenient: they depend on many variables and require information about fluid composition. This paper describes two methods for predicting the temperature profile in a flowing well. The first is a model derived from the steady-state energy equation that considers the heat-transfer mechanisms found in a wellbore. The second is a simplified version of the model intended for hand calculations. An extensive data bank of temperature profiles from 392 wells was used in its development.

Literature Review

One of the earliest works on predicting temperature profiles in a flowing well was presented by Kirkpatrick, who gave a simple flowing-temperature-gradient chart that can be used to predict gas-lift valve temperatures at the injection depth.
Much of the classic work in this area was developed by Ramey, who presented approximate methods for predicting the temperature of either a single-phase incompressible liquid or a single-phase ideal gas flowing in injection and production wells. Satter later improved Ramey's method by considering phase changes that occur within steam-injection projects. Shiu and Beggs simplified Ramey's method by developing a correlation for a specific coefficient in Ramey's equation. Willhite gave a detailed discussion of the overall heat-transfer mechanism in an injection well, and Coulter and Bardon developed a method for predicting temperatures in gas transmission lines. Complex theoretical models, such as those by Zelic, and modified versions of Ramey's method can be used to predict temperature profiles in flowing wells. All these methods require additional information about the fluid mixture composition. In addition, they are computationally complex and require the use of a computer. Such models are ideal for predicting temperature profiles associated with more difficult problems, e.g., a well flowing retrograde condensates. The model developed in this paper is based on the Coulter-Bardon equation and incorporates Ramey's and Willhite's heat-transfer mechanisms in a wellbore.
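To make the steady-state idea concrete, the sketch below follows the spirit of Ramey's approximate solution for a producing well: the fluid leaves bottomhole at the reservoir temperature and relaxes toward the linear geothermal profile over a "relaxation distance" that lumps the wellbore heat-transfer terms. The linear geothermal profile and the parameter `relax_dist` are illustrative assumptions, not the correlation developed in this paper.

```python
import math

def flowing_temperature(depth_md, t_bh, g_geo, well_depth, relax_dist):
    """Estimate flowing-fluid temperature at a given depth (Ramey-style sketch).

    depth_md   : depth of interest, ft (0 = surface)
    t_bh       : bottomhole (reservoir) temperature, degF
    g_geo      : geothermal gradient, degF/ft
    well_depth : total well depth, ft
    relax_dist : relaxation distance A, ft (lumps wellbore heat-transfer terms)
    """
    # Undisturbed earth temperature at this depth (linear geothermal profile)
    t_earth = t_bh - g_geo * (well_depth - depth_md)
    # Distance the fluid has travelled upward from the bottom of the well
    dz = well_depth - depth_md
    # Fluid starts at t_bh and relaxes toward the geothermal profile;
    # the offset g_geo * A is the steady-state lag behind the earth temperature
    return t_earth + g_geo * relax_dist * (1.0 - math.exp(-dz / relax_dist))
```

At bottomhole the function returns the reservoir temperature exactly; at surface it returns a value above the mean surface temperature, reproducing the familiar shape of a flowing-well temperature profile.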
Summary

We present a method to integrate log, core, and well-test pressure data to describe reservoir heterogeneities. The conditional simulation method of simulated annealing is used to incorporate diverse sources of data. We use analytical solutions for radially heterogeneous reservoirs to define an equivalent radial permeability and a corresponding region of investigation. By numerical experimentation on drawdown well-test simulations in heterogeneous permeability fields, we determine that a weighted-area-based geometric average of the gridblock permeabilities within the region of investigation best defines the equivalent radial permeability. This information, along with the spatial statistics from core/log data, is coded into the overall objective function of the simulated annealing algorithm to yield a consistent reservoir description.

Introduction

Integration of all available data in describing a reservoir will yield a more accurate prediction of reservoir performance. The conventional approach to reservoir description by use of geostatistics can readily accommodate the use of static (core and log) data, specifically permeability and porosity data. The statistics of the sample data can be defined, and the support volume is a function of the size of the retrieved core or the depth of investigation of the logging instrument. Permeability and porosity values can be upscaled to represent gridblocks for the purpose of reservoir simulation studies. Well-test interpretation techniques provide useful information on a scale larger than core or log data. On this scale, we can determine faults, drainage boundaries, and an average reservoir permeability. On a smaller scale, we can determine wellbore damage and whether the system is single or dual porosity. Conventional geostatistical methods are unable to incorporate well-test data.
The support volume represented by a well-test permeability needs to be determined, as well as a procedure that relates the well-test-derived permeability to the distribution of small-scale permeabilities within the reservoir. We present a methodology to include well-test data in the description of small-scale heterogeneities.
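The averaging rule identified above, a weighted-area-based geometric average of the gridblock permeabilities inside the region of investigation, can be sketched directly. The function below is an illustrative reading of that rule, not the authors' exact implementation; the weights are the gridblock areas falling inside the region.

```python
import math

def equivalent_radial_perm(perms, areas):
    """Area-weighted geometric average of gridblock permeabilities.

    perms : permeability (md) of each gridblock inside the region of investigation
    areas : area of each gridblock that falls inside the region (the weights)
    """
    total = sum(areas)
    # Geometric averaging is done in log space:
    # k_eq = exp( sum(w_i * ln k_i) / sum(w_i) )
    log_avg = sum(a * math.log(k) for k, a in zip(perms, areas)) / total
    return math.exp(log_avg)
```

With equal weights, two gridblocks of 100 md and 1 md average to 10 md rather than the 50.5 md an arithmetic mean would give, which is why geometric averaging better matches the radial-flow response of a heterogeneous field.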
Abstract

Permeability is a critical reservoir parameter that defines well and reservoir performance. Accurate knowledge of permeability is critical to forecasting rates and designing production facilities. The permeability determined from different sources can vary significantly, resulting in inconsistent estimates of well and/or reservoir performance. In this paper, permeability values and evaluation methods are discussed and verified with field data. It is shown how NMR-derived permeabilities calibrated to MDT or comprehensive core data can be used to define well/reservoir performance. By accounting for the effects of overburden, as well as pore pressure, when measuring permeability in core tests, more representative values of permeability can be determined. These values can then be used to calibrate NMR permeability data accurately. The simulation and calibration results are compared to measured well-test data. Results from this work show that, if these various sources are integrated properly, an accurate well-productivity model can be achieved.
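One minimal way to sketch the calibration step described above is a power-law fit in log space that maps uncalibrated NMR permeability onto stress-corrected core (or MDT) values. The functional form `k_cal = a * k_nmr**b` and the variable names are assumptions for illustration, not the paper's procedure.

```python
import math

def calibrate_nmr(k_nmr, k_core):
    """Fit k_cal = a * k_nmr**b by ordinary least squares in log space.

    k_nmr  : uncalibrated NMR permeability estimates, md
    k_core : matching overburden/pore-pressure-corrected core permeabilities, md
    Returns (a, b) so that a * k**b maps NMR values onto the core scale.
    """
    x = [math.log(k) for k in k_nmr]
    y = [math.log(k) for k in k_core]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # OLS slope and intercept on the log-transformed pairs
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = math.exp(my - b * mx)
    return a, b
```

Once `(a, b)` are fitted against representative core data, the same transform can be applied to the continuous NMR log to produce a calibrated permeability profile for productivity modeling.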
Real-time production workflows introduce significant changes to production management practices. New technologies such as affordable satellite communication, local field-area networks, global connectivity to the internet, collaborative workspaces (both virtual and actual war-room concepts), and data-mining agents have all contributed a number of enhancements in the production arena. Although the early gains obtained during the implementation of real-time workflows can be as large as a 50% increase in production in some assets, the long-term gains are typically on the order of 3 to 8%. The paper provides a number of approaches to quantify the value of enabling traditional production processes with real-time information and activities. It also presents some inroads and lessons learned from the deployment of a new breed of real-time services offered to support daily production-management activities. These activities include electrical submersible pump (ESP) monitoring and surveillance, and real-time productivity monitoring based on interpretation of downhole sensors and multiphase flowmeters. The analysis of over 600 installations worldwide provides insights into solutions and opportunities in varying environments, covering a wide range of challenges from the cost-sensitive North American context to security-conscious Colombian field operations and harsh-environment deployments. Among the main challenges of the real-time infrastructure for the production world is the infamous management of legacy systems (old and aging SCADA systems, archaic instrumentation, and challenging production-sharing-agreement financial terms). The paper offers a number of recommendations and establishes a methodology for the deployment of workflows powered by real-time information.
A short discussion of the relative benefits of hosted services versus fully integrated solutions housed within the asset's IT infrastructure gives operators key elements for deciding what the cost-effective long-term solution would be for a given asset.

Introduction

The need for real-time or relevant data feeds to support production workflows is not new. Permanent monitoring devices have been in use in the oilfield industry since the beginning of the century for surface applications (earliest in the form of pressure and temperature strip charts). The field of downhole instrumentation has been tackled in various ways since the early 1960s. Engel (1963) introduced an electromechanical gauge and showed data from 1959-1961. Since the mid-1970s, permanent electronic gauges have been deployed in wells on dry completions, and subsea since the early 1980s (Bezerra et al., 1992). The value of such measurements was well recognized at that time, and it is not surprising that they are becoming more and more popular in the upstream oil and gas arena. The recent evolution of various information technologies brings new possibilities for deploying monitoring services to facilitate production workflows. The Digital Oil Field of the Future (DOFF) study by Cambridge Energy Research Associates, Inc. (CERA, 2003) identifies monitoring and control as one of five key technologies that will impact the oil and gas industry in the future. With a number of such implementations already in place, it is now possible to provide examples and recommendations for a successful realization of these technologies in the production environment. However, the full realization of the value of any real-time workflow lies beyond the availability of an organized and contextualized form of the production data sets; in other words, this is not just a data-management issue. Automation of data acquisition, including data validation and data preparation, is a key aspect of a real-time workflow.
Likewise, the systematic computation of key performance indicators, and the ability to generate events from those computations so that the alarm status of well and network performance is updated automatically, are what deliver the real benefit to production management.
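The validate-compute-alert loop described above can be sketched minimally as follows. The KPI choice (a moving average of, say, ESP intake pressure), the validation rules, and the thresholds are all hypothetical placeholders, not a vendor's actual surveillance logic.

```python
def update_alarm_status(readings, low, high):
    """Validate a window of real-time readings, compute a KPI, and
    return an alarm event when the KPI leaves the [low, high] band.

    readings : recent sensor values (e.g., ESP intake pressure, psi);
               None entries represent gauge dropouts
    """
    # Data validation: discard dropouts and physically impossible values
    valid = [r for r in readings if r is not None and r > 0]
    if not valid:
        return {"status": "NO_DATA", "kpi": None}
    kpi = sum(valid) / len(valid)  # simple moving-average KPI
    # Event generation: classify the KPI against the alarm band
    if kpi < low:
        return {"status": "ALARM_LOW", "kpi": kpi}
    if kpi > high:
        return {"status": "ALARM_HIGH", "kpi": kpi}
    return {"status": "NORMAL", "kpi": kpi}
```

In a deployed workflow this function would run on each new window of streamed data, and the returned event would drive the automatic update of well and network alarm status described above.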