Carbonate coreflood experiments have demonstrated the potential for Advanced Ion Management (AIM℠) to significantly increase oil recovery compared to waterflood using formation water. AIM℠ involves adding and/or removing ions from the injection water to improve waterflood performance. AIM℠ further improves current state-of-the-art processes through the addition of certain salts and/or the softening of water. Simulations have accurately matched the experiments, providing a tool to assess field-scale recovery and supporting the premise that the increased recovery is due to a change in wettability. This study shows that a relatively inexpensive and straightforward modification of injection water composition can significantly increase oil recovery.
Time-lapse (4D) analysis of legacy seismic data presents unique challenges, as neither the acquisition nor the processing is designed for seismic monitoring. Two legacy seismic data sets from the Lena Field, Gulf of Mexico, are analyzed for time-lapse effects. The analysis involves post-stack processing of the legacy seismic data, including cross-equalization and residual migration, and the definition of a new suite of 4D seismic attributes. These new attributes are used in both processing and interpretation. The time-lapse differences are interpreted using forward modeling and production data. The 4D difference anomaly is interpreted to be the result of gas cap expansion. The identification of potentially bypassed oil based on this interpretation may affect future drilling decisions.

Introduction
Seismic monitoring (time-lapse or 4D seismic) has the potential to significantly increase recovery in existing and new fields. However, there are many issues associated with the application of time-lapse seismic data. Two of the most significant are the repeatability of the seismic data in the non-reservoir portion of the data volume and the robustness and credibility of the seismic difference anomaly within the reservoir (Ross et al. 1996, 1997). While future field developments should benefit from seismic acquisition designed for time-lapse monitoring, the portfolio of current seismic monitoring opportunities for most companies consists of existing fields for which one or more 3D seismic surveys have already been acquired. These legacy seismic data sets were not acquired for the purposes of seismic monitoring and are often very different in terms of acquisition and processing parameters. In addition, the seismic acquisition is rarely timed to optimally map reservoir changes or impact development decisions.
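Repeatability between surveys is often quantified with the normalized RMS (NRMS) difference between base and monitor traces. The paper does not state which metric it used, so the sketch below only illustrates the standard NRMS formula on synthetic traces (not Lena Field data):

```python
import numpy as np

def nrms(base: np.ndarray, monitor: np.ndarray) -> float:
    """Normalized RMS difference between two traces, in percent.

    NRMS = 200 * RMS(monitor - base) / (RMS(monitor) + RMS(base))
    0% means identical traces; ~141% corresponds to uncorrelated noise.
    """
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 200.0 * rms(monitor - base) / (rms(monitor) + rms(base))

# Synthetic 30 Hz trace over a 1 s window (illustrative only).
t = np.linspace(0.0, 1.0, 500)
trace = np.sin(2.0 * np.pi * 30.0 * t)

print(nrms(trace, trace))                 # → 0.0 (perfect repeat)
print(round(nrms(trace, 1.1 * trace), 1)) # → 9.5 (10% amplitude change)
```

A perfect repeat gives 0% NRMS; a small production-related amplitude change yields a small but nonzero NRMS, which must stand out against the NRMS measured outside the reservoir interval for the difference to be interpretable.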
Seismic repeatability is sufficient for time-lapse interpretation if the seismic differences in the region of interest are substantially greater than the differences outside the region of interest. The smaller the change in the seismic response due to production, the greater the repeatability required of the seismic data. Seismic modeling incorporating rock physics and reservoir simulation can help estimate the magnitude of reservoir changes, but repeatability and interpretability can only be determined by the analysis of multiple seismic surveys. The main goal of this study is to understand the magnitude of the processing effort required to obtain reliable time-lapse differences. The reliability of the seismic difference is measured by repeatability in the seismic volume and the reconciliation of the time-lapse anomaly with geologic and production data.

Geologic Setting
The Lena Field (Mississippi Canyon Block 281) is located south of the modern Mississippi delta in 1,000 feet of water. The field is situated on the western flank of a salt dome within a fault-bounded intraslope basin. Hydrocarbon production is from six Pliocene-age sands. The B80 reservoir is located about 10,000 feet below sea level, at about 3 seconds seismic two-way time (TWT). The interval is interpreted as a lowstand fan systems tract representing deposition in distributary lobes composed of amalgamated and channelized turbidites. The updip limit of the sands lies about 2,000 feet west of the salt flank, and the reservoir thickens basinward to the west.
Asset performance can often be improved through continuous monitoring and/or through better utilization of information extracted from the high-frequency data that are becoming more readily available in today's digital world. ExxonMobil has a long history of applying advanced technologies in asset management. Today, we continue to use new hardware, integrated software, and improved data infrastructure to enhance asset management workflows. ExxonMobil is taking an enterprise-wide approach (Reece et al. 2008) to implementing digital technology in asset management. This paper presents four examples where ExxonMobil has taken advantage of high-frequency data for timely asset management decisions. These four examples represent implementation at four different operational scales: reservoir management, well management, facility management, and plant management. The four examples are:
- For a West African reservoir: permanent downhole pressure gauge data added value in reservoir modeling, which in turn provided a method for calculating reservoir rates.
- For South Texas gas wells: real-time data access and charting capabilities were implemented, and advanced data analysis was explored to identify well events and manage well workover activities, aided by artificial intelligence.
- For Norwegian oil fields: an integrated facility model was developed and tuned for surveillance and operation of a network of wells and production facilities shared by multiple fields.
- For an Australian production plant complex: production from offshore platforms, a gas plant, a crude stabilization plant, a fractionation plant, and a tank farm was optimized with high-frequency data and automatic process control.

Introduction
ExxonMobil has a long history of applying advanced technologies in asset management. As early as 1967, a CPC (Computerized Production Control) program was implemented in the company's US production assets.
More recently, various ExxonMobil business units have implemented permanent monitoring (Chorneyko, 2006), automated artificial lift systems, and other digital technologies for efficiency improvement. Since 2000, the company has moved toward global standardization and best practices. The EM2010 program (Reece et al., 2005), which is currently in progress, has developed a business-driven vision and roadmap to a more integrated and automated subsurface work environment. Today, advances in digital technology are delivering many software, hardware, and infrastructure improvements that provide capabilities in remote real-time data access, right-time data analysis, right-time visualization, right-time optimization, and on-demand remote operability. The intersection of digital technology and asset management is an area that, if exploited, can improve operation and recovery. We believe these opportunities will increase as the use of digital technologies in asset management grows and as digital technology allows semi-automation of workflows. The digital oil field has been a corporate strategic focus area since 2000 for many companies in the oil and gas industry. Major international oil companies have trademarked their approaches (e.g., Smart Fields™, Field of the Future™, and i-Fields℠). ExxonMobil's approach to digital oilfields is to focus on the application of digital technologies to improve and enhance asset management workflows, fully leveraging our global functional organization and worldwide standardized asset management processes.
A number of new technologies have been developed offering new capabilities for reservoir monitoring. A group at ExxonMobil has been looking into the impact of these measurements on reservoir management. The result has been one vision for the instrumented oil field. Some key points of this vision are presented in this paper.

Introduction
We define an instrumented oil field as consisting of downhole and surface equipment designed to provide real-time information about well and reservoir conditions. Although remote well control is sometimes included, our focus is on measurements, and in particular, permanent downhole measurements. We begin by outlining the business drivers for these measurements, followed by how new technologies have made these measurements feasible. Example applications are given for downhole pressure, temperature, and flow measurements, followed by applications for permanently deployed seismic receivers. Finally, a summary of key points of the ExxonMobil model for the instrumented oil field is presented. ExxonMobil has some experience with downhole sensors: since 1989, we have installed approximately 300 permanent downhole instruments in 10 different producing areas for numerous applications and environments.

Main Business Driver
The main business driver for the development of downhole sensor technology is to maximize the recovery of oil and gas. With continuous downhole measurements we can optimize field performance at both the reservoir level and for individual wells. With the information provided by downhole measurements we can also reduce, and possibly eliminate, unnecessary well intervention costs and the associated risk. Information can be obtained in time to make proactive decisions instead of reacting to crises as they occur, and this information is used to plan facilities upgrades and additional wells through the full life cycle of the field.
Primary Reservoir Borehole Measurements
New technology is making it possible to acquire real-time continuous measurements in the borehole. Fiber optics and other technologies have made it possible to monitor reservoir pressure, temperature, and eventually flow continuously in time, with denser spacing and greater accuracy and resolution. Continuous pressure monitoring from downhole gauges can aid reservoir optimization. Measurements can be used to check and update the simulation model, and downhole measurements can aid in reservoir definition and description via production well tests. Figure 1 shows a production interference test. In this example, downhole pressure data are available for 80% of the wells. Pressure data are recorded at a 1-second sample rate and are accessible via the Internet. The figure on the right is a map of reservoir porosity. The map indicates poor connectivity between well A and well B. A production interference test was run for these wells, and the graph on the left shows the pressure measurements from a downhole sensor in well B over the time period after initiating production in well A. The graph shows a 3.25 psi pressure drop over a 5-day period, indicating greater connectivity between A and B than expected from the porosity map. Another important application for downhole monitoring is well optimization: downhole sensors can provide earlier identification and diagnosis of production problems.
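The interference-test interpretation above amounts to extracting a small drawdown trend from gauge data. As an illustrative sketch only, the Python snippet below fits a least-squares line to idealized daily pressure samples consistent with the reported 3.25 psi drop over 5 days; the values are made up for illustration, not the actual Well B gauge record:

```python
import numpy as np

# Hypothetical gauge samples: days since Well A began producing,
# and pressure at the Well B downhole sensor (psi). Idealized values
# consistent with a 3.25 psi drop over 5 days; not real field data.
days = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
pressure = np.array([4000.00, 3999.35, 3998.70, 3998.05, 3997.40, 3996.75])

# Least-squares linear trend: pressure ≈ slope * days + intercept.
slope, intercept = np.polyfit(days, pressure, 1)
total_drop = -slope * (days[-1] - days[0])

print(f"drawdown rate: {slope:.2f} psi/day, drop over 5 days: {total_drop:.2f} psi")
# → drawdown rate: -0.65 psi/day, drop over 5 days: 3.25 psi
```

In practice, 1-second gauge data would first be decimated and denoised, and the trend would be separated from tidal and operational pressure transients before drawing connectivity conclusions.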