Inorganic scale precipitation and deposition in oil and gas wells can cause significant production loss, resulting in additional operational expenditure (OPEX) and health, safety, and environmental (HSE) risks. Scale management requires a detailed understanding of production rates, hydrocarbon and produced water compositions, and reservoir conditions. Accurate real-time analysis of produced water compositions can immediately identify scaling risks in a production well and can lead to significantly reduced production loss, optimized chemical dosages, and fewer workovers, consequently lowering OPEX and mitigating HSE risk. This paper introduces the development of a device capable of measuring the most critical parameters associated with inorganic scale in flowing produced water, including pH, alkalinity, strontium, barium, sulfate, total hardness, total dissolved solids (TDS), and others. Several methods were tested for measuring these water properties, and a combination of spectrophotometric and other methods was ultimately determined to be effective. One challenge of spectrophotometric methods is reagent stability over time. Hence, customized reagents were prepared for this application and their stability was tested over time. Specific calibration methods were designed to extend the usable life of the reagents. Static measurements were performed first, and the results showed precise measurements of all the parameters. Results from dynamic tests utilizing real-time flow agreed with the static tests, and the accuracy was confirmed by traditional methods. Once the device prototype was built in our laboratories, production fluids were used to test the complete device. This device can be placed at various attachment points from the wellhead to the separator.
This automated device is capable of collecting a discrete production fluid sample, separating produced water from the bulk phase, and measuring various properties of the produced water. These properties are reported electronically and used as part of a combined real-time scale risk prevention system. In addition, this device measures parameters while maintaining wellhead pressure and temperature in order to eliminate potential measurement errors; for instance, the pH of water changes due to degassing and precipitation caused by changes in pressure and temperature. A field trial is planned to test the device under full flowing conditions. This will be the first automated real-time produced water composition monitoring device with high measurement accuracy while maintaining the pressure and temperature of samples, and it can be attached at various points from wellhead to separator. This can be beneficial for identifying scaling risk in production wells before severe scaling occurs. The device is designed to enhance the reliability of water property measurements, provide real-time measurements, and reduce downtime and costs associated with scale problems and sampling.
Mineral scale formation in the oil and gas industry can be detrimental to production and injection wells and facilities, yielding production losses and integrity issues. In order to manage scaling risk during operation, knowledge of production rates, GOR, WCT%, and hydrocarbon and produced water compositions, along with reservoir pressure and temperature, is required. Scale management relies on accurately monitoring produced water composition. In addition, the timing of changes in water composition is critical to proactively protect the production system from scaling. Preventing scale formation results in significantly reduced production loss, optimized chemical dosage, and fewer well interventions, consequently lowering OPEX and mitigating HSE risks. This paper documents advancements made since the authors' publication last year on the real-time produced water composition measurement device. The device can automatically collect a discrete production fluid sample at the wellhead while keeping the pressure and temperature of the sample constant, separate produced water from the bulk phase, and measure multiple properties: pH, alkalinity, strontium, barium, calcium, sulfate, total hardness, and total dissolved solids (TDS) at the pressure and temperature of the sampling point, without degassing. It also reports the pressure and temperature of the sampling point. This paper illustrates the automated calcium measurement capability developed and a validation of the laboratory prototype by testing the device with four produced water samples from different formations: i) Permian Basin, ii) Bakken, iii) Eagle Ford, and iv) synthetic produced water from an oil field in Abu Dhabi. Results show excellent performance of the device for calcium measurement. For example, the mean of the device-measured value for calcium based on internal calibration was 42.84 mg/l with a standard deviation of 0.084, compared with the laboratory reference method ICP-OES measurement of 42.63 mg/l, which shows very accurate measurement of calcium. This paper presents the reliability and repeatability of the device in addition to the accuracy of the results compared with ICP-OES and traditional laboratory analysis methods. The results obtained from production fluids were promising. Laboratory validation is the last step before field testing the device. The success of the laboratory validation stage is crucial as the device progresses through final design changes ahead of field trials. Measuring produced waters with different salinities promotes the device's reliability and fills the gap between synthetic water composition measurement in the laboratory and the field trial. The device was developed to enhance the reliability of water property measurements, provide real-time measurements, and reduce downtime and costs associated with workovers, interventions, and sampling.
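The calcium accuracy quoted above can be summarized with two simple figures of merit. The short sketch below (Python is an assumption; the paper does not describe its analysis software) computes the relative error of the device mean against the ICP-OES reference and the coefficient of variation as a repeatability measure, using only the numbers stated in the abstract.

```python
# Figures of merit for the calcium measurement quoted in the abstract:
# device mean 42.84 mg/l, standard deviation 0.084 mg/l, versus an
# ICP-OES laboratory reference of 42.63 mg/l.

device_mean = 42.84   # mg/l, device internal-calibration mean
device_std = 0.084    # mg/l, device standard deviation
reference = 42.63     # mg/l, ICP-OES laboratory reference

# Relative error of the device mean against the reference value
relative_error = abs(device_mean - reference) / reference * 100.0

# Coefficient of variation as a repeatability measure
cv = device_std / device_mean * 100.0

print(f"relative error: {relative_error:.2f}%")        # ~0.49%
print(f"coefficient of variation: {cv:.2f}%")          # ~0.20%
```

Both values are well under 1%, which is consistent with the abstract's characterization of the measurement as very accurate and repeatable.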
To achieve the new increased production rate of a giant oilfield offshore Abu Dhabi, the operator started implementing a number of new technologies which push back the boundaries of technical expertise. Foremost among these is the construction of four artificial islands from which wells are being drilled using extended reach drilling (ERD) technology and the use of maximum reservoir contact (MRC) wells to maximise potential production. To ensure that the maximum benefit was obtained from these MRC wells, an extensive research and testing program was conducted to select an optimised completion fluid for use in the island wells to improve and maximise production. The resulting selected sodium bromide/sodium chloride (NaBr/NaCl) based completion fluid was used in the pilot MRC wells and is now being used successfully in new wells on the islands with significant improvement in production. However, shortly after the start of production from one well, an unexpected production upset occurred with the appearance of significant quantities of solids at the surface non-return valve (NRV). Laboratory analysis of these solids confirmed that they consisted primarily of mineral scales, with sodium bromide (NaBr) and sodium chloride (NaCl) predominant. To avoid major disruptions in the production system, the following key questions during the investigation required urgent answers: Why were these salts depositing in this production system? Why were similar issues not reported in other wells using the same completion brine? Were there any specific differences in this well or its operation that could cause the solids to form? Are any mitigation measures required for the continued use of the NaBr/NaCl completion brine in the island wells? This paper describes the detective work undertaken to investigate the unexpected appearance of these solids in the production system.
This involved a combination of field data review, mineral scale prediction modelling, and laboratory tests and analysis. The cause of the solids deposition was identified as the interaction of the completion brine with the hydrocarbon phase at very low water cuts. It should be noted that other wells are also being produced at low water cuts without any similar issues occurring. By investigating and understanding this problem more fully, a set of guidelines and an improved testing protocol could be developed for use in any future investigation. At the time of writing this paper, no further problems have occurred in any of the subsequent wells drilled and produced.
A new generation of sophisticated wells is to be drilled in a giant offshore field in Abu Dhabi. These extended reach wells will reach and produce remote areas of the field from artificially built islands. The first maximum reservoir contact (MRC) pilot well was drilled successfully with 20,000 ft total depth and a 10,000 ft producing interval completed with inflow control devices (ICDs). The well started producing dry oil at a reasonably high flow rate with uniform contribution from each ICD compartment. After a relatively short period of dry oil production, water breakthrough occurred and started causing unexpected downhole flow assurance concerns, especially at the locations of the ICDs. Production logging along with caliper data showed severe corrosion around water-producing ICD nozzles and high gamma ray readings in the water-producing intervals, suggesting that scale had precipitated behind pipe either in the near-wellbore rock (believed to be unlikely based on the history of scale precipitation in the field) or on the outside of the completion (thought to be more likely). An investigation was launched to determine the cause of the corrosion around the ICD nozzles and improve understanding of the influence of ICD flow dynamics on scale deposition and material corrosion, with the objective of developing a suitable mitigation method for scale and corrosion in future MRC/ICD wells. The study results showed that the high wall shear stresses developed in and around the water-producing ICD nozzle area are at the origin of the corrosion in the pilot well, and that low-to-moderate wall shear stresses on and around the wire-wrapped screen sections of the ICDs could be the cause of the scale deposition. Consequently, the completion design for future wells with ICD completions has been revised to mitigate the risks of flow-induced corrosion.
This paper will present results of the laboratory study into the influence of ICD flow dynamics on corrosion rates and mineral scale precipitation and discuss strategies for mitigating both concerns via the design of the ICD.
Most of the existing wells in a giant offshore oil field in Abu Dhabi are equipped with L80-13Cr corrosion resistant alloy (CRA) tubulars to provide protection from CO2 corrosion due to the sweet nature of the reservoir. Recently, some of the wells are showing the presence of mild H2S due to unexpected reservoir souring or other geological changes. The presence of H2S in production fluids raises concerns about sulfide stress cracking (SSC) of L80-13Cr. As L80-13Cr CRA has been known to have limited SSC resistance, it is important to understand the maximum acceptable limit of H2S in production fluids for safe operation. Industry standards such as ISO 15156/NACE MR0175 and NORSOK M-001 recommend safe acceptable limits of H2S for 13Cr tubular materials based on the partial pressure of H2S. However, these approaches do not take into account the effect of temperature or the non-ideal gas behavior of H2S at high pressure. Pressure, temperature, salinity, and pH in the wellbore impact the solubility and chemical behavior of H2S in the water phase, which defines the corrosive environment to which the material is exposed. Therefore, it is important to include non-ideal gas and solution behaviors in order to define the acceptable limit of H2S for fitness-for-service (FFS) material evaluations. In this work, the acceptable limit of H2S in the wellbore was determined using a combination of thermodynamic modeling and field corrosion data. A molecular thermodynamics approach was used to calculate pH and dissolved H2S levels in water along the production tubing length. Shut-in and production operation scenarios were simulated to identify the worst-case scenario using thermal modeling software. Furthermore, tubing inspections were conducted using a multi-finger caliper tool to identify any corrosion damage. All of this information was used to identify the acceptable limit for H2S in the wellbore.
This approach to determining acceptable H2S limits avoids unnecessary workovers and enables cost savings through continued use of existing materials. Furthermore, it supports the development of a corrosion monitoring plan and FFS assessment of tubulars based on the wellbore environment.
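The abstract above argues that partial-pressure-based limits ignore temperature and non-ideal behavior, and that dissolved H2S in the water phase defines the corrosive environment. As a rough illustration of why temperature matters, the sketch below applies a simple Henry's-law estimate of dissolved H2S with a van 't Hoff temperature correction. The Henry's constant and its temperature-dependence slope are approximate textbook values for H2S in pure water, not parameters from the paper, and the actual study used a full molecular thermodynamics model; this is only a minimal sketch of the underlying idea.

```python
import math

# Rough Henry's-law estimate of dissolved H2S from its partial pressure.
# KH_298 and DLNKH_DINVT are approximate literature values for H2S in
# pure water, NOT parameters from the paper; the study itself used a
# molecular thermodynamics model including non-ideal gas and solution
# behavior, salinity, and pH.

KH_298 = 0.087        # mol/(L*atm), Henry's constant at 298.15 K (approx.)
DLNKH_DINVT = 2100.0  # K, d(ln kH)/d(1/T), van 't Hoff slope (approx.)

def dissolved_h2s(p_h2s_atm: float, temp_k: float) -> float:
    """Estimate dissolved H2S (mol/L) via Henry's law with a
    van 't Hoff temperature correction."""
    kh = KH_298 * math.exp(DLNKH_DINVT * (1.0 / temp_k - 1.0 / 298.15))
    return kh * p_h2s_atm

# Same partial pressure, two wellbore temperatures: solubility (and
# hence the aqueous-phase exposure) drops as temperature rises, which a
# fixed partial-pressure limit cannot capture.
p = 0.05  # atm, hypothetical H2S partial pressure
c_cool = dissolved_h2s(p, 298.15)  # near-surface conditions
c_hot = dissolved_h2s(p, 373.15)   # hotter, downhole-like conditions
print(f"{c_cool:.5f} mol/L at 298 K vs {c_hot:.5f} mol/L at 373 K")
```

At the same partial pressure, the estimated dissolved concentration falls by several-fold between the two temperatures, illustrating why a single partial-pressure threshold can be overly conservative (or non-conservative) depending on where in the wellbore it is applied.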