We have developed and implemented a robust and practical scheme for anisotropic 3D acoustic full-waveform inversion (FWI). We demonstrate this scheme on a field data set, applying it to a 4C ocean-bottom survey over the Tommeliten Alpha field in the North Sea. This shallow-water data set provides good azimuthal coverage to offsets of 7 km, with reduced coverage to a maximum offset of about 11 km. The reservoir lies at the crest of a high-velocity antiformal chalk section, overlain by about 3000 m of clastics within which a low-velocity gas cloud produces a seismically obscured area. We inverted only the hydrophone data, retaining free-surface multiples and ghosts within the field data, and we inverted in six narrow frequency bands in the range 3 to 6.5 Hz. At each iteration, we selected only a subset of sources, using a different subset at each iteration; this strategy is more efficient than inverting all the data at every iteration. Our starting velocity model was obtained using standard PSDM model building, including anisotropic reflection tomography, and contained epsilon values as high as 20%. The final FWI velocity model shows a network of shallow high-velocity channels that match similar features in the reflection data. Deeper in the section, the FWI velocity model reveals a sharper and more intense low-velocity region associated with the gas cloud, in which low-velocity fingers match the locations of gas-filled faults visible in the reflection data. The resulting velocity model provides a better match to well logs, and better flattens common-image gathers, than does the starting model. Reverse-time migration using the FWI velocity model provides significant uplift to the migrated image, simplifying the planform of the reservoir section at depth. The workflows, inversion strategy, and algorithms that we have used have broad application to a wide range of analogous data sets.
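The per-iteration source-subsampling strategy described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the shot count, subset size, and per-iteration seeding are hypothetical choices made only to show the idea of drawing a different random subset of sources at each FWI iteration.

```python
import numpy as np

# Hypothetical survey geometry (illustrative values, not from the paper).
N_SOURCES = 1440    # total number of shots in the survey
SUBSET_SIZE = 80    # shots used per FWI iteration

def pick_shot_subset(iteration):
    """Return the indices of the shots used at this iteration.

    Seeding the generator with the iteration number makes the schedule
    reproducible while still giving a different draw each iteration, so
    over many iterations all sources contribute to the gradient.
    """
    rng = np.random.default_rng(seed=iteration)
    return rng.choice(N_SOURCES, size=SUBSET_SIZE, replace=False)

# Two consecutive iterations see different shot subsets.
subset_1 = pick_shot_subset(1)
subset_2 = pick_shot_subset(2)
print(len(subset_1), np.array_equal(subset_1, subset_2))
```

Each gradient computation then costs only `SUBSET_SIZE / N_SOURCES` of a full-data iteration, which is the efficiency gain the abstract refers to.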
In 2010, the world's largest optical permanent reservoir monitoring (PRM) system was installed in the southern part of the Norwegian North Sea at Ekofisk. The life-of-field seismic (LoFS) system consists of 3966 seabed multicomponent sensors along 200 km of mostly trenched fiber-optic seismic cables, covering about 60 km² of the Ekofisk field. Seismic data are acquired via a topside recording unit and a containerized source operated on a supply vessel. Six vintages of data were acquired between the end of 2010 and fall 2013. Different aspects of seismic operations at Ekofisk include the seismic source, recording system, data transfer, quality control, and processing. One of the key factors in achieving the full value of a PRM system is to handle such operations in a safe, integrated, and efficient manner to deliver high-quality seismic volumes for interpretation with rapid turnaround. Key aspects of the 4D processing sequence include robustness and optimal turnaround. Integration of the different operational phases of the LoFS project, and integration of expertise between client and contractor, play a key role in delivering clean, well-resolvable 4D signals and low residual 4D noise, with NRMS as low as 5%. The high-quality data delivered by operations and processing are now routinely used in well planning and reservoir-management workflows.
ABSTRACT

With the increasing use of permanently installed seismic installations, many of the issues in time-lapse seismic caused by the lack of repeatability can be reduced. However, a number of parameters still influence the degree of reliability of 4D seismic data. In this paper, the specific impact of seawater velocity variations on time-lapse repeatability is investigated in a synthetic study. A zero-lag time-lapse seabed experiment with no change in the subsurface but with velocity changes in the water column is simulated. The velocity model in the water column is constant for the baseline survey, while the model for the repeat survey is heterogeneous, designed from sea salinity and temperature measurements in the West of Shetlands. The difference section shows up to 80% residual amplitude, which highlights the poor repeatability. A new dynamic correction, which removes the effect of seawater velocity variations specifically for permanent installations, is developed. When applied to the synthetic data, it reduces the difference residual amplitude to about 3%. This technique shows substantial improvement in repeatability beyond conventional time-lapse cross-equalization.

INTRODUCTION

One of the main issues encountered during 4D seismic experiments is the degree of repeatability between successive surveys. It often determines the degree of confidence in the interpretation of the time-lapse signature. Metrics have been developed to quantify that degree of confidence. The most widely used are the NRMS (which measures the normalized root-mean-square difference residual amplitude) and the predictability (which measures the correlation between two traces). Their equations are given in Appendix A. Over the past years, factors affecting repeatability have been studied extensively, and solutions or guidelines have been laid out in order to minimize their effects.
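The two repeatability metrics are only named here, with their equations deferred to Appendix A (not reproduced in this excerpt). As a minimal sketch, the definitions below follow the commonly used Kragh and Christie style formulations and may differ in detail from those in the appendix:

```python
import numpy as np

def rms(x):
    """Root-mean-square amplitude of a trace."""
    return np.sqrt(np.mean(x ** 2))

def nrms(a, b):
    """NRMS in percent: RMS of the trace difference, normalized by the
    average RMS of the two traces. 0% = identical, 200% = opposite polarity."""
    return 200.0 * rms(a - b) / (rms(a) + rms(b))

def predictability(a, b):
    """Predictability in percent: cross-correlation energy relative to the
    autocorrelation energies of the two traces. 100% = fully predictable."""
    cc = np.correlate(a, b, mode="full")
    aa = np.correlate(a, a, mode="full")
    bb = np.correlate(b, b, mode="full")
    return 100.0 * np.sum(cc ** 2) / np.sum(aa * bb)

# Identical traces give the ideal values: NRMS = 0%, predictability = 100%.
t = np.linspace(0.0, 1.0, 500)
trace = np.sin(40.0 * t) * np.exp(-3.0 * t)
print(nrms(trace, trace), predictability(trace, trace))
```

Under these definitions, a residual amplitude of 80% as quoted in the abstract corresponds to an NRMS of 80%, and the dynamic correction bringing it to about 3% approaches the repeatability of identical traces.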
From the acquisition point of view, dedicated 4D surveys have proved to be more repeatable than the use of legacy data sets. Swanston et al. (2003) compared the repeatabilities obtained for legacy and dedicated 4D surveys. In their examples, dedicated 4D experiments tended to achieve an NRMS value of around 30% to 40%, while it is more likely to exceed 60% for 4D experiments using data not purposely acquired. Such experiments include the Alba Field (towed streamers and seabed receivers), where the NRMS is 89% (Hanson et al. 2003). Similarly, dedicated 4D processing helps to reduce the discrepancies between data sets. Smith et al. (2001) showed that careful 4D reprocessing reduced the difference residual amplitude from 74.6% to 41.3%. New 4D-specific processing techniques have also been developed to remove non-production-related difference signals. Cross-equalization (Ross, Cunningham and Weber 1996), warping (Rickett and Lumley 2001), singular-value decomposition (Reid et al. 2005) and geostatistics (Lecerf and Coleou 2002) are some of the methods that are now used to improve the quality of the 4D signature. However, some parameters, often ...