A dual-sensor towed streamer records the pressure and the vertical component of particle motion associated with the incident wavefield, which may be used to separate the wavefield into its up- and downgoing parts. This procedure requires information about the water properties (wave-propagation velocity and density) and is robust to errors in the estimation of these quantities of the magnitude likely to be encountered. In practice, the particle-motion data recorded by current towed marine streamers suffer very strong mechanical noise, so that for the lowest frequencies the wavefield separation must be approximated by deconvolving the ghost function from the pressure data. This procedure requires information about the streamer depth and is robust to small depth errors over the frequency range for which it is required in dual-sensor streamer processing, but it is much more sensitive if applied over the full bandwidth necessary to deghost pressure data acquired at a conventional streamer depth. The signal-to-noise ratio can be further enhanced by recombining the up- and downgoing pressure fields at the sea surface, which has the effect of applying a ghost-like filter to noise recorded by only one of the two sensors. In practical marine acquisition scenarios, spatial sampling is often insufficient to yield an accurate result, especially in the crossline direction. If each streamer is processed independently under the assumption that the wavefield propagation is purely inline, significant errors can be introduced. For arrivals with high emergence angles, errors may be introduced even when the propagation actually is purely inline, owing to incorrect treatment of spatially aliased energy. However, these effects are almost entirely confined to very shallow events.
They can be mitigated by using independently derived information about the crossline propagation angle and, for data comprising predominantly forward scattered energy, appropriate application of linear moveout.
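The low-frequency ghost deconvolution mentioned above can be sketched as follows. This is a minimal illustration at normal incidence, assuming a flat sea surface with reflection coefficient -1 and a single ghost delay tau = 2d/c; it is not any vendor's production algorithm, and the stabilization constant `eps` is an arbitrary choice:

```python
import numpy as np

# Receiver-ghost operator for pressure data at normal incidence,
# assuming a flat sea surface with reflection coefficient -1,
# streamer depth d (m) and water velocity c (m/s).
def ghost_operator(freqs_hz, depth_m, c=1500.0):
    tau = 2.0 * depth_m / c          # two-way delay to the surface
    return 1.0 - np.exp(-2j * np.pi * freqs_hz * tau)

# Stabilized deghosting: divide the pressure spectrum by the ghost
# operator with a white-noise floor eps, so that noise is not blown
# up at the ghost notches f = n / tau.
def deghost(pressure_spectrum, freqs_hz, depth_m, c=1500.0, eps=0.1):
    g = ghost_operator(freqs_hz, depth_m, c)
    return pressure_spectrum * np.conj(g) / (np.abs(g) ** 2 + eps)

freqs = np.linspace(0.0, 125.0, 501)
g = ghost_operator(freqs, depth_m=15.0)
# For a 15 m tow depth the first non-zero notch sits at
# f = c / (2 d) = 1500 / 30 = 50 Hz, where |G| -> 0.
notch = freqs[np.argmin(np.abs(g[1:])) + 1]
print(notch)  # close to 50 Hz
```

The notch locations make the depth sensitivity discussed above concrete: a depth error shifts every notch, and the stabilized inverse amplifies whatever sits near the (wrongly placed) notch frequencies.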
In the marine seismic industry, the volume of recorded and processed seismic data is continuously increasing and can become very large. Hence, applying compression algorithms specifically designed for seismic data at an early stage of the seismic processing sequence helps to reduce the cost of storage and data transfer. Dictionary learning methods have been shown to provide state-of-the-art results for seismic data compression. These methods capture similar events from the seismic data and store them in a dictionary of atoms that can be used to represent the data sparsely. However, as with conventional compression algorithms, these methods still require the data to be decompressed before a processing or imaging step is carried out. Parabolic dictionary learning is a dictionary learning method in which the learned atoms follow a parabolic travel-time moveout and are characterized by kinematic parameters such as slope and curvature. In this paper, we present a novel method in which such kinematic parameters are used to perform the dual-sensor (or two-component) wavefield-separation processing step directly in the dictionary-learning compressed domain for 2D seismic data. Based on a synthetic seismic data set, we demonstrate that our method achieves results similar to those of an industry-standard FK-based method for wavefield separation, with the advantage of being robust to spatial aliasing without the need for data preconditioning such as interpolation, while reaching a compression ratio of around 13. Using a field data set from a marine seismic acquisition, we observe insignificant differences in a 2D stacked seismic section between the two methods, while reaching a compression ratio higher than 15 with our method. Such a method could allow full-bandwidth data transfer from vessels to onshore processing centres, where the compressed data could be used to reconstruct not only the recorded data sets but also the up- and down-going parts of the wavefield.
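As a toy illustration of the sparse-representation idea underlying such compression (not the paper's parabolic dictionary learning, which learns its atoms from the data), the following sketch encodes a signal as a handful of atom-coefficient pairs using matching pursuit against a hand-built random dictionary:

```python
import numpy as np

# Matching pursuit greedily picks the atom most correlated with the
# residual, so a signal is stored as a few (atom index, coefficient)
# pairs instead of all its samples -- the basis of the compression.
def matching_pursuit(signal, dictionary, n_atoms):
    residual = signal.astype(float).copy()
    code = {}
    for _ in range(n_atoms):
        corr = dictionary.T @ residual           # correlation with every atom
        k = int(np.argmax(np.abs(corr)))         # best-matching atom
        code[k] = code.get(k, 0.0) + corr[k]     # accumulate its coefficient
        residual -= corr[k] * dictionary[:, k]   # strip it from the residual
    return code, residual

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)                   # unit-norm atoms
x = 3.0 * D[:, 5] - 2.0 * D[:, 40]               # signal built from 2 atoms
code, res = matching_pursuit(x, D, n_atoms=10)
print(sorted(code), np.linalg.norm(res))         # atoms 5 and 40 dominate
```

If the atoms additionally carry kinematic parameters (slope, curvature), as in parabolic dictionary learning, those parameters remain available in the compressed code, which is what permits processing steps such as wavefield separation without decompression.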
discuss the lessons learnt from the roll-out of dual-sensor technology in PGS' fleet.

The launch of the dual-sensor towed streamer technology in 2007 is seen by many in the industry as the most important milestone in marine seismic technology in the last decade. The introduction of the technology triggered significant interest in and demand for broader-bandwidth seismic data and increased industry-wide awareness of the geophysical benefits of such broadband data for both frontier exploration and production monitoring in mature basins. It also resulted in the rapid development of new acquisition and processing technology, on both the source and receiver sides, as well as changes to seismic vessel design and equipment. The geophysical benefits of broadband data and the availability of up- and down-going wavefields as part of the dual-sensor deghosting methodology are now routinely exploited throughout the entire seismic value chain, including seismic imaging and reservoir characterization.

After the first 2D dual-sensor survey in 2007, which was quickly followed by the first 3D acquisition commencing on New Year's Eve 2008, PGS has steadily converted its seismic fleet from hydrophone-only to dual-sensor streamers. The pace of the technology roll-out has been largely driven by the life cycle of the existing streamer inventory and the equipment needs of newly launched seismic vessels as part of an ongoing fleet-renewal process. The fleet-wide roll-out of dual-sensor technology will finally be completed in the fourth quarter of 2015 with the upgrade of the last Ramform vessel. Given the scale and complexity of replacing and industrializing a complete acquisition platform, there have naturally been significant lessons, some of which we will share in this article. We will also discuss some of the acquisition and processing technologies that have been developed and/or adapted in order to fully utilise this new marine seismic technology platform.
Robust ghost removal for deeper streamer tow

To be successful in today's challenging E&P environment, petroleum geoscientists must detect and properly image increasingly complex reservoirs by resolving the fine detail of ever smaller hydrocarbon accumulations. High-quality seismic data play a key role in this task and are of great significance in the effort to reduce overall E&P risk. The demands placed on modern seismic data are manifold, but critically the data need to enable the identification and delineation of leads/prospects based on pre-stack seismic data, and to quantify key reservoir properties in order to increase the probability of successfully separating lithology-fluid facies. All of these goals must be achieved in 3D using all the dimensions of the seismic data, mainly pre-stack, and later on in 4D (time-lapse). It has been well understood for some time that data richer in both low- and high-frequency information would form the optimum input for improved reservoir delineation and high-resolution imaging, and that improvements in the signal-to-noise ratio of the recorded data ...
Summary
Increasing and decreasing natural gas pipeline inventory ("packing" and "drafting") are examined mathematically. Any line segment's unsteady-state packing or drafting behavior depends on only two dimensionless parameters, ( ) and ( ). The influence of ( ) is small, so that for any value of ( ) the behavior of all pipelines can be represented on a single plot; four such plots are shown for four different boundary conditions.

Introduction
Natural gas dispatchers use increases and decreases of the stored inventory of gas in their pipes as one method of matching time-varying demands with supplies, which generally have less time variation. In pipeline terminology, increasing the inventory (and hence the pressure) is called "line packing," while decreasing it is called "line drafting." This paper examines the limits of this procedure, asking both how much gas can be added to or subtracted from the inventory in a given pipeline segment and how rapidly this can be accomplished.

How Rapidly Can the Inventory Be Depleted or Restored?
Although the questions of how large the inventory is and how much can be taken from it for any change in steady-state conditions can be answered very simply, computing how rapidly this can be accomplished requires a set of coupled partial differential equations and a numerical solution on a computer. Fortunately, as shown here, the nondimensional results can be summarized in ways that are fairly easy to use.
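The "how much" part of the question is simple algebra. The sketch below is a back-of-envelope estimate, assuming a uniform average pressure and a constant compressibility factor z (both simplifications of the paper's analysis), converting a segment's inventory to standard cubic metres:

```python
import math

# Back-of-envelope linepack estimate (real-gas law with a constant
# compressibility factor z). The "how rapidly" question needs the
# coupled PDEs discussed in the paper; "how much" does not.
def linepack_std_m3(diameter_m, length_m, p_avg_pa, t_k,
                    z=0.90, p_std=101325.0, t_std=288.15):
    """Gas inventory of a pipe segment, in standard cubic metres."""
    volume = math.pi * (diameter_m / 2.0) ** 2 * length_m
    return volume * (p_avg_pa / p_std) * (t_std / t_k) / z

# Example: a 0.9 m pipe, 100 km long, packed from an average of
# 60 bar to 70 bar at 288 K.
before = linepack_std_m3(0.9, 100e3, 60e5, 288.0)
after = linepack_std_m3(0.9, 100e3, 70e5, 288.0)
print(round(after - before))  # extra standard m3 made available by packing
```

In practice z varies with pressure and temperature, so a dispatcher would evaluate it at the segment's average conditions rather than hold it fixed as done here.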
Mathematical Theory
It has been known since at least 1951 that the mathematical description of the unsteady-state flow of a gas in a long pipeline is governed by a material-balance equation,

............................(1)

and a momentum-balance equation,

............................(2)

Expressed with pressure and mass flow rate as dependent variables and length and time as independent variables, and assuming that certain terms are much smaller than the others, the mass-balance and momentum equations become

............................(3)

and

............................(4)

respectively. The justification for dropping those terms as being much smaller than the others is that they change very slowly with changes in the other variables.
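For reference, the standard isothermal formulation of unsteady gas-pipeline flow takes forms similar to the following (a hedged reconstruction using conventional symbols; the paper's exact notation and simplifications may differ):

```latex
% Mass (material) balance, cf. Eq. (1):
\frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} = 0
% Momentum balance with Fanning-type wall friction, cf. Eq. (2):
\rho\frac{\partial u}{\partial t} + \rho u\frac{\partial u}{\partial x}
  + \frac{\partial p}{\partial x} + \frac{f \rho u\,|u|}{2D} = 0
% With pressure p and mass flow rate \dot m = \rho u A as dependent
% variables, isothermal sound speed c (p = c^2 \rho), and the inertia
% terms dropped as small, cf. Eqs. (3) and (4):
\frac{\partial p}{\partial t} + \frac{c^2}{A}\,\frac{\partial \dot m}{\partial x} = 0,
\qquad
\frac{\partial p}{\partial x} + \frac{f c^2 \dot m\,|\dot m|}{2 D A^2 p} = 0
```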