We propose an efficient method for detecting and extracting fault surfaces in 3D seismic volumes. The seismic data are transformed into a volume of local-fault-extraction (LFE) estimates that represent the likelihood that a given point lies on a fault surface. We partition the fault surfaces into relatively small linear portions, which are identified by analyzing tilted and rotated subvolumes throughout the region of interest. Directional filtering and thresholding further enhance the seismic discontinuities that are attributable to fault surfaces. Subsequently, the volume of LFE estimates is skeletonized, and individual fault surfaces are extracted and labeled in order of decreasing size. The final result of the proposed procedure is a visual and semantic representation of a set of well-defined, cleanly separated, one-pixel-thick, labeled fault surfaces that is readily usable for seismic interpretation.
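The final step described above, extracting individual surfaces from the skeletonized LFE volume and labeling them in order of decreasing size, can be sketched as connected-component labeling on a thresholded map. The following is a minimal 2D illustration, not the authors' implementation; the function name, the threshold parameter, and the use of 4-connectivity are assumptions:

```python
from collections import deque

def label_by_size(lfe, threshold):
    """Threshold an LFE map and label its connected components
    in order of decreasing size (label 1 = largest surface)."""
    rows, cols = len(lfe), len(lfe[0])
    mask = [[lfe[r][c] >= threshold for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    components = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Breadth-first flood fill of one 4-connected component.
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                components.append(comp)
    # Largest component receives label 1, and so on.
    components.sort(key=len, reverse=True)
    labels = [[0] * cols for _ in range(rows)]
    for lab, comp in enumerate(components, start=1):
        for y, x in comp:
            labels[y][x] = lab
    return labels
```

In a real 3D volume the same idea applies with 6- or 26-connectivity, and library routines such as `scipy.ndimage.label` would normally be used instead of an explicit flood fill.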
Abstract—Compression is a relatively recently introduced technique for seismic data operations. The main driver behind the use of compression for seismic data is the very large size of the acquired data. Some recently acquired marine seismic data sets exceed 10 Tbytes, and seismic surveys with volumes of around 120 Tbytes are currently planned. Thus, the need to compress these very large seismic data files is imperative. Nevertheless, seismic data are quite different from the typical images used in image processing and multimedia applications. Among the major differences: the dynamic range of the data can exceed 100 dB in theory; the data are very often strongly oscillatory; the different axes carry different physical meanings; and a significant amount of coherent noise is often present. Until now, the algorithms used for seismic data compression have typically been based on some form of wavelet or local cosine transform, combined with a uniform or quasi-uniform quantization scheme and, finally, a Huffman coding scheme. With this family of compression algorithms, results acceptable to geophysicists are achieved only at low to moderate compression ratios. For higher compression ratios, or at a given target decibel quality, significant compression artifacts are introduced in the reconstructed images, even with high-dimensional transforms. The objective of this paper is to achieve a higher compression ratio than the wavelet/uniform-quantization/Huffman family of compression schemes at a comparable level of residual noise, with the goal of exceeding 40 dB in the decompressed seismic data sets. Several established compression algorithms are reviewed, and some new compression algorithms are introduced. All of these compression techniques are applied to a representative collection of seismic data sets, and their results are documented in this paper.
One of the conclusions is that the adaptive multiscale local cosine transform with varying window sizes performs well on all the seismic data sets and outperforms the other methods from the SNR point of view. The described methods cover a wide range of data sets; each data set has its own best-performing method from this collection. The experiments were carried out on four different seismic data sets. Special emphasis was placed on achieving faster processing speed, another critical issue examined in the paper. Some of these algorithms are also suitable for multimedia-type compression.
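The back end of the baseline pipeline described above, uniform quantization of transform coefficients followed by Huffman coding, can be sketched as follows. This is a simplified illustration on a plain coefficient list; the function names and the fixed quantization step are assumptions, and the transform stage itself is omitted:

```python
import heapq
from collections import Counter

def uniform_quantize(coeffs, step):
    """Map transform coefficients onto a uniform grid of bins."""
    return [round(c / step) for c in coeffs]

def huffman_code_lengths(symbols):
    """Return {symbol: code length in bits} for an optimal Huffman code."""
    freq = Counter(symbols)
    if len(freq) == 1:                 # degenerate one-symbol alphabet
        return {next(iter(freq)): 1}
    # Heap entries: (frequency, tiebreaker, {symbol: depth}); the integer
    # tiebreaker keeps the dicts from ever being compared.
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every contained symbol one level deeper.
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def compressed_size_bits(symbols):
    """Total bit count of the Huffman-coded symbol stream."""
    lengths = huffman_code_lengths(symbols)
    return sum(lengths[s] for s in symbols)
```

A coarser quantization step collapses more coefficients into the zero bin, which skews the symbol distribution and lets the entropy coder shorten the stream, at the cost of the reconstruction noise the abstract discusses.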
We present an algorithm for multichannel blind deconvolution of seismic signals that exploits the lateral continuity of earth layers via a dynamic programming approach. We assume that reflectors in consecutive channels, associated with distinct layers, form continuous paths across channels. We introduce a measure for evaluating the quality of a continuous path and iteratively apply dynamic programming to find the best continuous paths. The improved performance of the proposed algorithm and its robustness to noise, compared with a competing algorithm, are demonstrated using simulated and real seismic data examples.
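The dynamic programming step for tracking one continuous reflector path can be illustrated as below. This is a minimal sketch, assuming a per-channel score grid (e.g., reflectivity magnitudes) and a continuity constraint limiting the shift between consecutive channels; the function name, the `max_shift` parameter, and the additive path score are assumptions, not the paper's exact quality measure:

```python
def best_continuous_path(scores, max_shift=1):
    """Find the highest-scoring path across channels, where the path's
    time position may shift at most max_shift samples per channel.
    scores[ch][t] is the reflector evidence at channel ch, time sample t."""
    n_ch, n_t = len(scores), len(scores[0])
    best = [scores[0][:]]          # best[ch][t]: best cumulative score ending at t
    back = []                      # back[ch][t]: predecessor time sample
    for ch in range(1, n_ch):
        row, bk = [], []
        for t in range(n_t):
            lo, hi = max(0, t - max_shift), min(n_t - 1, t + max_shift)
            prev_t = max(range(lo, hi + 1), key=lambda p: best[-1][p])
            row.append(best[-1][prev_t] + scores[ch][t])
            bk.append(prev_t)
        best.append(row)
        back.append(bk)
    # Backtrack from the best final position to recover the path.
    t = max(range(n_t), key=lambda p: best[-1][p])
    path = [t]
    for bk in reversed(back):
        t = bk[t]
        path.append(t)
    path.reverse()
    return path, best[-1][path[-1]]
```

In the iterative scheme the abstract describes, the best path would be extracted, its samples removed or masked from the score grid, and the procedure repeated for the next layer.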