We have devised a new filtering technique for random and coherent noise attenuation in seismic data by applying empirical mode decomposition (EMD) on constant-frequency slices in the frequency-offset (f-x) domain and removing the first intrinsic mode function. The motivation behind this development is to overcome the potential low performance of f-x deconvolution for signal-to-noise enhancement when processing highly complex geologic sections, data acquired using irregular trace spacing, and/or data contaminated with steeply dipping coherent noise. The resulting f-x EMD method is equivalent to an autoadaptive f-k filter with a frequency-dependent, high-wavenumber cut filtering property. Removing both random and steeply dipping coherent noise in either prestack or stacked/migrated sections is useful and compares well with other noise-reduction methods, such as f-x deconvolution, median filtering, and local singular value decomposition. In its simplest implementation, f-x EMD is parameter-free and can be applied to entire data sets without user interaction.
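The pipeline described above can be sketched as follows. This is a minimal numpy/scipy illustration, not the authors' implementation: the basic sifting loop (with spline envelopes pinned at the slice ends) is a simplified stand-in for full EMD, and all function names are hypothetical.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def first_imf(x, n_sifts=8):
    """First intrinsic mode function of a 1-D signal via basic sifting
    with cubic-spline envelopes (a minimal stand-in for full EMD)."""
    h = np.asarray(x, dtype=float).copy()
    n = np.arange(len(h))
    for _ in range(n_sifts):
        d = np.diff(h)
        maxima = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1
        minima = np.where((d[:-1] < 0) & (d[1:] >= 0))[0] + 1
        if len(maxima) < 2 or len(minima) < 2:
            return np.zeros_like(h)  # near-monotone slice: nothing to remove
        # pin the envelopes at both ends to limit spline overshoot
        kmax = np.r_[0, maxima, len(h) - 1]
        kmin = np.r_[0, minima, len(h) - 1]
        upper = CubicSpline(kmax, h[kmax])(n)
        lower = CubicSpline(kmin, h[kmin])(n)
        h = h - 0.5 * (upper + lower)  # subtract the local mean envelope
    return h

def fx_emd(section):
    """f-x EMD denoise: FFT each trace over time, remove the first IMF of
    every constant-frequency slice taken along offset, then inverse FFT."""
    nt, _ = section.shape
    spec = np.fft.rfft(section, axis=0)        # t-x  ->  f-x
    for k in range(spec.shape[0]):
        sl = spec[k]                           # complex slice across offset
        spec[k] = sl - (first_imf(sl.real) + 1j * first_imf(sl.imag))
    return np.fft.irfft(spec, n=nt, axis=0)    # f-x  ->  t-x
```

Because the first IMF captures the most rapidly oscillating component along offset at each frequency, subtracting it acts as the frequency-dependent high-wavenumber cut the abstract describes, with no user parameters beyond the sift count.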
Singular value decomposition (SVD) is a coherency-based technique that provides both signal enhancement and noise suppression. It has been implemented in a variety of seismic applications, mostly on a global scale. In this paper, we use SVD to improve the signal-to-noise ratio of unstacked and stacked seismic sections, but apply it locally to cope with coherent events that vary with both time and offset. The local SVD technique is compared with f-x deconvolution and median filtering on a set of synthetic and real-data sections. Local SVD is better than f-x deconvolution and median filtering in removing background noise, but it performs less well in enhancing weak events or events with conflicting dips. Combining f-x deconvolution or median filtering with local SVD overcomes the main weaknesses associated with each individual method and leads to the best results.
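The idea of applying SVD locally rather than globally can be illustrated with a short sketch: rank truncation on half-overlapping time-offset windows, with overlapping estimates averaged. The window sizes, hop, and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rank_truncate(tile, rank=1):
    """Keep only the strongest `rank` singular components of one window,
    retaining the locally coherent energy and discarding the rest."""
    U, s, Vt = np.linalg.svd(tile, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

def local_svd(section, win=(16, 8), rank=1):
    """Rank truncation in half-overlapping (time, offset) windows, averaging
    the overlaps (dimensions assumed compatible with the hop)."""
    nt, nx = section.shape
    wt, wx = win
    out = np.zeros((nt, nx))
    cnt = np.zeros((nt, nx))
    for i in range(0, nt - wt + 1, wt // 2):
        for j in range(0, nx - wx + 1, wx // 2):
            out[i:i+wt, j:j+wx] += rank_truncate(section[i:i+wt, j:j+wx], rank)
            cnt[i:i+wt, j:j+wx] += 1
    return out / np.maximum(cnt, 1)
```

The local windowing is what lets the filter track events whose dip changes with time and offset: each small tile sees an approximately linear event, which a rank-1 truncation preserves, whereas a single global SVD would only capture one dominant dip for the whole section.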
The Kullback information criterion (KIC) is a recently developed tool for statistical model selection [1]. KIC serves as an asymptotically unbiased estimator of a variant of the Kullback symmetric divergence, also known as the J-divergence. In this paper, a bias correction of the Kullback symmetric information criterion is derived for linear models. The correction is of particular use when the sample size is small or when the number of fitted parameters is a moderate to large fraction of the sample size. For linear regression models, the corrected method, called KICc, is an exactly unbiased estimator of a variant of the Kullback symmetric divergence between the true unknown model and the candidate fitted model. Furthermore, KICc is found to provide better model-order choices than other asymptotically efficient methods in an application to autoregressive time series models. Being a sum of two directed divergence measures, it functions as a gauge of model disparity that is arguably more sensitive than either of its individual components. Following this reasoning, Cavanaugh [1] proposed the Kullback information criterion KIC as an asymptotically unbiased estimate of a variant (within a constant) of the J-divergence between the true unknown model and the fitted approximating model. Motivated by these developments, we propose a bias-corrected version of KIC for linear regression models. The new criterion is shown to outperform classical criteria in small-sample autoregressive modeling. The remainder of this paper is organized as follows. Section 2 presents a short overview of Kullback's directed divergence, AIC, its corrected version AICc, and KIC. Section 3 introduces the bias-corrected version of KIC. Section 4 presents simulation results for autoregressive model selection. We end with concluding remarks.
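The small-sample correction term that defines KICc is derived in the paper itself; as a sketch of the selection procedure it improves on, here is autoregressive order selection with Cavanaugh's original KIC, which for a Gaussian least-squares fit reduces to n log(RSS/n) + 3k with k = p + 1 fitted parameters. The helper names and the simple conditional least-squares AR fit are illustrative assumptions.

```python
import numpy as np

def fit_ar_rss(x, p):
    """Conditional least-squares fit of a zero-mean AR(p) model.
    Returns the residual sum of squares and the effective sample size."""
    n = len(x)
    if p == 0:
        return float(np.sum(x ** 2)), n        # white-noise model
    y = x[p:]
    X = np.column_stack([x[p - i - 1: n - i - 1] for i in range(p)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return float(np.sum(resid ** 2)), len(y)

def kic(x, p):
    """Cavanaugh's KIC for an AR(p) fit: m*log(RSS/m) + 3*(p + 1)."""
    rss, m = fit_ar_rss(x, p)
    return m * np.log(rss / m) + 3 * (p + 1)

def select_order(x, max_p=5):
    """Pick the AR order minimizing KIC over 0..max_p."""
    scores = {p: kic(x, p) for p in range(max_p + 1)}
    return min(scores, key=scores.get), scores
```

The asymmetry the abstract points to shows up in the penalty: KIC charges 3 per parameter where AIC charges 2, and KICc further replaces this fixed penalty with an exact finite-sample term, which is what improves order selection when n is small relative to p.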