Using optimum filter theory as a starting point, we describe a method for the design of practical multi-trace seismic data processing systems. We assume the inputs to be the superposition of signal, coherent noise, and incoherent noise. The signal and coherent noise moveouts are described statistically by their probability densities. Our approach is to split the system into two stages. The first stage achieves optimum noise suppression but distorts the signal; the signal distortion is reduced in the second stage by an optimum finite-memory inverse filter. The system obtained with our method of design depends upon the form of the probability density functions. We show two examples, ghost suppression and velocity filtering. In ghost suppression we choose a model with moveouts known exactly, which corresponds to delta functions for the probability densities. In velocity filtering the signal and coherent noise moveouts are equally probable within non-overlapping ranges. The resulting system in each case is both simple and effective. In ghost suppression a simple shift and subtract cancels the coherent noise, and the signal distortion is reduced by an inverse filter. The velocity filter system consists of differentiated moving averages applied to each trace, followed by a 90° phase shift and a low-pass filter.
DESIGN OF SUB-OPTIMUM FILTER SYSTEMS FOR MULTI-TRACE SEISMIC DATA PROCESSING
I. Introduction

The use of Wiener optimum filter theory in multi-trace seismic data processing was proposed by Robinson (1954) and applied by Burg (1962). In this theory economic constraints are not imposed, and the resulting filters are costly to construct and apply. Our purpose is to describe a method which goes beyond the optimum theory, producing practical filters whose performance is excellent. We begin by selecting a basic system which is optimum for high noise levels. For actual data this system effectively suppresses the noise but badly distorts the signal. We then apply an inverse filter to reduce the signal distortion.
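The two-stage idea can be illustrated with a small numerical sketch. This is not the paper's exact design; the ghost delay `tau`, the filter length `N`, and the prewhitening factor `eps` are hypothetical choices made for illustration. Stage 1 is a shift-and-subtract operator that cancels a ghost arriving `tau` samples late but convolves the signal with (1 - z^tau); stage 2 is a finite-memory least-squares inverse of that operator, obtained from the prewhitened Toeplitz normal equations.

```python
import numpy as np

tau = 4      # assumed ghost delay in samples (hypothetical)
N = 64       # inverse-filter length, i.e. the finite memory
eps = 1e-3   # prewhitening to stabilise the normal equations

# Stage 1: shift-and-subtract operator d(t) = delta(t) - delta(t - tau).
# It cancels coherent noise delayed by tau samples, but distorts the
# signal by convolving it with (1 - z^tau).
d = np.zeros(tau + 1)
d[0], d[tau] = 1.0, -1.0

# Stage 2: least-squares inverse h of d from the normal equations
# (R + eps*I) h = g, where R is the Toeplitz autocorrelation matrix of d
# and g is the crosscorrelation of the desired spike (at lag 0) with d.
r = np.correlate(d, d, mode="full")[len(d) - 1:]   # one-sided autocorrelation
col = np.zeros(N)
col[:len(r)] = r
R = np.array([[col[abs(i - j)] for j in range(N)] for i in range(N)])
g = np.zeros(N)
g[0] = d[0]                                        # desired spike at lag 0
h = np.linalg.solve(R + eps * np.eye(N), g)

# The cascade d * h should approximate a unit spike: the inverse filter
# has largely undone the signal distortion introduced by stage 1.
combined = np.convolve(d, h)
print(combined[0])                    # near 1
print(np.max(np.abs(combined[1:])))   # small residual distortion
```

With a finite memory the inversion is necessarily approximate: the cascade is a spike of height just under one plus small residuals spaced `tau` samples apart, and lengthening `N` shrinks those residuals. The prewhitening term `eps` plays the role of the incoherent-noise level in keeping the normal equations well conditioned.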