We have developed an improved Levenberg–Marquardt technique to rapidly invert Bouguer gravity data for a 3-D density distribution as a source of the observed field. This technique is designed to replace tedious forward modeling with an automatic solver that determines density models constrained by geologic information supplied by the user. Where such information is not available, objective models are generated. The technique estimates the density distribution within the source volume using a least-squares inverse solution obtained iteratively by singular value decomposition, computed via orthogonal matrix decomposition with sequential Householder transformations. The source volume is subdivided into a series of right rectangular prisms of specified size but of unknown density. This discretization allows the construction of a system of linear equations relating the observed gravity field to the unknown density distribution. Convergence of the solution to the system is tightly controlled by a damping parameter, which may be varied at each iteration. The associated algorithm generates statistical measures of solution quality not available with most forward methods. Along with the ability to handle large data sets within reasonable time constraints, the advantages of this approach are: (1) the ease with which pre-existing geological information can be included to constrain the solution, (2) its minimization of subjective user input, (3) the avoidance of difficulties encountered during wavenumber-domain transformations, and (4) the objective nature of the solution. Application to a gravity data set from Hamilton County, Indiana, has yielded a geologically reasonable result that agrees with published models derived from interpretation of gravity, magnetic, seismic, and drilling data.
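The core of such a damped least-squares inversion can be sketched compactly. The snippet below is an illustrative reduction, not the authors' code: it assumes a precomputed kernel matrix `G` relating prism densities to gravity observations, and applies the Marquardt-style damping filter to the singular values, so that the damping parameter can be varied between iterations as described above.

```python
import numpy as np

def damped_svd_solve(G, d, damping):
    """Damped least-squares solution of G m = d via SVD.

    G       : (n_obs, n_prisms) kernel linking prism densities to gravity
    d       : (n_obs,) observed Bouguer anomaly values
    damping : Marquardt damping parameter (larger -> more stable, slower)
    """
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    # Damped inverse singular values: s / (s^2 + damping^2).
    # With damping = 0 this reduces to the ordinary least-squares inverse.
    filt = s / (s**2 + damping**2)
    return Vt.T @ (filt * (U.T @ d))

# Toy example: 2 observation points, 2 prisms (hypothetical kernel values)
G = np.array([[1.0, 0.3],
              [0.3, 1.0]])
d = np.array([2.0, 1.0])
m = damped_svd_solve(G, d, damping=0.0)  # undamped -> exact least squares
```

Increasing `damping` trades resolution for stability by shrinking the contribution of small singular values, which is what keeps convergence "tightly controlled" from one iteration to the next.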
The goal of a seismic survey is to illuminate subsurface geologic formations that may hold hydrocarbon accumulations. Conventional seismic survey design relies on the assumption that uniform midpoint coverage will lead to uniform illumination in the subsurface as long as each midpoint is hit by a sufficient range of offsets. In areas of complex velocity structure, severe wavefield distortions lead to irregular subsurface illumination patterns, even if surface midpoint maps show a uniform distribution. A more appropriate approach is to design seismic surveys to ensure illumination of key subsurface horizons.

The difference between midpoint coverage and subsurface illumination patterns is particularly large in salt-prone areas (Muerdter et al., 1997). Owing to severe wavefield distortion through complex, high-velocity salt bodies, conventional design methods that produce relatively uniform surface coverage (Figure 1) generate uneven amplitudes and shadow zones on subsalt horizons, an effect shown clearly by ray-trace modeling of an entire seismic survey (Figure 2).

Methodology. We begin by building a model of the known geology. Typically, this model is based on the velocity model used for migration or on previously available seismic interpretations, and it includes any horizons of particular exploration interest. We can then predict the illumination and amplitude distributions that should be achieved on any horizon from a given survey by full-offset ray tracing of the survey coordinates (either planned coordinates or actual coordinates from navigation data obtained during the collection of a survey).
If the modeled illumination and amplitude maps are to be compared with similar data extracted from processed seismic, it is important to ray trace only those shots and receivers that were used in the processing (i.e., the trace coordinates to be ray traced should be extracted during the processing flow).

Ray-tracing results must next be reduced in a manner consistent with the seismic processing used for the survey. In particular, the subsurface bins into which modeled hits or amplitudes will be accumulated need to be equivalent to the subsurface bins used in processing. Similarly, the offset range of the ray-traced results should be limited to the range used in processing. Also, it is critical that multiple ray-tracing hits within each offset bin be handled as they will be in seismic processing. Finally, any gain treatments that will be applied during processing need also to be applied to the ray-tracing results.

Fresnel-zone smoothing effects may be introduced to improve the match between ray-tracing results and seismic data. This may be accomplished by spreading each hit over adjacent subsurface bins that are within the first Fresnel zone of the reflection point (Schneider and Winbow, 1999). Fresnel-zone smoothing often has the side effect of masking the acquisition overprint. Proper reduction of the ray-tracing results leads to amplitude maps that approximate the result of prestack depth migration. Iterative ray tracing ...
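The reduction steps above (subsurface binning, offset limiting, and duplicate-hit handling) can be sketched in a few lines. Everything here is a simplified stand-in: the bin geometry, the offset-class width, and the "first hit wins" rule are assumptions for illustration, since the actual choices must mirror the survey's real processing flow.

```python
from collections import defaultdict

def bin_ray_hits(hits, bin_size, offset_min, offset_max, offset_bin=100.0):
    """Reduce ray-traced reflection hits into subsurface amplitude bins.

    hits       : iterable of (x, y, offset, amplitude) tuples
    bin_size   : subsurface bin edge length (square bins assumed here)
    offset_*   : keep only offsets within the range used in processing
    offset_bin : width of the offset classes within each subsurface bin

    Within one subsurface bin, at most one hit per offset class is kept
    (first one wins) -- a placeholder for however the real processing
    flow treats duplicate hits.
    """
    amp = defaultdict(float)   # (ix, iy) -> accumulated amplitude
    seen = set()               # (bin, offset class) pairs already filled
    for x, y, offset, a in hits:
        if not (offset_min <= offset <= offset_max):
            continue           # offset outside the processed range
        key = (int(x // bin_size), int(y // bin_size))
        oclass = int(offset // offset_bin)
        if (key, oclass) in seen:
            continue           # duplicate hit in this offset class
        seen.add((key, oclass))
        amp[key] += a
    return dict(amp)

# Tiny synthetic example
hits = [(10.0, 5.0, 150.0, 1.0),   # kept
        (12.0, 6.0, 160.0, 0.5),   # same bin & offset class -> dropped
        (12.0, 6.0, 260.0, 0.5),   # same bin, new offset class -> kept
        (40.0, 5.0, 900.0, 2.0)]   # offset out of range -> dropped
amps = bin_ray_hits(hits, bin_size=25.0, offset_min=100.0, offset_max=600.0)
```

Fresnel-zone smoothing would then be a second pass that spreads each accumulated amplitude over neighboring bins within the first Fresnel zone, as described above.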
We operated an eleven-station network of digital instruments in the Wabash Valley region from November 1995 through June 1996 in order to investigate seismic activity in the Wabash Valley seismic zone. One station of the network was a ten-element, three-component, high-frequency, phased array. The array was primarily responsible for lowering the detection threshold by approximately 1.5 magnitude units below that achieved previously, to magnitudes of 1.2 to 1.5. We observe a significant excess of events in the region relative to that expected by extrapolation of the historical and earlier instrumental catalogs. We show that the excess is related to a cluster of earthquakes near New Harmony, Indiana. We argue that their shallow depth, similarity of waveform characteristics, and proximity to producing oil wells and underground coal mines suggest that these small-magnitude events may be artificially induced. We find that discarding the events from this cluster leads to seismicity rates more consistent with previous data.
We demonstrate that a workflow combining emergent time-lapse full-waveform inversion (FWI) and machine learning technologies can address the demand for faster time-lapse processing and analysis. During the first stage of our proposed workflow, we invert long-wavelength velocity changes using a tomographically enhanced version of multiparameter simultaneous reflection FWI with model-difference regularization. Short-wavelength changes are inverted during the second stage of the workflow by a specialized high-resolution image-difference tomography algorithm using a neural network. We discuss application areas for each component of the workflow and show the results of a West Africa case study.
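The model-difference regularization used in the first stage is conventionally written as a penalized data misfit; the notation below is ours (the abstract does not spell out the objective) and is meant only to show the idea:

```latex
\min_{m}\; \tfrac{1}{2}\,\lVert d_{\mathrm{mon}} - F(m) \rVert_2^2
          \;+\; \tfrac{\lambda}{2}\,\lVert m - m_{\mathrm{base}} \rVert_2^2
```

where \(d_{\mathrm{mon}}\) is the monitor-survey data, \(F\) the forward-modeling operator, \(m_{\mathrm{base}}\) the baseline velocity model, and \(\lambda\) controls how strongly the monitor model is held near the baseline, so that only data-driven time-lapse changes survive the inversion.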