Acquisition footprint suppression on 3D land surveys

Authors: N. Gulunay, N. Benjamin and M. Magesan
Journal name: First Break
Issue: Vol 24, No 2, February 2006
Language: English

Necati Gulunay, Nigel Benjamin, and Mag Magesan of CGG discuss the acquisition geometry footprint that occurs on most 3D land surveys and a robust method to attenuate it. Acquisition geometry footprint on 3D land surveys manifests itself as striping on time slices of stack volumes, with inline and crossline periods that coincide with the source and receiver line intervals of the acquisition geometry. Hill et al. (1999) give a detailed description and analysis of acquisition footprint. DMO volumes are also known to exhibit such artefacts. Schleicher et al. (1989) and Ronen (1994) address DMO artefacts caused by irregularities in the fold pattern. These artefacts on stack volumes (or DMO volumes) are sometimes described as ‘hatching’ or ‘chatter’, and they hinder accurate interpretation.

This phenomenon is similar to the periodic artefacts observed along the inline (CDP) direction of 2D lines, where the patterns arise because the offset distribution within CDPs shows a periodicity proportional to the ratio of the source interval to the receiver interval. In 2D surveys, shot and receiver geometry and field arrays can be combined via the ‘stack array’ principle (Anstey, 1986; Morse et al., 1989) to attenuate such artefacts. It is well known that such artefacts in 2D can also be reduced by alias-handling trace interpolation, as the source of the problem is the steeply dipping (aliased) energy created by sparse recording geometries.

In 3D land surveys, owing to large source and receiver line intervals, and to the quest for small source and receiver arrays, it is not possible to apply the ‘stack array’ principle. Any multichannel process, whether stack, DMO, or migration, is prone to such artefacts. As interpolation is difficult or costly, other means to attenuate these artefacts are required.
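The periodic striping described above can be made concrete with a small synthetic sketch: because the stripes repeat at the source/receiver line interval, their energy concentrates at discrete spatial wavenumbers of magnitude 1/(line interval) on a time slice, where a narrow notch can remove them. The grid size, bin spacing, and 400 m line interval below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Synthetic time slice: a smooth "geology" signal plus periodic striping
# whose inline/crossline period mimics a hypothetical 400 m line interval.
nx, ny = 128, 128
dx = 25.0                          # bin size in metres (assumed)
x = np.arange(nx) * dx
y = np.arange(ny) * dx
X, Y = np.meshgrid(x, y, indexing="ij")

line_interval = 400.0              # assumed source/receiver line spacing (m)
geology = np.sin(2 * np.pi * X / 1600.0) * np.cos(2 * np.pi * Y / 800.0)
footprint = 0.5 * (np.cos(2 * np.pi * X / line_interval)
                   + np.cos(2 * np.pi * Y / line_interval))
tslice = geology + footprint

# In the 2D wavenumber domain the footprint shows up as discrete peaks at
# (+/-1/line_interval, 0) and (0, +/-1/line_interval).
spec = np.fft.fftshift(np.fft.fft2(tslice))
kx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))
ky = np.fft.fftshift(np.fft.fftfreq(ny, d=dx))
KX, KY = np.meshgrid(kx, ky, indexing="ij")
k_foot = 1.0 / line_interval       # expected footprint wavenumber (cycles/m)

# Notch: zero the bins at the footprint wavenumbers on each axis.
tol = 2e-4                         # tighter than the wavenumber bin spacing
notch = ((np.abs(np.abs(KX) - k_foot) < tol) & (np.abs(KY) < tol)) | \
        ((np.abs(np.abs(KY) - k_foot) < tol) & (np.abs(KX) < tol))
filtered = np.real(np.fft.ifft2(np.fft.ifftshift(spec * ~notch)))

# The striping is removed; the smooth signal passes through untouched.
print(np.abs(tslice - geology).max())    # footprint amplitude, ~1.0
print(np.abs(filtered - geology).max())  # residual after notching, ~0
```

The example is deliberately easy: the stripe period divides the grid exactly, so all footprint energy falls in the notched bins. Real footprint varies with time and dip, which is what motivates the data-adaptive methods reviewed below.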
The first published observations known to us of acquisition footprint problems on 3D volumes, and a deterministic solution for their attenuation, were made by Meunier et al. (1992) and Baixas et al. (1993). Shortly afterwards Gulunay et al. (1994) suggested a data-adaptive frequency-wavenumber domain notch filtering method that worked well for data with generally flat events. Gulunay (1999, 2000) later extended the method to dipping data by observing that the artefact shapes are frequency-invariant but that their location varies with frequency and follows the dip of each dominant event. Working in the frequency domain is not the only tool: Drummond et al. (2000) suggested the use of time slices and wavenumber domain deterministic notch filtering on those slices. However, because their time slice based filtering method was not data-adaptive and noise patterns can vary with time, they also suggested the use of adaptive subtraction of the noise. Geometry-driven deterministic time slice filtering of 3D data for footprint attenuation was later proposed by Soubaras (2002). Karagul et al. (2004) showed interesting results on a data set with complex structures using the Soubaras approach. Most recently, Al-Bannagi et al. (2004) proposed time slice singular value decomposition (SVD) filtering, where footprint attenuation and random noise attenuation are performed in one step by selecting certain singular values. Given these prior publications, one might naturally wonder about the pros and cons of working in time slices versus frequency slices, as the two are not necessarily equivalent even when all frequencies are filtered in the frequency-domain method. One might also wonder whether data-adaptive methods like those of Gulunay (1999, 2000) can handle complex field patterns, or whether deterministic filters must be designed for each separate study. Leaving the first question to future studies, Gulunay et al. (2005) have developed a frequency slice wavenumber notch filtering method, named FKF3D, with extensive user controls and some important QC capabilities, and have tested it on various data sets from the Middle East. In this paper we give an introduction to the underlying principles of the method and show some results on Middle East field data.
