Reduce processing time and increase confidence with data customization.
Kurt Eggenberger, Philip Christie, Dirk-Jan van Manen, and Massimiliano Vassallo, Schlumberger
Published: 02/01/2014
Effective time-lapse, or 4D, processing seeks to extract the 4D signature with minimal background noise in order to understand reservoir behavior during the production phase and to optimize producer and injector well placement. Nonrepeatability of source and sensor positions in the presence of overburden heterogeneity limits our ability to repeat seismic data and to deliver 4D signals with low background noise levels. To compensate for imperfect receiver repositioning, wavefield-regularization techniques can enable wavefield matching on a grid common to both data vintages. Often there is a tradeoff between reducing the receiver-positioning mismatch and the crossline interpolation error arising from wavefield spatial aliasing. Through the use of towed multisensor marine seismic streamers that record collocated pressure (P) and acceleration measurements (Y and Z), and the application of a matching-pursuit interpolation technique, a 3D dealiased, reconstructed, and deghosted pressure wavefield can be obtained on a densely sampled grid that permits highly effective wavefield matching to prior positions. From analysis of field multisensor streamer data, the implications for time-lapse seismic and repeatability are investigated.
The noise floor for marine time-lapse seismic acquisition is usually determined more by nonrepeatable recording of coherent wavefronts from one survey to the next than by ambient noise. This arises partly from variation in physical parameters such as water velocity or water-column thickness, which affect the repeatability of multiples, and partly from the inability to repeat the acquisition geometry with sufficient accuracy.
Usually, knowledge of where the sources and streamers were at a given shot instance is much more precise than the ability to control where they should have been. In other words, holding a predefined course for streamers and sources remains operationally challenging even for systems with steerable devices (Brown and Paulsen, 2011). Shot locations are normally easier to repeat than sensor locations, even with steerable streamers. Four-dimensional processing therefore needs to include interpolation onto a common grid.
However, conventional data typically are sampled inadequately in the crossline direction, resulting in the leakage of aliased energy. It is now widely recognized that repeating source and receiver positions from the baseline survey is of utmost importance to minimize time-lapse background noise.
Landrø (1999) also finds, from analysis of a variogram of trace-to-trace differences against source-separation distance in a 3D VSP common-receiver gather, that overburden heterogeneity has a significant impact on repeatability. Landrø (1999) asserts that for a homogeneous earth, shot location should not matter, and recorded traces should be the same after allowing for different path lengths. The fact that there were differences between traces, that the differences were a function of shot separation, and that the variance correlated spatially with shallow overburden heterogeneity supported the inference that overburden heterogeneity limits the ability to repeat time-lapse seismic data and to deliver 4D signals with low background noise levels. Calvert (2005), Smit et al. (2005), Misaghi et al. (2007), and Naess et al. (2007) discuss this issue further in detail.
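As a rough illustration of this kind of variogram analysis, the following Python sketch bins squared trace-to-trace differences in a common-receiver gather by source-separation distance. The array names, binning, and normalization are illustrative assumptions, not details taken from Landrø (1999).

```python
# Hypothetical sketch of a repeatability variogram in the spirit of
# Landro (1999): bin squared trace-to-trace differences by source
# separation in a common-receiver gather.
import numpy as np

def repeatability_variogram(traces, src_xy, bin_edges):
    """traces   : (n_shots, n_samples) common-receiver gather
       src_xy   : (n_shots, 2) source coordinates in metres
       bin_edges: 1D array of separation-distance bin edges in metres"""
    n = traces.shape[0]
    sums = np.zeros(len(bin_edges) - 1)
    counts = np.zeros(len(bin_edges) - 1, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            sep = np.linalg.norm(src_xy[i] - src_xy[j])
            k = np.searchsorted(bin_edges, sep) - 1
            if 0 <= k < len(sums):
                # mean squared difference between the two traces
                sums[k] += np.mean((traces[i] - traces[j]) ** 2)
                counts[k] += 1
    # semivariance per separation bin; NaN where a bin is empty
    return np.where(counts > 0, 0.5 * sums / np.maximum(counts, 1), np.nan)
```

For a homogeneous earth (after path-length corrections), such a curve would be essentially flat; an increase with shot separation that correlates spatially with shallow heterogeneity supports the inference drawn above.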
Consequently, matching source-receiver pairs across different seismic data vintages becomes critical. This can require oversampled acquisition and/or wavefield interpolation to obtain wavefields at corresponding locations between two or more surveys.
Robertsson et al. (2008) introduce the concept of a multimeasurement or multisensor streamer that would measure not only scalar pressure wavefields but also vector wavefields of particle motion such as velocity or acceleration. Based on those additional measurements, Özbek et al. (2010) outline the theory for a signal-processing technique called generalized matching pursuit (GMP) that can realize joint wavefield reconstruction and deghosting in a 3D sense by finding basis functions that model simultaneously the recorded pressure wavefield and as many gradients as might be available. The spatially continuous basis functions allow wavefield reconstruction at any point within the streamer aperture up to and beyond twice the corresponding pressure-only crossline Nyquist wavenumber.
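The following much-simplified sketch, for a single temporal-frequency slice in one spatial dimension, illustrates the underlying idea of fitting basis functions jointly to pressure and gradient data; it is not the GMP algorithm of Özbek et al. (2010), and the function names, weighting, and stopping rule are illustrative assumptions.

```python
# Schematic joint (pressure + gradient) matching-pursuit interpolation.
# A collocated gradient measurement disambiguates wavenumbers that alias
# on the coarse crossline pressure sampling.
import numpy as np

def joint_mp_interpolate(y, p, dpdy, k_grid, n_iter=20, w=1.0):
    """y     : (n,) coarse crossline sensor positions [m]
       p     : (n,) complex pressure samples at one frequency
       dpdy  : (n,) complex crossline pressure-gradient samples
       k_grid: candidate wavenumbers [rad/m]; may exceed the pressure-only Nyquist
       w     : relative weight of the gradient data
       Returns a list of picked (wavenumber, amplitude) pairs."""
    rp, rg = p.astype(complex), dpdy.astype(complex)
    picks = []
    for _ in range(n_iter):
        best = None
        for k in k_grid:
            bp = np.exp(1j * k * y)           # basis function sampled as pressure
            bg = 1j * k * bp                  # its analytic crossline gradient
            num = np.vdot(bp, rp) + w * np.vdot(bg, rg)
            den = np.vdot(bp, bp) + w * np.vdot(bg, bg)
            a = num / den                     # least-squares amplitude for this k
            gain = (abs(a) ** 2) * den.real   # energy explained in both data sets
            if best is None or gain > best[0]:
                best = (gain, k, a, bp, bg)
        _, k, a, bp, bg = best
        rp = rp - a * bp                      # update residuals
        rg = rg - a * bg
        picks.append((k, a))
    return picks

def reconstruct(picks, y_dense):
    """Evaluate the picked basis functions on a dense output grid."""
    return sum(a * np.exp(1j * k * y_dense) for k, a in picks)
```

Because the basis functions are spatially continuous, reconstruction on the dense grid is not tied to the original sensor positions, which is the property exploited for wavefield matching between vintages.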
The goal of this paper is to quantify the GMP-based wavefield-reconstruction fidelity on pre- and poststack data in different domains, using real data acquired during repeated passes in the North Sea. For quantification purposes, a witness (or benchmark) streamer recording the total pressure wavefield was employed as an actual measurement against which the wavefield-reconstruction quality was assessed. As a state-of-the-art pressure-only reconstruction benchmark, the interpolation-by-matching-pursuit (IMAP) algorithm (Özdemir et al., 2008; Özbek et al., 2009), which also benefits from priors (Özbek et al., 2012), is used. The time-domain repeatability metrics used were normalized root-mean-square error (NRMS) and predictability (PRED), as defined by Kragh and Christie (2002), and their frequency-domain equivalents. This procedure also helps to separate the question of reconstruction fidelity from that of repeatability.
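For reference, the time-domain metrics of Kragh and Christie (2002) can be sketched in a few lines of Python; the trace windowing and tapering applied in a production workflow are omitted here.

```python
# Minimal sketch of the repeatability metrics of Kragh and Christie (2002),
# computed over a common time window of two traces a and b.
import numpy as np

def nrms(a, b):
    """NRMS (%) = 200 * RMS(a - b) / (RMS(a) + RMS(b))."""
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 200.0 * rms(a - b) / (rms(a) + rms(b))

def predictability(a, b):
    """PRED (%) = 100 * sum(xcorr_ab^2) / sum(autocorr_aa * autocorr_bb)."""
    phi_ab = np.correlate(a, b, mode="full")
    phi_aa = np.correlate(a, a, mode="full")
    phi_bb = np.correlate(b, b, mode="full")
    return 100.0 * np.sum(phi_ab ** 2) / np.sum(phi_aa * phi_bb)
```

Identical traces give NRMS = 0% and PRED = 100%; NRMS is sensitive to both amplitude and phase differences, whereas PRED is insensitive to static amplitude scaling.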