ADVANCED MARINE TUTORIAL
Introduction

The advanced marine tutorial is provided as a production workflow from Job00 (reformat) to Job09 (Broadband finalisation using INVERSEQ and Spectral Whitening) which can be run sequentially on the supplied raw data. The main function of the tutorial is to deliver example workflows for complex seismic processing techniques, as optimised GLOBE Claritas processing flows. The testing flows for each stage are not included within the tutorial.

The advanced tutorial is aimed at experienced GLOBE Claritas users who have successfully worked through the 2D marine tutorial and have a good working knowledge of the different GLOBE Claritas applications.
Introduction (continued)

The job flows are provided for all processing stages, and the required support files for these flows are all present in the project. All processing stages have suitable production QCs built into the production flow. Stack jobs for each processing stage are included in the project. Comparisons of shot and stack datasets can be reviewed using the SEISVIEW application.
Input Dataset

Data supplied is part of line TL-001 from the Taranaki Basin, offshore New Zealand. The portion of line selected runs from shallow water (210 ms) to deep water (1615 ms) across a shelf break.

Shotpoint interval : 50 metres
Group interval : 12.5 metres
Number of channels : 240
Cable length : 3000 metres
Data length : 5500 ms
Sample rate : 4 ms
Source depth : 9 metres
Receiver depth : 10 metres
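From the acquisition parameters above, the nominal CDP spacing and fold follow from the standard 2D marine formulas. This is illustrative arithmetic only; the names below are ours, not Claritas parameters.

```python
# Illustrative arithmetic: nominal CDP interval and fold from the
# acquisition parameters listed above (standard 2D marine formulas).
n_channels = 240
group_interval_m = 12.5
shot_interval_m = 50.0

# CDP spacing is half the group interval.
cdp_interval_m = group_interval_m / 2.0
# Nominal fold = (channels x group interval) / (2 x shot interval).
nominal_fold = (n_channels * group_interval_m) / (2.0 * shot_interval_m)

print(cdp_interval_m)   # 6.25
print(nominal_fold)     # 30.0
```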
How to Restore a GLOBE Claritas Archive File

GLOBE Claritas tutorials are generally supplied as self-contained project archives, created by the Archive option under the Projects tab on the launcher. To restore a GLOBE Claritas archive (*.ca), select the Restore option on the Projects tab. The following are the key parameters from the form:

• The Input archive filename parameter defines where to read the *.ca file from.
• The Parent directory for projects parameter defines where you wish to output the restored project to.
• The Project name parameter allows you to define the name to save the project as.
• The Data directory parameter allows you to optionally define a disk area for the seismic data only.
Job 00 - Reformat

The initial job flow for the supplied production processing sequence reformats from standard SEG-Y format to the GLOBE Claritas internal format, and the following additional processing is applied:

• Application of a T-squared spherical divergence correction.
• A minimum phase conversion filter derived from the input data: a windowed portion of the data was selected in SV/XVIEW and saved out to a GLOBE Claritas dataset. This dataset was then read into the WAVELET application, stacked, and the filter derived from the stacked trace.
• A simple 2D marine geometry.
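The T-squared gain can be sketched in a few lines of NumPy: each sample is scaled by the square of its two-way time. This is a minimal stand-in for the Claritas module, not its actual implementation.

```python
import numpy as np

def t_squared_gain(trace, dt=0.004):
    """Apply a T^2 spherical divergence correction: scale each sample
    by its two-way time squared (illustrative sketch only)."""
    t = np.arange(len(trace)) * dt   # two-way time of each sample (s)
    return trace * t**2

# Toy example: a constant-amplitude trace gains quadratically with time
# (5500 ms record length at a 4 ms sample rate, as in this dataset).
trace = np.ones(1375)
gained = t_squared_gain(trace, dt=0.004)
```

In production the gain would typically be capped or combined with a velocity-dependent term; the pure T² form shown here matches the correction named in the flow.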
Job 01 - SRME

The SRME model building section of the processing flow has been split into three parts, as below:

job01a_srme_prep.job – Data preparation for SRME modelling.
job01b_srme.job – SRME modelling process.
job01c_model_prep.job – Model preparation for adaptive subtraction.
Job 01a – SRME Preparation

The aims of the SRME preparation workflow are:

• To match the shotpoint and group intervals prior to the modelling stage, i.e. interpolate the SP interval from 50 metres to 25 metres, and decimate the group interval from 12.5 metres to 25 metres.
• To extrapolate the data to as close to zero offset as possible, with regard to the nominal near offset.
• To mute off any water column noise present.
• To account for any required static corrections: apply the 100 ms SOD correction and the 13.5 ms SSD correction.
• To apply any required trace/shot edits.
Job 01b – SRME

As of GLOBE Claritas V5.6, the SRME module has been superseded by SRME2D. The implementation of the latter module is based on the concept of a Multiple Contribution Gather (MCG). The SRME2D module is a standalone module, i.e. no other process modules can be employed in the same processing flow. SRME2D allows user control of the aperture and is MPI capable.
Job 01c – MODEL Preparation

The aims of the MODEL preparation workflow are:

• To output the model so that it matches the original input dataset: shots are decimated back to the 50 metre SP interval, and channels are interpolated back up to the 12.5 metre group interval.
• To remove the extrapolated channels and output at the original offset range.
• To mute off any noise present above the first water-bottom (WB) multiple.
Job 02 – Adaptive Subtraction

The Adaptive Subtraction workflow is as follows:

• Output from job00_reformat is read using SEISREAD.
• The required SOD and SSD static corrections are applied.
• The REREAD module is used to create a PSEUDOTRACE containing the SRME model.
• Band-limited copies of the raw SRME model are created for subtracting the multiples.
• Frequency-dependent adaptive subtraction is performed by MONKSUBT, with gates from 1000 ms down to 200 ms.
• After testing, if you do not obtain a good result from the adaptive subtraction, you could try the adaptive subtraction in a different domain, e.g. convert both the data and model to common receiver gathers.
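The core idea behind adaptive subtraction can be sketched as a single-gate least-squares matching filter: solve for the filter that best shapes the SRME model to the data, then subtract the shaped model. This is a generic textbook sketch, not the gated, frequency-dependent MONKSUBT algorithm; the function name is ours.

```python
import numpy as np

def adaptive_subtract(data, model, nfilt=11):
    """Single-gate least-squares adaptive subtraction (illustrative):
    find the matching filter f minimising ||data - f * model||^2,
    then subtract the shaped model from the data."""
    n = len(data)
    m = np.pad(model, (nfilt // 2, nfilt - 1 - nfilt // 2))
    # Convolution matrix: columns are time-shifted copies of the model.
    A = np.column_stack([m[k:k + n] for k in range(nfilt)])
    f, *_ = np.linalg.lstsq(A, data, rcond=None)   # matching filter
    return data - A @ f                            # subtract shaped model

# Toy check: if the model is just a scaled copy of the data, the
# adaptive subtraction should remove essentially everything.
rng = np.random.default_rng(0)
data = rng.standard_normal(500)
residual = adaptive_subtract(data, 0.5 * data)
```

In production the filter is solved in short overlapping time gates (and per frequency band) so that it can adapt to amplitude and phase errors that vary down the record.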
Job 03 – TPDBS and Swell Noise Attenuation

Output from job02_adaptsub is read using SEISREAD:

• The OFFREG module is employed to interpolate the data from a 12.5 to a 6.25 metre group interval, to provide improved sampling for the Tau-P domain transform.
• The data is transformed into the Tau-P domain using the TAUP module; the inverse transform is performed by the TAUPINV module.
• The FDFILT module is used to apply a 7-10-80-90 Hz bandpass filter, which attenuates the swell noise.
• A deconvolution with a 480 ms operator and a 48 ms gap is applied in the Tau-P domain to attenuate multiples.
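A 7-10-80-90 Hz style bandpass of the kind applied by FDFILT can be sketched as a zero-phase frequency-domain trapezoid: full pass between 10 and 80 Hz, linear tapers on either side. This is a minimal sketch, not the FDFILT implementation.

```python
import numpy as np

def trapezoid_bandpass(trace, dt, f1, f2, f3, f4):
    """Zero-phase frequency-domain bandpass with linear tapers between
    f1-f2 (low-cut ramp) and f3-f4 (high-cut ramp)."""
    n = len(trace)
    freqs = np.fft.rfftfreq(n, dt)
    gain = np.zeros_like(freqs)
    gain[(freqs >= f2) & (freqs <= f3)] = 1.0
    up = (freqs > f1) & (freqs < f2)
    gain[up] = (freqs[up] - f1) / (f2 - f1)
    down = (freqs > f3) & (freqs < f4)
    gain[down] = (f4 - freqs[down]) / (f4 - f3)
    return np.fft.irfft(np.fft.rfft(trace) * gain, n)

# A 2 Hz "swell" sinusoid falls below the 7 Hz low cut and is removed,
# while a 40 Hz signal inside the passband is untouched (1 s window,
# 4 ms sample rate, so both sit on exact frequency bins).
t = np.arange(250) * 0.004
swell = np.sin(2 * np.pi * 2.0 * t)
signal = np.sin(2 * np.pi * 40.0 * t)
swell_out = trapezoid_bandpass(swell, 0.004, 7, 10, 80, 90)
signal_out = trapezoid_bandpass(signal, 0.004, 7, 10, 80, 90)
```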
Job 03 – TPDBS and SWAT (continued)

• To attenuate direct arrival energy and other linear noise, a Tau-P domain mute is applied.
• Data is inverse transformed back to the X-T domain for ongoing processing.
• To de-alias the shot records, a 0.45 K filter is applied to the data.
• The SP interval is then interpolated to 25 metres and the group interval decimated to 25 metres, producing 120 fold shot gathers.
Job 03a – TPDBS and SWAT using DUSWELL

The processing flow job03a_tpdbs_swat_alternative.job is supplied to showcase an alternative swell noise attenuation process:

• As an alternative to the Tau-P domain filter applied in the standard flow, this demonstrates a workflow for the wavelet-domain SWAT process DUSWELL.
• DUSWELL analyses the data within discrete windows or boxes; any data exceeding the defined thresholds are removed, and the frequencies are re-interpolated from the surrounding data.
Job 03a – TPDBS and SWAT using DUSWELL (continued)

• By creating a PSEUDOTRACE which is a copy of the input dataset, we can build a processing flow which uses aggressive thresholds for swell noise removal whilst still being able to preserve near-surface data.
• The PSEUDOMATH module is used to merge the PSEUDOTRACE containing the unfiltered data with the data after DUSWELL, along the user-specified mute time as written to the headers by SMUTE.
• DELHDR is used to remove the PSEUDOTRACE after merging.
• The remainder of the flow is the same as the standard job.
Job 04 – Receiver Deghosting

Job04 applies receiver deghosting in the shot domain using the GLOBE Claritas OCTAVE application:

• For receiver deghosting, OCTAVE is applied to common shot gathers.
• The transform range used is from -0.2 to 0.99.
• A water velocity of 1500 m/s is used.
• A nominal receiver depth of 10 metres is used.
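The physics being undone here can be sketched with the standard flat-sea ghost response at vertical incidence: the receiver ghost arrives delayed by tau = 2 x depth / v_water with reflection coefficient near -1, giving notches at multiples of 1/tau (75 Hz for a 10 m receiver, besides the 0 Hz notch). This is a simplified 1D view; OCTAVE itself works on whole gathers in a transform domain.

```python
import numpy as np

def ghost_response(freqs, depth_m, v_water=1500.0, r=-1.0):
    """Amplitude response of the receiver ghost at vertical incidence:
    |G(f)| = |1 + r * exp(-i 2 pi f tau)|, tau = 2 * depth / v_water.
    Flat sea-surface sketch only."""
    tau = 2.0 * depth_m / v_water
    return np.abs(1.0 + r * np.exp(-2j * np.pi * freqs * tau))

freqs = np.linspace(0.0, 125.0, 1001)
resp = ghost_response(freqs, depth_m=10.0)   # 10 m receiver depth
notch_hz = 1500.0 / (2.0 * 10.0)             # first non-zero notch: 75 Hz
```

Deghosting amounts to dividing this response out (with stabilisation near the notches), which is why the notch positions set by the 10 m receiver depth matter for the broadband result.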
Job 05 – Source Deghosting

The source deghost comprises three job flows:

job05_source_deghost_prep.job – Data preparation for source deghosting.
job05_source_deghost.job – Source deghosting applied.
job05_source_deghost_reorder.job – Reorder data back to the shot domain.
Job 05 – Source Deghosting

job05_source_deghost_prep.job applies the required pre-processing for the source deghosting solution:

• Reorders data from the shot domain to the common receiver domain.
• Pads common receiver gathers to a constant 120 fold.
• Applies header manipulation to update shotid/channel and sort keys as required for OCTAVE.
• Applies a suitable geometry to the receiver gathers so that the SOURCE XYs are constant for each gather.
• Interpolates from 120 fold to 240 fold to eliminate aliasing issues that would adversely affect OCTAVE.
Job 05 – Source Deghosting

job05_source_deghost.job applies source deghosting in the receiver domain using the GLOBE Claritas OCTAVE application:

• For source deghosting, OCTAVE is applied to common receiver gathers.
• The transform range used is from -0.2 to 0.99.
• A water velocity of 1500 m/s is used.
• A nominal source depth of 9 metres is used.
Job 05 – Source Deghosting

job05_source_deghost_reorder.job applies the required post-processing to the OCTAVE source deghost output data:

• Applies offset regularisation back onto the original acquired offset distribution.
• Sorts to common shot gathers and removes the padded traces.
• Applies header manipulation to update shotid/channel and sort keys to the original acquired values.
• Reapplies the 2D marine geometry for the 25 m SP interval and 25 m group interval.
Job 06 – Parabolic Radon Demultiple

Job06 applies standard parabolic Radon demultiple to attenuate remnant multiples from NMO-corrected CDP gathers:

• PRT_DEMULT is applied from twice the water-bottom time minus 30 ms.
• The transform range used is from -700 ms to +700 ms.
• Multiple energy is considered to be from 60 ms to 696 ms of moveout.
• The noise, i.e. the multiples, is subtracted.
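The moveout numbers above are easiest to read through the parabolic trajectory that a Radon transform parameterises events with: t(x) = tau0 + q * (x / x_ref)^2, where q is the residual moveout at a reference offset. Taking x_ref as the 3000 m far offset of this cable is our assumption for the sketch; the helper name is ours.

```python
def parabolic_moveout(tau0_ms, q_ms, offset_m, x_ref_m=3000.0):
    """Time along a parabolic Radon trajectory:
    t(x) = tau0 + q * (x / x_ref)^2,
    where q is the residual moveout (ms) at the reference offset.
    Illustrative sketch; x_ref = 3000 m (far offset) is an assumption."""
    return tau0_ms + q_ms * (offset_m / x_ref_m) ** 2

# An event with 60 ms of residual moveout at the far offset sits on the
# near edge of the 60-696 ms "multiple" zone and would be subtracted;
# flatter events (q below 60 ms) are treated as primaries and kept.
t_far = parabolic_moveout(1000.0, 60.0, 3000.0)
t_near = parabolic_moveout(1000.0, 60.0, 0.0)
```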
Job 07 – Pre-Stack Time Migration

Job07 uses the GLOBE Claritas IMAGE_K2T module to pre-stack time migrate the data:

• Input is the data after PRT_DEMULT.
• A smoothed RMS velocity field in GLOBE Claritas NMO format is used.
• Output is NMO-corrected PreSTM gathers.
• A maximum migration angle of 45 degrees and an aperture of 3000 metres are used.
• IMAGE_K2T uses an asymmetric migration operator, with travel times calculated from the shot and receiver locations and averaged at the CDP.
Job 08 – High Density Velocity Analysis

Job08 generates a high density NMO velocity field. With the aid of heavy preconditioning of the data, the GLOBE Claritas NMOPICK module is able, in conjunction with the tools in the ISOVELS application, to deliver an NMO velocity field which represents the geology and flattens the gathers effectively.

• Input is the data after PreSTM.
• PRT_DEMULT and FK filters are used to remove remnant multiples and linear noise, to provide clean semblance gathers to the NMOPICK module.
• Output is the NMO velocity field.
Job 08 – High Density Velocity Analysis (continued)

The creation of the NMO field by the NMOPICK module is only part of the story: the ISOVELS application needs to be used to condition this velocity field before it is used to NMO correct the data.

• Remove high and low RMS velocities (range 1480-6000 m/s). [Utils - Delete bad Vrms]
• Remove outlying interval velocities (range 1480-6000 m/s). [Utils - Delete bad Vint] This is an iterative approach and will require a number of passes until velocities outside this range are removed.
• Smooth the data spatially, using a 41-point low-pass filter only. ['Smooth' button - Smoothing method 'Filter' - Filter length '41']
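The spatial smoothing step can be pictured as a long running-mean over the velocity picks. The edge-padded moving average below is a simple stand-in for the 41-point low-pass filter in ISOVELS, whose actual filter design may differ; the function name is ours.

```python
import numpy as np

def smooth_velocities(v, npts=41):
    """Edge-padded running mean over a velocity function: a simple
    stand-in for a 41-point low-pass spatial filter (illustrative)."""
    pad = npts // 2
    vp = np.pad(np.asarray(v, dtype=float), pad, mode="edge")
    return np.convolve(vp, np.ones(npts) / npts, mode="valid")

# Noisy picks scattered around a 2000 m/s trend smooth toward the trend;
# the output has the same number of locations as the input.
rng = np.random.default_rng(1)
picks = 2000.0 + 100.0 * rng.standard_normal(200)
smoothed = smooth_velocities(picks)
```

A long smoother like this suppresses pick-to-pick jitter while leaving the regional velocity trend, which is what the gathers actually need to flatten.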
Job 09 – Broadband

Job09 reads the output from the pre-stack time migration process and applies a suitable InverseQ filter in phase and amplitude, along with user-defined spectral whitening coefficients, to finalise the broadband processing flow that began with the source and receiver deghosting flows.

• An inverse Q factor of 300 is applied, performing an amplitude and phase correction of the data pre-stack.
• Spectral whitening coefficients are applied post-stack, with the aim of producing a broad, flat amplitude spectrum.
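The amplitude side of the inverse Q correction follows the standard constant-Q absorption model: energy lost to absorption at frequency f after traveltime t is restored with a gain of exp(pi * f * t / Q). The sketch below shows that gain for Q = 300; the production filter also applies the companion phase (dispersion) correction, which is omitted here for brevity.

```python
import numpy as np

def inverse_q_gain(f_hz, t_s, q=300.0):
    """Amplitude-only inverse Q gain exp(pi * f * t / Q): the boost
    needed to undo constant-Q absorption at frequency f after
    traveltime t. Illustrative sketch; phase correction omitted."""
    return np.exp(np.pi * f_hz * t_s / q)

# e.g. at 60 Hz and 3 s two-way time with Q = 300 the required
# amplitude boost is roughly a factor of 6.6.
gain = inverse_q_gain(60.0, 3.0)
```

The exponential growth with f * t is why inverse Q filters are always stabilised (amplitude-limited) in practice: uncapped, they would blow up high-frequency noise late in the record.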