This tutorial provides example workflows for complex seismic processing techniques in Claritas, aimed at experienced users. Learn how to restore Claritas archives and run production sequences from Job00 to Job06.
Introduction

This Advanced Marine tutorial is provided as a production workflow from Job00 (reformat) to Job06 (high-density velocity analysis), which can be run sequentially on the supplied raw data. However, the main purpose of the tutorial is to deliver example workflows for complex seismic processing techniques, presented as optimised Claritas processing flows. The testing flows for each stage are not included in the tutorial. The Advanced tutorial is aimed at experienced Claritas users who have successfully worked through the 2D Marine tutorial and have a good working knowledge of the different Claritas applications.
Introduction (continued)

The following additional job flows are provided:
• A stack job for every processing stage.
• Shot and stack comparison job flows.

All the support files required by the supplied processing flows are present in the project, and all processing flows have suitable QCs built into the production flow.
Input Dataset

The data supplied is part of line TL-001 from the Taranaki Basin, offshore New Zealand. The portion of line selected runs from shallow water (210 ms) to deep water (1615 ms) across a shelf break.

Shotpoint interval: 50 metres
Group interval: 12.5 metres
Number of channels: 240
Cable length: 3000 metres
Data length: 5500 ms
Sample rate: 4 ms
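As a quick cross-check on these acquisition parameters, the CDP spacing and nominal fold they imply can be computed directly. A minimal Python calculation, assuming a standard end-on marine geometry:

# Nominal CMP fold and CDP spacing implied by the acquisition parameters above.
group_interval = 12.5        # metres
shot_interval = 50.0         # metres
n_channels = 240

cdp_interval = group_interval / 2.0                          # 6.25 m
fold = n_channels * group_interval / (2.0 * shot_interval)   # 30-fold

print(f"CDP interval: {cdp_interval} m, nominal fold: {fold:.0f}")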
How to Restore a Claritas Archive File

Claritas tutorials are generally supplied as self-contained project archives, created by the ARCHIVE option under the launcher's PROJECT tab. To restore a Claritas archive (*.ca), select the RESTORE option on the PROJECT tab. The key parameters on the form are:
• The Project Parent Directory parameter defines where the restored project is output.
• The Archive filename…to unpack parameter defines where to read the *.ca file from.
• The Project Name parameter defines the name the project is saved as.
• The Data Parent directory parameter optionally defines a separate disk area for the seismic data only.
Job 00 - Reformat

The initial job flow in the supplied production processing sequence reformats the data from standard SEG-Y format to Claritas internal format. The following additional processing is applied:
• Application of a T-squared spherical divergence correction.
• A 2D marine geometry.
• A minimum-phase conversion filter derived from the input data: a windowed portion of the data was selected in SV/XVIEW and saved out to a Claritas dataset. This dataset was then read into the WAVELET application, stacked, and the filter derived from the stacked trace.
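The T-squared correction itself is a simple time-dependent amplitude scaling. A minimal Python sketch of the idea (illustrative only, not the Claritas implementation; the array sizes follow the dataset parameters above):

import numpy as np

def t_squared_gain(traces: np.ndarray, dt: float) -> np.ndarray:
    """Apply a t^2 spherical-divergence gain to an array of traces.

    traces : (n_traces, n_samples) array
    dt     : sample interval in seconds (4 ms for this dataset)
    """
    n_samples = traces.shape[-1]
    t = np.arange(n_samples) * dt      # two-way time of each sample
    return traces * t**2               # scale amplitudes by t squared

# Example: a 240-channel shot record, 5500 ms at 4 ms sampling.
shot = np.random.randn(240, 1376)
gained = t_squared_gain(shot, dt=0.004)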
Job 01 - SRME

The SRME model-building section of the processing flow has been split into three parts:
job01a_srme_prep_01.job – data preparation for SRME modelling.
job01b_srme_01.job – SRME modelling process.
job01c_model_prep_01.job – model preparation for adaptive subtraction.
Job 01a – SRME Preparation

The aims of the SRME preparation workflow are:
• To match the shotpoint and group intervals prior to the modelling stage: the SP interval is interpolated to 25 metres and the group interval decimated to 25 metres.
• To mute off any water-column noise present.
• To extrapolate the data to as close to zero offset as possible, with regard to the nominal near offset.
• To account for any required static corrections: a 100 ms SOD correction and a 13 ms SSD correction are applied (see the sketch below).
• To apply any required trace/shot edits.
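The SOD and SSD corrections amount to a bulk time shift of the traces. A minimal Python sketch of such a static shift (the sign convention and zero-padding are assumptions; the Claritas module applying the static defines the actual convention):

import numpy as np

def apply_static(traces: np.ndarray, shift_ms: float, dt_ms: float = 4.0) -> np.ndarray:
    """Shift traces by a bulk static, padding with zeros.

    Positive shift_ms moves samples to later times.
    """
    n = int(round(shift_ms / dt_ms))
    out = np.zeros_like(traces)
    if n >= 0:
        out[:, n:] = traces[:, :traces.shape[1] - n]
    else:
        out[:, :n] = traces[:, -n:]
    return out

# Combined 100 ms SOD + 13 ms SSD correction, applied as one shift.
shot = np.random.randn(240, 1376)
corrected = apply_static(shot, shift_ms=113.0)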
Job 01b – SRME

The Claritas SRME module controls data input and manages passing the data on to additional processes in the data flow; a Claritas output module such as DISCWRITE is required in the flow. Input is via the SHOTFILE and LAST_MULT parameters in the SRME module; these will generally be the output from job01a_srme_prep_01.job. N.B. Future releases of Claritas (V5.6 on) will have a new SRME module (SRME2) which allows user control of the aperture and is MPI-capable.
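Conceptually, SRME predicts surface-related multiples by convolving the recorded data with itself; the 2D implementation sums this convolution over all surface positions, which is why Job01a must regularise the shot and group intervals first. A deliberately simplified zero-offset, single-trace illustration in Python:

import numpy as np
from scipy.signal import fftconvolve

def predict_multiples_1d(trace: np.ndarray) -> np.ndarray:
    """Zero-offset illustration of SRME's core idea: surface multiples
    are predicted by convolving the data with itself. Real 2D SRME sums
    this convolution over all surface shot/receiver positions."""
    return fftconvolve(trace, trace)[: len(trace)]

# A spike at the water-bottom time predicts its first-order surface
# multiple at twice the water-bottom time.
n, wb = 500, 100
d = np.zeros(n)
d[wb] = 1.0
print(np.argmax(predict_multiples_1d(d)))   # -> 200, i.e. 2 x WB time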
Job 01c – Model Preparation

The aims of the model preparation workflow are:
• To output the model so that it matches the original input dataset: shots are decimated back to the 50 metre SP interval, and channels are interpolated up to the 12.5 metre group interval.
• To mute off any noise present above the first water-bottom multiple.
• To remove the extrapolated channels and output at the original offset range.
• For this example, to sort the data to CHANNEL/SHOTID order for common-channel adaptive subtraction.
Job 02 – Adaptive Subtraction

The Adaptive Subtraction workflow proceeds as follows (a sketch of the matching-filter idea follows the list):
• Output from job00_reformat is read using the DISCSORT module to sort to CHANNEL/SHOTID order prior to adaptive subtraction.
• The required SOD and SSD static corrections are applied.
• The REREAD module is used to create a PSEUDOTRACE containing the SRME model.
• Initial adaptive subtraction is performed by an ungated WANGSUBT, which adapts the model both temporally and spatially.
• Further adaptive subtraction is performed by 100 ms and 300 ms MONKSUBT.
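The core of any adaptive subtraction is a least-squares matching filter that shapes the model to the data before subtracting it. A single-trace Python sketch of that idea (WANGSUBT and MONKSUBT adapt the filter in windows temporally and spatially; the filter length and damping here are illustrative assumptions):

import numpy as np
from scipy.linalg import toeplitz

def adaptive_subtract(data, model, filt_len=25, eps=1e-3):
    """Design a least-squares filter f such that f * model ~ data,
    then subtract the matched model to leave an estimate of primaries."""
    # Convolution matrix of the model trace.
    col = np.concatenate([model, np.zeros(filt_len - 1)])
    row = np.zeros(filt_len)
    row[0] = model[0]
    M = toeplitz(col, row)                     # (n + filt_len - 1, filt_len)
    d = np.concatenate([data, np.zeros(filt_len - 1)])
    # Damped normal equations: (M'M + eps*scale*I) f = M'd
    G = M.T @ M
    f = np.linalg.solve(G + eps * np.trace(G) / filt_len * np.eye(filt_len),
                        M.T @ d)
    matched = (M @ f)[: len(data)]
    return data - matched                      # primaries estimate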
Job 03 – TPDBS and Swell Noise Attenuation

The data is reordered on input from CHANNEL/SHOTID to SHOTID/CHANNEL using the DISCSORT module:
• The OFFREG module is employed to interpolate the data from 12.5 to 6.25 metre group interval, providing improved sampling for the Tau-P domain transform.
• The data is transformed into the Tau-P domain using the TAUP module; the inverse transform is performed by the TAUPINV module.
• The FDFILT module is used to apply a 7-10-80-90 Hz bandpass filter, which attenuates the swell noise (see the sketch below).
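The 7-10-80-90 Hz specification describes a zero-phase trapezoidal amplitude spectrum: a low-cut ramp from 7 to 10 Hz, which removes low-frequency swell noise, and a high-cut ramp from 80 to 90 Hz. A minimal frequency-domain sketch in Python (illustrative, not the FDFILT implementation):

import numpy as np

def trapezoid_bandpass(traces, dt, f1=7.0, f2=10.0, f3=80.0, f4=90.0):
    """Zero-phase bandpass with linear tapers between the corner
    frequencies f1-f2 (low cut) and f3-f4 (high cut)."""
    n = traces.shape[-1]
    freqs = np.fft.rfftfreq(n, d=dt)
    amp = np.zeros_like(freqs)
    amp[(freqs >= f2) & (freqs <= f3)] = 1.0
    up = (freqs > f1) & (freqs < f2)
    amp[up] = (freqs[up] - f1) / (f2 - f1)
    dn = (freqs > f3) & (freqs < f4)
    amp[dn] = (f4 - freqs[dn]) / (f4 - f3)
    spec = np.fft.rfft(traces, axis=-1)
    return np.fft.irfft(spec * amp, n=n, axis=-1)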
Job 03 – TPDBS and SWAT (continued)

• For data with a water depth of less than 300 ms, a 480 ms operator / 48 ms gap deconvolution is applied in the Tau-P domain to attenuate multiples (a single-trace sketch follows this list).
• To attenuate direct-arrival energy and other linear noise, a Tau-P domain mute is applied.
• The data is inverse transformed back to the XT domain for ongoing processing.
• To de-alias the shot records, the data is interpolated to 3.125 metre group interval before application of a 0.115 K filter. The data is then desampled back to 12.5 metre group interval, 240-fold shot gathers.
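Gap (predictive) deconvolution designs a prediction filter from the trace autocorrelation and subtracts the predictable part of the trace, leaving the prediction error. A single-trace Python sketch using the 480 ms operator and 48 ms gap quoted above (the prewhitening level is an assumption):

import numpy as np
from scipy.linalg import solve_toeplitz

def gap_decon(trace, dt=0.004, oper_ms=480.0, gap_ms=48.0, white=0.001):
    """Predictive deconvolution of a single trace, as applied here to
    each Tau-P trace."""
    L = int(round(oper_ms / (dt * 1000.0)))   # filter length in samples
    g = int(round(gap_ms / (dt * 1000.0)))    # prediction gap in samples
    # Autocorrelation out to lag g + L - 1.
    full = np.correlate(trace, trace, mode="full")
    zero = len(trace) - 1
    r = full[zero : zero + g + L]
    col = r[:L].copy()
    col[0] *= 1.0 + white                     # prewhitening
    rhs = r[g : g + L]                        # desired lags g .. g+L-1
    a = solve_toeplitz(col, rhs)              # prediction filter
    pred = np.convolve(trace, a)[: len(trace)]
    pred = np.concatenate([np.zeros(g), pred[: len(trace) - g]])
    return trace - pred                       # prediction error = output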
Job 03a – TPDBS and SWAT using DUSWELL

The processing flow job03a_tpdbs_swat_alternative.job is supplied to showcase an alternative swell noise attenuation process, if this has been purchased by your site:
• As an alternative to the Tau-P domain filter applied in the standard flow, this demonstrates a workflow for the wavelet-domain SWAT process DUSWELL.
• DUSWELL analyses the data within discrete windows, or boxes; any window that exceeds the defined thresholds is removed and its frequency content re-interpolated from the surrounding data.
Job 03a – TPDBS and SWAT using DUSWELL (continued)

• By creating a PSEUDOTRACE which is a copy of the input dataset, we can build a processing flow which uses aggressive thresholds for swell noise attenuation while still preserving the near-surface data (the windows-and-thresholds idea is sketched below).
• The PSEUDOMATH module is used to merge the PSEUDOTRACE containing unfiltered data with the data after DUSWELL, along the user-specified mute time written to the headers by SMUTE.
• DELHDR is used to remove the PSEUDOTRACE after merging.
• The remainder of the flow is the same as the standard job.
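DUSWELL itself is a wavelet-domain process; the windows-and-thresholds idea it rests on can still be illustrated simply. A conceptual Python sketch that flags anomalous low-frequency energy in small trace/time windows (all window sizes and thresholds are illustrative assumptions, not DUSWELL parameters):

import numpy as np

def swell_window_flags(gather, dt, fmax=8.0, win=32, thresh=3.0):
    """Measure low-frequency energy in discrete windows and flag those
    exceeding `thresh` times the median; flagged windows would then be
    repaired by interpolation from surrounding data."""
    n_tr, n_t = gather.shape
    n_win = n_t // win
    energy = np.zeros((n_tr, n_win))
    band = np.fft.rfftfreq(win, d=dt) <= fmax
    for i in range(n_tr):
        for j in range(n_win):
            seg = gather[i, j * win:(j + 1) * win]
            energy[i, j] = np.sum(np.abs(np.fft.rfft(seg))[band] ** 2)
    return energy > thresh * np.median(energy)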
Job 04 – Parabolic Radon Demultiple

Job04 applies standard parabolic Radon demultiple to attenuate remnant multiples on NMO-corrected CDP gathers (a damped least-squares sketch follows this list):
• PRT demultiple is applied from 2 x WB - 30 ms.
• The transform range used is from -700 to +700 ms.
• Multiple is considered to be from 60 ms to 696 ms of moveout.
• The noise model, i.e. the multiples, is subtracted.
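A damped least-squares, frequency-domain parabolic Radon transform captures the essence of the process: model the gather as a sum of parabolas, keep only the moveout range regarded as multiple, and subtract its reconstruction. A compact Python sketch using the -700 to +700 ms transform range and 60 ms multiple threshold quoted above (the damping and q sampling are illustrative assumptions):

import numpy as np

def parabolic_radon_demultiple(gather, offsets, dt, q_min=-0.7, q_max=0.7,
                               nq=140, mult_from=0.060, eps=0.1):
    """gather: (n_offsets, n_samples); q is parabolic moveout in seconds
    at the reference (maximum) offset."""
    n_h, n_t = gather.shape
    href = np.max(np.abs(offsets))
    q = np.linspace(q_min, q_max, nq)
    x2 = (offsets / href) ** 2                  # normalised offset squared
    D = np.fft.rfft(gather, axis=1)             # (n_h, n_f)
    freqs = np.fft.rfftfreq(n_t, d=dt)
    M = np.zeros((nq, len(freqs)), dtype=complex)
    for k, f in enumerate(freqs):
        L = np.exp(-2j * np.pi * f * np.outer(x2, q))     # (n_h, nq)
        A = L.conj().T @ L + eps * np.eye(nq)             # damped LS
        M[:, k] = np.linalg.solve(A, L.conj().T @ D[:, k])
    # Keep only the multiple region of the Radon panel, map it back,
    # and subtract it from the input.
    M[q < mult_from, :] = 0.0
    D_mult = np.zeros_like(D)
    for k, f in enumerate(freqs):
        L = np.exp(-2j * np.pi * f * np.outer(x2, q))
        D_mult[:, k] = L @ M[:, k]
    return gather - np.fft.irfft(D_mult, n=n_t, axis=1)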
Job 05 – Pre-Stack Time Migration

Job05 uses the Claritas KPRET2D module to pre-stack time migrate the data:
• Input is the data after PRT demultiple.
• A smoothed RMS velocity field in Claritas NMO format is used.
• Output is NMO-corrected PreSTM gathers.
• The maximum migration angle is 65 degrees, with an aperture of 3000 metres.
• KPRET2D uses an asymmetric migration operator, with traveltimes calculated from the shot and receiver locations and averaged at the CDP (the kinematics are sketched below).
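The kinematics behind a Kirchhoff pre-stack time migration are the double-square-root traveltime: one leg from the source to the image point and one from the image point to the receiver, each computed with the RMS velocity. A minimal Python sketch (illustrative only; KPRET2D's asymmetric operator and CDP averaging are not reproduced here):

import numpy as np

def kirchhoff_traveltime(t0, x_src, x_rcv, v_rms):
    """Double-square-root traveltime for an image point at two-way
    zero-offset time t0, lateral distances x_src/x_rcv to the source
    and receiver, and RMS velocity v_rms."""
    t_half = t0 / 2.0
    t_src = np.sqrt(t_half**2 + (x_src / v_rms) ** 2)
    t_rcv = np.sqrt(t_half**2 + (x_rcv / v_rms) ** 2)
    return t_src + t_rcv

# Image point at 1.0 s two-way time, v_rms = 2000 m/s, source 300 m
# and receiver 150 m away laterally:
print(kirchhoff_traveltime(1.0, 300.0, 150.0, 2000.0))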
Job 06 – High Density Velocity Analysis

Job06 generates a high-density NMO velocity field. With the aid of heavy preconditioning of the data, the Claritas NMOPICK module, in conjunction with the tools in the ISOVELS application, is able to deliver an NMO velocity field which represents the geology and flattens the gathers effectively:
• Input is the data after PreSTM.
• PRT demultiple and FK filters are used to remove remnant multiples and linear noise, providing clean semblance gathers to the NMOPICK module (a semblance sketch follows this list).
• Output is the NMO velocity field.
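Velocity pickers such as NMOPICK scan trial velocities and measure how well each one flattens the gather, typically via semblance. A minimal Python sketch of the semblance measure for one (t0, v) trial (the window length and nearest-sample interpolation are illustrative assumptions):

import numpy as np

def semblance(gather, offsets, dt, t0, v, win=11):
    """Semblance along the NMO trajectory t(x) = sqrt(t0^2 + (x/v)^2)
    for one trial (t0, v) pair; gather is (n_offsets, n_samples)."""
    n_h, n_t = gather.shape
    idx = np.round(np.sqrt(t0**2 + (offsets / v) ** 2) / dt).astype(int)
    half = win // 2
    num = den = 0.0
    for k in range(-half, half + 1):
        s = np.array([gather[i, idx[i] + k] for i in range(n_h)
                      if 0 <= idx[i] + k < n_t])
        num += np.sum(s) ** 2
        den += len(s) * np.sum(s ** 2)
    return num / den if den > 0 else 0.0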
Job 06 – High Density Velocity Analysis (continued)

The creation of the NMO field by the NMOPICK module is only part of the story: the ISOVELS application needs to be used to condition this velocity field before it is used to NMO correct the data (the clipping and smoothing steps are sketched below).
• Remove high and low RMS velocities (valid range 1480-6000 m/s).
• Remove outlying interval velocities (range 1480-6000 m/s). This is an iterative approach and will require a number of passes until velocities outside this range are removed.
• Smooth the data spatially only, using a 41-point low-pass filter.
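A minimal Python illustration of the conditioning on a (locations x picks) grid of RMS velocities (a running mean stands in for ISOVELS' 41-point low-pass filter, and the iterative interval-velocity editing is omitted):

import numpy as np

def condition_velocity_field(v_rms, valid=(1480.0, 6000.0), smooth=41):
    """Clip RMS velocities to the valid range, then smooth spatially
    only (along the line, not in time) with a 41-point running mean.
    v_rms: (n_locations, n_picks) at common pick times."""
    v = np.clip(v_rms, *valid)            # remove high/low RMS picks
    kernel = np.ones(smooth) / smooth
    pad = smooth // 2
    padded = np.pad(v, ((pad, pad), (0, 0)), mode="edge")
    out = np.empty_like(v)
    for j in range(v.shape[1]):
        out[:, j] = np.convolve(padded[:, j], kernel, mode="valid")
    return out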