OIL AND NATURAL GAS CORPORATION LIMITED
Project Report on
Interpretation of
Geological and Geophysical data
In Schlumberger’s Petrel Software
Submitted to:
Mr. S.K. Das
Executive Director - HOI
GEOPIC
ONGC, Dehradun

Submitted by:
Sumit Kumar Bhattacharya
M.Sc. (Tech.) Geophysics, Final Year
Banaras Hindu University, Varanasi

Umang Sundriyal
M.Sc. (Tech.) Applied Geophysics, Final Year
Indian School of Mines, Dhanbad.
Samiur Rahman Khan
M.Sc. Petroleum Geosciences, Final Year,
Banaras Hindu University,Varanasi.
Mohammad Sarim
M.Sc. Petroleum Geosciences, Final Year,
Banaras Hindu University,Varanasi.
CERTIFICATE
This is to certify that the summer training report entitled "Interpretation of Geological
and Geophysical data in Schlumberger's Petrel Software", submitted to GEOPIC,
ONGC, Dehradun, is a record of bona fide work carried out by Mr. Sumit Kumar
Bhattacharya, final year student of M.Sc. (Tech.) Geophysics, Department of
Geophysics, Banaras Hindu University, Varanasi, Mr. Umang Sundriyal, final year
student of M.Sc. (Tech.) Applied Geophysics, Indian School of Mines, Dhanbad, Mr.
Samiur Rahman Khan, final year student of M.Sc. Petroleum Geosciences, Department
of Geology, Banaras Hindu University, and Mr. Mohammad Sarim, final year student of
M.Sc. Petroleum Geosciences, Department of Geology, Banaras Hindu University,
Varanasi, U.P., under my supervision.
Supervisor :-
Mr. S.K.DAS
Executive Director
HOI, GEOPIC
ONGC , Dehradun
ACKNOWLEDGEMENT
We are thankful to ONGC for providing us with the opportunity for our Industrial Training in the
academic year 2010-2011.
We are sincerely thankful to Mr. S.K. Das (Executive Director-HOI, GEOPIC, ONGC,
Dehradun) for providing administrative support, sustained guidance, inspiration and careful
supervision during the project, and for allowing us to complete the project work in the ONGC
Innovation Centre at the GEOPIC Extension.
We are deeply indebted to Schlumberger for the technical assistance, training and guidance provided during
the course of the project work. We express our thanks to Mr. Vijay Raj (Geophysicist,
Schlumberger), Mr. Mahabir Prasad Mahapatra (Petrel Deployment Manager, Schlumberger),
Mr. Jagannath Mukherjee (Information Management Geoscientist) and Mr. Pushpal
Banerjee (Technical Sales) for their cordial cooperation.
Sumit Kumar Bhattacharya
M.Sc.(tech) Geophysics,
Department of Geophysics
Banaras Hindu University,
Varanasi,U.P.
Umang Sundriyal
M.Sc.-tech. Applied Geophysics
Indian School of Mines,
Dhanbad.
Samiur Rahman Khan
M.Sc. Petroleum Geosciences,
Department of Geology,
Banaras Hindu University,
Varanasi, U.P.
Mohammad Sarim
M.Sc. Petroleum Geosciences,
Department of Geology,
Banaras Hindu University,
Varanasi, U.P.
Contents:-
1. Introduction
2. About Petrel
3. Petrel Geophysics
4. Seismic Data Interpretation in Petrel
A. Data Description
B. Process of Data Loading
I. Seismic Data visualization in Petrel
II. Synthetic Generation and Seismic Well Tie Processes
III. Mis-tie Analysis for 2D and Bulk Shift for 3D
IV. Fault & Horizon Interpretation in Petrel :
A. Fault Interpretation
B. Horizon Interpretation
V. Map Generation and Plotting
VI. Seismic Attribute Analysis:
A. Seismic Attributes and Their Importance
B. Seismic Attributes Generation:
a. Volume Attributes
b. Surface Attributes
C. Ant Tracking and Genetic Inversion Workflow
VII. Velocity Modelling:
VIII. Advanced Seismic Interpretation:
A. Geobody Interpretation
B. Horizon/Surface & Well Probe
5. Results and Discussion
6. References
INTRODUCTION
About ONGC:-
Today, ONGC is the flagship company of India; and making this possible is a dedicated team of
nearly 33,000 professionals who toil round the clock. It is this toil which amply reflects in the
aspirations and performance figures of ONGC. The company has adopted progressive policies in
scientific planning, acquisition, utilization, training and motivation of the team. At ONGC,
everybody matters, every soul counts.
It was incorporated as Oil and Natural Gas Corporation Limited under the Companies Act, 1956
on 23rd June 1993, after conversion from the Oil and Natural Gas Commission, a statutory body
established on 14th August 1956 under the ONGC Act, 1956.
ONGC has scripted India's hydrocarbon saga by discovering 6 of the 7 presently producing
basins of India in the last 50 years, establishing over 6.5 billion tonnes of in-place oil and gas. It has
recovered reserves of over 1 billion tonnes of oil and gas and produces more than 1.1 million
barrels of oil and oil-equivalent gas (BOE) per day, accounting for around 80% of India's domestic
production of oil and gas.
All the installations of ONGC are certified to Quality, Health, Safety and Environment
Management (QHSE) standards. ONGC is the first Indian company to adopt the 'Integrity Pact' to
institutionalize transparency in its procurement and public transactions. The development activity
of ONGC has been planned on sound ecological principles and incorporates appropriate
environmental safeguards.
It is the most valuable public sector company in India, with the highest profit and the highest
dividend payout.
GLOBAL RANKING:
 No. 1 E&P company in the world among leading global energy majors, as per Platts 2010.
 No. 24 in PFC ENERGY 50. Ranked 24th among global publicly listed energy
companies, as per the "PFC ENERGY 50" list (January 2010).
 ONGC ranked at 155th position in the FORBES Global 2000 list for 2010.
NATIONAL RANKING:
 ONGC ranked at second position in FE500 list 2010.
 Business Today Ranks ONGC as Best Company to work for in Core Sector.
VISION:-
To be a global leader in integrated energy business through sustainable growth, knowledge
excellence and exemplary governance practice.
Geodata Processing and Interpretation Centre (GEOPIC):-
Geodata Processing and Interpretation Centre was established in 1987 at ONGC, Dehradun
with a specific mandate of providing solutions to complex E&P problems by integrating seismics
with other geoscientific data. The centre has world-class, state-of-the-art infrastructure backed by
specialists in the field of seismic data processing, geoscientific data interpretation and supported
by software programmers and hardware engineers.
Land and marine seismic data of ONGC from Indian and foreign basins are processed and interpreted
synergistically at this centre. Over the years, GEOPIC has processed more than 18,00,000 LKM
of seismic data, including about 12,00,000 LKM of 3D data and 1,10,000 LKM in the depth domain,
and has interpreted about 220 3D prospects, including deep-water prospects.
Since inception, GEOPIC has transformed itself into a Centre of Excellence and today is one of the
premier institutes of ONGC.
GEOPIC carries a charter of Value Creation/Additions through specialized skills, expertise and
superior Technology and has the role: (1) Providing Time and Depth Domain Solutions to
Subsurface Geology by Seismic Data Processing, (2) Strati-Structural Earth Modelling and
Hydrocarbon Play definition in Domestic and Foreign Basins.
In line with its charter & role, GEOPIC has formulated a long term strategic planning by inducting
new technologies, developing expertise & disseminating the technologies to other work centers.
The state-of-the-art technologies inducted and absorbed over the years, such as depth domain
processing, stratigraphic inversion, advanced volume based interpretation tools, stochastic
lithofacies modelling using neural networks, spectral decomposition, geostatistics, modelling, etc.,
have given GEOPIC the capability to provide solutions to complex E&P problems using large 3D
data volumes in regular processing and interpretation cycles.
CORE STRENGTH:
• Seismic Imaging
• 3D Visualization
• Integrated Interpretation of Geoscientific data for E&P Solutions
TECHNOLOGICAL EDGE:
• Large volume 3D pre-stack time migration
• 3D pre-stack depth migration
• Anisotropic pre-stack depth migration
• AVO analysis
• Pre/post stack multi-volume integration
• VSP data processing
• Stratigraphic inversion modelling
• Volume based interpretation
• Geobody detection and mapping
• Optical stacking aided fault detection
• Seismic facies classification
• Geostatistical reservoir property mapping
• State-of-the-art basic and advanced IIWS interpretation of 2D/3D processed seismic data
ABOUT ONGC INNOVATION CENTRE(OIC)
ONGC INNOVATION CENTRE (OIC) is a unique joint initiative between ONGC and
Schlumberger, inaugurated on 5th February 2011 by the then Director of Exploration, Shri
D.K. Pandey. It is a centre that hosts the latest Schlumberger software and provides an
environment for developing innovative ideas in the area of Exploration and Production.
ABOUT SCHLUMBERGER
Schlumberger was founded in 1926 by French brothers Conrad and Marcel Schlumberger as the
Electric Prospecting Company. The company recorded the first-ever electrical resistivity well log
in Merkwiller-Pechelbronn, France in 1927.
Today Schlumberger supplies the petroleum industry with services such as seismic acquisition
and processing, formation evaluation, well testing and directional drilling, well cementing and
stimulation, artificial lift, well completions and consulting, and reservoir software and information
management.
The company is also involved in the groundwater extraction and carbon capture and storage
industries. Schlumberger Limited is the world's largest oilfield services company. Schlumberger
employs over 108,000 people of more than 140 nationalities working in approximately 80
countries. Its corporate offices are in Houston, Paris, and The Hague.
2:-ABOUT PETREL(2010.2): SEISMIC TO SIMULATION
SOFTWARE
Petrel is a shared earth modelling tool bringing reservoir disciplines together through a common
data model. Geophysical, geological, petrophysical and production data can be imported,
manipulated, generated and visualized in one single platform.
Optimize Exploration and Development Operations:-
Petrel seismic to simulation software helps increase reservoir performance by improving asset
team productivity. Geophysicists, geologists, and reservoir engineers can develop collaborative
workflows and integrate operations to streamline processes. The main benefits of Petrel
2010.2 are listed below:
Benefits:
 Unify workflows for E&P teams- Eliminate the gaps in traditional systems that require
handoffs from one technical domain to the next using Petrel model-centric workflows in a
shared earth model.
 Manage risk and uncertainty- Easily test multiple scenarios, analyze risk and uncertainty,
capture data relationships and parameters to perform rapid updates as new data arrives,
and perform detailed simulation history matching.
 Enable knowledge management and best practices- Reduce workflow learning curves by
capturing best practices via the Workflow Editor, providing quick access to preferred
workflows, and increasing ease of use through intuitive and repeatable workflows.
 Open framework- Seamlessly integrate your intellectual property into the Petrel workflow
through the open Ocean. This environment leverages .NET tools and offers stable, user-
friendly interfaces for efficient development allowing focus on innovation rather than
infrastructure.
3:-PETREL GEOPHYSICS:- Petrel is unified seismic interpretation software.
Fully integrated with the geological and engineering tools, the Petrel geophysical software allows
rapid 2D and 3D seismic interpretation. Seismic data can be sampled directly into a 3D reservoir
model to predict pay, and reservoir property distribution can be biased using a geostatistical approach.
An extensive library of seismic attributes and volume rendering techniques helps us to identify
hydrocarbon indicators and fracture patterns. The different modules present within the Geophysics
process are described below:
Petrel Seismic Interpretation:-
Petrel seismic interpretation software seamlessly combines the rigorous workflows of interpreting
in 2D with the visual and performance benefits that only 3D volume interpretation can provide.
You also gain the unique advantage of an interpretation environment unified with geology,
reservoir modeling, and reservoir engineering domains, giving the ability to rapidly interpret
seismic data and compare the results with other data in your project. Effortlessly moving from
interpretation to structural model building to property modeling and back eliminates the gaps and
inevitable knowledge and data loss of traditional systems that require handoffs from one technical
domain to the next.
Benefits
 Combines the visual and performance advantages of 3D seismic interpretation with traditional
2D views for accuracy and detailed event picking, enabling advanced 3D visualization at
every desktop
 Enables rapid, detailed event interpretation using accurate manual and powerful
automated tracking with a wide range of control and editing options
 Offers the unique advantage of interpreting seismic in a truly unified environment
 Provides scalable desktop interpretation from regional basin study workflows down to
reservoir detail.
4:-SEISMIC DATA INTERPRETATION IN PETREL(2010.2)
(i)DATA LOADING:-
(A).DATA DESCRIPTION:-
Introduction :- The first dataset is located in the Gulf of Mexico and contains salt-induced tectonic
and stratigraphic traps near the salt. The data consist of seismic data, well data, checkshots, a
velocity cube and stacking velocities. This dataset will be used for most of the exercises.
Geology of Gulf of Mexico :- The Gulf of Mexico is a small oceanic basin surrounded by
continental land masses. Due to their physical structure, the Gulf and the Caribbean Sea are
sometimes combined and referred to as the 'American Mediterranean'. Uchupi (1975) divides the
Gulf into two distinct geographical provinces (Terrigenous and Carbonate) while Antoine (1972)
recognizes seven. The scheme proposed by Antoine is presented here, with additional information
derived from other sources.
Gulf of Mexico Basin :-
This portion of the Gulf of Mexico contains the Sigsbee Deep and can be further divided into the
continental rise, the Sigsbee Abyssal Plain, and the Mississippi Cone. Located between the
Sigsbee escarpment and the Sigsbee Abyssal Plain, the continental rise is composed of sediments
transported to the area from the north.
The Sigsbee Abyssal Plain is a deep, flat portion of the Gulf bottom located northwest of
Campeche Bank. In this relatively uniform area of the Gulf bottom, the Sigsbee Knolls and other
small diapiric (salt) domes represent the only major topographical features.
The Mississippi Cone is composed of soft sediment and extends southeast from the Mississippi
Trough, eventually merging with other sediments of the central basin. The cone is bordered by the
DeSoto Canyon to the east and the Mississippi Trough to the west.
The second dataset, from Angola, contains high-sinuosity channel systems developed in drainage
areas dominated by low-gradient slopes. This dataset will be used for geobody interpretation.
Geology of Angola :-
The geology of Angola is dominated by Proterozoic rocks. Neoproterozoic rocks of the Bembe
Group occur in the western part of the country.
The 'Schisto-Calcaire' is part of the Neoproterozoic West Congolian Supergroup, which
outcrops mainly in the northwest of the country.
The lower part of the 'Schisto-Calcaire' is made up of stromatolitic and oolitic limestones and
rests on tillites. In the coastal basin, a 4,000 m thick sequence of Cretaceous to Tertiary and
Quaternary sediments rests unconformably on the Precambrian. Kalahari sands cover the eastern
part of the country.
(B).PROCESS OF DATA LOADING:- To load data, we first have to create a project
and then assign the coordinate system that belongs to our field area. The coordinate system
for our project is ED30UTM-31.
Next we make a folder, create a survey within it and import the data into it.
1:- For seismic data we have to make a seismic folder and create a seismic survey. The seismic
data file is in SEG-Y format.
The SEG-Y headers from first file area of the dialog gives you access to the ASCII header,
binary header and trace header information to identify byte location parameters. The file can be
scanned to check inline and crossline numbers as well as x and y coordinates. A specified number
of traces can be scanned.
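As an illustration of what such a header scan looks like outside Petrel, the sketch below uses the open-source segyio Python library (an assumption; any SEG-Y reader would do) to inspect the textual, binary and trace headers of a hypothetical file:

import segyio  # open-source SEG-Y reader, assumed to be installed

# "survey.sgy" is a hypothetical file name used only for illustration
with segyio.open("survey.sgy", "r", ignore_geometry=True) as f:
    print(segyio.tools.wrap(f.text[0]))                        # EBCDIC/ASCII textual header
    print(f.bin[segyio.BinField.Interval])                      # sample interval (microseconds)
    inline = f.attributes(segyio.TraceField.INLINE_3D)[:]       # inline numbers (byte 189 by default)
    xline = f.attributes(segyio.TraceField.CROSSLINE_3D)[:]     # crossline numbers (byte 193 by default)
    x = f.attributes(segyio.TraceField.CDP_X)[:]                # x coordinates
    y = f.attributes(segyio.TraceField.CDP_Y)[:]                # y coordinates
    print(inline.min(), inline.max(), xline.min(), xline.max())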
For well data we have to create a well folder and import the well data. First of all we import the
well header data, which is in ASCII format; the sequence of data loading is given below:
Well data                         File type
1. Deviated/well path data        .dev (ASCII)
2. Well log data                  .las, .lis, .dlis
3. Well tops data                 ASCII
4. Well check shot/VSP data       ASCII
(ii) Seismic Data visualization in Petrel:-
After we have imported the seismic data, it is easy to display; either as a cube or through
intersections. For volumes, the classic seismic sections are inlines, crosslines, time slices,
and random lines. In addition, Petrel allows specific seismic sections to be extracted along
any given polygon or along any well path. Volumes can also be rendered using filters for
extracting the preferred attribute values. This makes a very good visualization tool, as you
can play with the transparency effects and search for objects of interest with known
attribute values. There are two settings in particular in Petrel which allow you to optimize
the way the data is displayed, depending on the task at hand. The data can be stored in a
bricked format called ZGY for speedy 3D rendering (Realization), and it can be cropped
so that only the area of interest is displayed (Cropping).
 Realization :- Realization is the process of creating a physical copy of any seismic
volume or 2D line. The original seismic can be an imported SEG-Y or ZGY file, a virtual
file (for example, a cropped volume or a cropped 2D line) or even a previously realized
volume or 2D line. The output from the realization process is seismic in ZGY format (3D)
or a Petrel raw format (2D). Realization serves the following important purposes:
Purpose 1:- Realization creates a physical representation
Purpose 2:- Realization can change value resolution
Purpose 3:- Realization changes the data format to ZGY bricked format
 Cropping :- Cropping is the operation of cutting away parts of the volume, leaving a
smaller volume that is usually faster and more convenient to work with. Cropping can also
be thought of as defining a Region Of Interest (ROI). The cropping can be performed on
both 3D and 2D seismic.
Figure :- selecting the area to be cropped
Figure:- cropped 3D volume
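Conceptually, cropping is just sub-volume selection. A minimal numpy sketch (the array shape and index ranges are made up for illustration):

import numpy as np

# stand-in volume: inline x crossline x time sample
cube = np.zeros((400, 500, 1000), dtype=np.float32)

# region of interest: a sub-range of inlines, crosslines and samples
roi = cube[100:200, 200:350, 400:800]       # the cropped, smaller volume
print(roi.shape)                            # (100, 150, 400)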
Volume visualization:- The Style tab contains a sub-tab called Volume visualization, which
contains the settings describing what is to be displayed, that is; Volume walls and Volume
render. The color filter used for extracting the preferred amplitude values can be edited under the
Colors tab.
Volume visualization enables us to extract 3D objects from seismic to improve reservoir
understanding, detect anomalies, and define facies. Petrel software enables us to interactively
blend multiple seismic volumes, isolate areas of interest, and then instantly extract what is
visualized into a 3D object called a geobody. As the geobody is extracted, the interpreter can
assign a geological template to the geobody, providing the body with instant geological meaning.
Geobodies can be included directly in the 3D geological model, bridging the gap between
geophysics and geology. There are two options in Volume walls:
Volume Walls-Normal :-
The visible sides of a seismic ZGY volume will be visualized and the volume will appear as a
box.
Volume Walls-Inside :-
The view will be like looking inside a box, with the outer sides stripped off. This option displays
only the distal walls and will not show any objects hidden inside the box. This can, for instance,
be combined with volume rendering which is another option of volume visualization.
Figure :- showing normal cube (volume walls)
Figure :- showing inner faces of cube
Volume Rendering:- The Volume rendering process is, unlike volume walls, a display of all data within
a seismic volume at the same time. By rendering a seismic volume, making it partly opaque (high
amplitudes) and partly transparent (crossover amplitudes), you can identify hidden structural or
depositional features. Only seismic in ZGY format can be volume rendered, implying that input SEG-Y
volumes must be realized to ZGY format before the data can be volume rendered. The seismic volume can
also be cropped down prior to realization to focus on more local (smaller) features. The figure below shows
the use of opacity in the whole seismic volume.
Figure :- application of volume rendering after changing the opacity
Bump mapping :- This is another option in volume visualization which mainly highlights the
faults in a section. The figure below shows bump mapping enabled.
Figure :- seismic cube after applying the Bump mapping
Other options are also available, as given below :-
Arbitrary line :- It is used within a single survey, either a 2D or a 3D survey.
Figure :- Arbitrary line
Random line:- This is only applicable to a 3D seismic survey. It is used mainly to generate a line
other than an inline, crossline or time slice.
Figure :- Random line
Composite line :- It is possible to create composites within a seismic survey or between crossing seismic
surveys in Petrel. This means that you can create composites on 3D lines within a 3D survey, lines that are
spanning across multiple 3D surveys, lines spanning over 2D and 3D lines, but also along 2D lines only.
Composite lines can also be created using any type of seismic (any attribute) that rests on the Petrel
project.
Figure :- Composite line section
2D visualization:- We can display wiggles here instead of the variable density or variable area display,
as shown below :-
Figure :- seismic section in wiggle representation
(iii)SYNTHETIC GENERATION AND SEISMIC WELL TIE
PROCESSES:-
One of the first steps in interpreting a seismic dataset is to establish the relationship
between seismic reflections and stratigraphy. For structural mapping, it may be sufficient
to establish an approximate relationship (e.g. 'reflection X is near Base Cretaceous'), although for
more detailed work on attributes it is usually necessary to be more precise and establish
exactly how the top of a reservoir is expressed on the seismic section. The best source of
stratigraphic information is well control.
The Synthetic Seismogram:
If wells have sonic and formation density logs, at least over the intervals of commercial
interest, it is possible to construct a synthetic seismogram showing the expected seismic
response for comparison with real seismic data.
The first step in making the synthetic seismogram is to combine the
sonic and density logs (velocity multiplied by density), which gives us the acoustic impedance
log. The next step is to convert the acoustic impedance log, calculated from log data recorded as a
function of depth, into a log as a function of (two-way) travel time. This is easy if we know the time-depth
relationship for the well, which can be obtained by simply integrating the sonic log. Two
problems may arise in this process. One of them is that errors (for example, minor
miscalibration of the sonic tool) tend to accumulate when the log is integrated over many
thousands of feet. Another problem is that the sonic log is hardly ever run in the shallowest
part of the hole. For these reasons, it is usual to calibrate the T-Z curve by means of some
direct observations of travel time from a surface source to a downhole geophone (check
shots); the integrated sonic is then adjusted to match these control points. A reflectivity
sequence is convolved with the wavelet thought to be present in the seismic data to
generate the synthetic seismogram.
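The arithmetic behind this can be sketched in a few lines of Python with numpy; the logs, sampling and 25 Hz Ricker wavelet below are toy values, not the project data:

import numpy as np

dz = 0.5                                              # log sampling interval (m), assumed
vp = np.full(4000, 2500.0); vp[2000:] = 3200.0        # velocity from the sonic log (m/s), toy step model
rho = np.full(4000, 2.30); rho[2000:] = 2.45          # density log (g/cc), toy values

ai = vp * rho                                         # acoustic impedance = velocity x density
rc = (ai[1:] - ai[:-1]) / (ai[1:] + ai[:-1])          # reflection coefficient at each interface

twt = 2.0 * np.cumsum(dz / vp)                        # integrated sonic: two-way time (s) vs depth

dt = 0.002                                            # output sample interval (s)
t = np.arange(twt[0], twt[-1], dt)
rc_t = np.interp(t, twt[1:], rc)                      # crude resampling of reflectivity to time

tw = np.arange(-0.064, 0.064, dt)                     # 25 Hz Ricker wavelet
w = (1 - 2 * (np.pi * 25 * tw) ** 2) * np.exp(-(np.pi * 25 * tw) ** 2)

synthetic = np.convolve(rc_t, w, mode="same")         # synthetic trace to compare with the seismic

Strictly, the impedance should be resampled to time before recomputing reflection coefficients; interpolating the reflectivity directly, as above, is only a shortcut for illustration.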
Generation of Synthetic Seismogram in Petrel:
Synthetic seismograms are the bridges between geological information (well data in
Depth) and geophysical information (seismic in time). This essentially involves a two-
step
process.
1. Time converting the wells by means of check shot data and sonic logs, establishing time-
depth relationships for the wells.
2. Generating synthetic seismograms from density logs, sonic logs and a seismic wavelet
by calculating acoustic impedance and reflection coefficients, which are then convolved
using a wavelet.
Any changes to the time-depth relationship can be made and seismic horizons can be correlated
with the stratigraphic boundaries identified in the well logs.
Seismic Well-Tie:- The Seismic well tie process, found under Geophysics in the Processes pane, is one
of two approaches to generating synthetic seismograms in Petrel. The first step here is to build a log set and
add the sonic (DT) and formation density (RHOB) logs to it. The Seismic well tie process is a plug-in that
comes with the Petrel installer. The process is accessed from the Geophysics folder on the Processes pane.
Launching the process will open the Seismic well tie process dialog and give you access to the
following workflows, each sorted under its own tab:
 Sonic calibration
 Wavelet extraction
 Wavelet viewer
 Wavelet builder
Each tab contains a graphical explanation of the workflow and has a button to launch a new
workflow window at the bottom.
The same workflow windows can be accessed by using the available icons in the Function bar
when the Seismic well tie process is active. Depending on which window is open, a set of icons to
handle the display is available.
1. Wavelet Builder: The wavelet builder workflow used in the Seismic well tie process
is a tool to build statistical wavelets. The wavelets can be constructed as Ricker, Ormsby
or tapered sinc types. The amplitude polarity is specified here, using either the European
or US standard, as well as the phase polarity of the wavelet. The Wavelet builder
window is accessed from the Seismic well tie process dialog, Wavelet builder tab, by
clicking on the 'New Wavelet Builder' button. If the Seismic well tie window is not open, it
can be directly accessed by using the Wavelet builder icon in the Function bar (given
that the Seismic well tie process is active).
Wavelet Builder:-
In the Name field, you can enter the wavelet name. From the Type drop-down menu, you can
select among Ricker, Ormsby or tapered sinc type wavelets. The wavelet type specific
parameters will be activated based on the selection. The parameters are entered as numerical
values and will not take effect until the Compute button is clicked. The Phase rotation can be set
by using the slider or the increase/decrease value selector.
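For reference, a zero-phase Ricker wavelet and a constant phase rotation (the effect of the Phase rotation slider) can be sketched as follows; the 25 Hz frequency, 2 ms sampling and 60 degree rotation are arbitrary example values:

import numpy as np
from scipy.signal import hilbert

dt, f0 = 0.002, 25.0                                   # sample interval (s) and dominant frequency (Hz)
t = np.arange(-0.064, 0.064, dt)
ricker = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)

phi = np.deg2rad(60.0)                                 # constant phase rotation of 60 degrees
quadrature = np.imag(hilbert(ricker))                  # Hilbert transform of the wavelet
rotated = np.cos(phi) * ricker - np.sin(phi) * quadrature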
2. Wavelet Viewer:
The Wavelet viewer workflow used in the Seismic well tie process, is a window
designed to view any premade, loaded or already generated wavelets with the corresponding
power and phase spectra.
The Wavelet viewer window is accessed from the Seismic well tie process dialog,
Wavelet viewer tab, by clicking on the 'New Wavelet Viewer' button. If the Seismic well tie
window is not open, it can be directly accessed by using the Wavelet viewer icon in the
Function bar (given that the Seismic well tie process is active).
With this facility we can view together the different wavelets that have been created in the
Wavelet builder, by selecting them from the Petrel Explorer pane.
3. Sonic Calibration:-
The sonic calibration workflow used in the Seismic well tie process, includes the ability to edit a
knee curve based on time-depth information (typically check-shots data), interactively do sonic
calibration and view the resulting calibrated sonic log while editing. It is also possible to redefine
the datum (datuming) in the process and specify the output after calibration.
You can access the Sonic calibration window from the Seismic well tie process dialog by
clicking on the 'Sonic calibration' button on the Sonic calibration tab.
If the Seismic well tie window is not open, it can be directly accessed by using the Sonic
calibration icon in the Function bar (given that the Seismic well tie process is active).
Figure :- sonic calibration workflow
When selecting an appropriate log set to use in the sonic calibration workflow, the window is
populated with a sonic track and the check-shots/knee log track as well as any selected auxiliary log
tracks. The Sonic calibration variables and settings dialog will open up.
Figure :-T-D relationship from sonic log before (knee picking)
Knee Picking:- Knee picking is the actual workflow where the sonic log is calibrated to the
check-shots for the well. This is obtained by fitting the drift curve (the knee log) to the check-shot
points and/or well tops. The drift curve can interactively be changed to optimize the time-depth
relationship or the calibrated sonic log.
Figure :- T-D relationship from sonic log after Knee picking
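The underlying arithmetic of this calibration (drift = check-shot time minus integrated-sonic time, interpolated between the knees and added back) can be sketched as below; Petrel's knee picking is interactive, and all numbers here are toy values:

import numpy as np

depth = np.arange(0.0, 2000.0, 0.5)                     # log depths (m)
vp = np.where(depth < 1000.0, 2500.0, 3200.0)           # toy velocity from the sonic log (m/s)
twt_sonic = 2.0 * np.cumsum(0.5 / vp)                   # integrated sonic, two-way time (s)

cs_depth = np.array([200.0, 800.0, 1500.0, 1950.0])     # check-shot depths (m), toy values
cs_twt = np.array([0.165, 0.650, 1.190, 1.530])         # observed two-way times (s), toy values

# drift (residual) at the check-shot depths
drift_at_cs = cs_twt - np.interp(cs_depth, depth, twt_sonic)

# a simple "knee" curve: drift interpolated linearly between check shots, then applied
drift = np.interp(depth, cs_depth, drift_at_cs)
twt_calibrated = twt_sonic + drift                      # calibrated time-depth relationship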
Sonic Calibration and Display Tab: -
The log curve and data point appearance is handled from the Display tab. Also the density and
residual drift can be shown. Sonic Calibration will display up to 6 logs that are part of the LogSet
in the Input tab. Other logs that are part of the LogSet can be dropped in to the "Other" fields .
Sonic Calibration Output Tab:
Objects made in the Sonic calibration workflow are not accessible from the Input pane until the
results are output from the Seismic well tie process. The LogSet, Knee LogSet and Depth/time
relations can be obtained from the Output tab and will be stored on the Input pane under the
Well logs folder. In addition, the Sonic log, Residual drift curve and Time reference log can be
saved from here. A typical output from this workflow is the final time-depth relationship that can
be used for re-establishing the time log for the well used.
4. Wavelet Extraction:- The wavelet extraction workflow used in the Seismic well tie
process, is a tool for performing deterministic wavelet extraction by selecting the seismic volume
and input logs of interest. The position of the extraction location can be changed interactively
based on predictability to optimize the wavelet to use. Changing the extraction location
automatically updates the extracted wavelet with its corresponding power and phase spectra, as
well as the resulting synthetic trace.
The Wavelet extraction window is accessed from the Seismic well tie process
dialog, Wavelet extraction tab by clicking on the „new wavelet extraction window‟ button.
If the Sonic well tie window is not open, it can be directly accessed by using the Wavelet
extraction icon on the Function bar (given that the Seismic well tie process is active).
Wavelet extraction workflow:-
Open a Wavelet extraction window to perform the workflow given below. We can either
use the icon from the Function bar or access it from the Seismic well tie process dialog.
The window will be empty until a log set and a seismic volume are selected. The inputs are
selected by ticking the check box or dragging them from the Input pane. Input 1 is for the
seismic and Input 2 is for the log set. By selecting the 'Position' option we can check the inline and
crossline traces from which the wavelet has been extracted. Now we can extract the wavelet; we
can also change the scan length and check the output.
Figure:-Wavelet extraction and generation of synthetic seismogram
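Deterministic extraction essentially solves for the wavelet that, convolved with the well reflectivity, best reproduces the seismic trace at the well. A toy sketch using damped spectral division (not Petrel's exact algorithm) is given below:

import numpy as np

def extract_wavelet(trace, reflectivity, nw=64, damping=1e-3):
    """Estimate a wavelet from a seismic trace and a well reflectivity series
    by damped spectral division - a sketch, not Petrel's implementation."""
    n = len(trace)
    T = np.fft.rfft(trace, n)
    R = np.fft.rfft(reflectivity, n)
    W = T * np.conj(R) / (np.abs(R) ** 2 + damping * np.max(np.abs(R)) ** 2)
    w = np.fft.irfft(W, n)
    # return a window of length nw centred on zero lag
    return np.concatenate([w[-nw // 2:], w[:nw // 2]])

The predictability mentioned above can then be checked by convolving the estimated wavelet with the reflectivity and correlating the result with the observed trace.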
Time Shift and Manual Adjustment:- The final synthetic trace made in the Seismic well tie process can
be bulk shifted or stretched and squeezed to align it to the real seismic data. The alignment points can be
set randomly on any strong event that needs to be adjusted and any updates are applied as soon as they are
needed. The adjusted synthetic trace can be output back to the Input pane of Petrel with a different name
than the original unadjusted trace for comparison and use.
Time shift and manual adjustment workflow:
The Wavelet extraction window is used for manual adjustment. The manually adjusted synthetic trace is
updated interactively and is saved back to the Input pane for further use. Open the Wavelet extraction
window to perform this workflow. You can either use the icon from the Function bar, access it from the
Seismic well tie process dialog, or from the Windows pane if Petrel has not been closed since the wavelet
extraction workflow was done.
Variable time shift alignment :-
Controlling the variable time shift alignment is done from the Wavelet extraction variables and settings
dialog, Time shift tab.
(iv) MIS-TIE ANALYSIS FOR 2D AND BULK SHIFT FOR 3D:
Sometimes, due to errors (e.g. an error in selecting the datum), there is a mismatch of
signatures at the intersection of two 2D lines. In order to interpret correctly, we have to carry out
mis-tie analysis and match the signatures of those 2D lines at the intersection. In the case of a 3D volume,
Petrel provides the facility of shifting the 3D volume up or down. These processes are described below
separately:
1. Mis-Tie Analysis for 2D:- Mis-tie analysis allows us to extract and apply mis-tie value
corrections for 2D seismic surveys. All lines contained inside a Seismic survey folder or a sub-
folder will be used when a Mis-tie set is created. A simple, but effective algorithm based upon
weighting value assignments using a variance criterion has been used. It assumes mis-tie values to
be random variables. The algorithm satisfies the following requirements:-
 The mis-ties should be reduced to a minimum after the error adjustment.
 The method should be applicable to any survey configuration.
 There should be a way to define a weighting factor to good lines (reference lines) in comparison to
lines showing data of poor quality.
 The method should be easily manageable within the current workflow.
 The algorithm should be fast and require minimal iteration.
Mis-Tie Manager:- The Mis-tie manager is an interactive tool for managing the mis-ties in Petrel. You
can calculate or specify corrections, select reference lines, deselect lines not to be used for mis-tie analysis,
and also toggle to display 2D lines in the active window. To generate a Mis-tie set, right-click on a Seismic
survey folder (in this case called "2D lines for MIS-TIE") or sub-folder and choose Insert mis-tie set from
the menu. The new mis-tie set is located in a new folder called "Mis-ties", under the survey folder that it
was created from.
When a mis-tie set is created, Petrel will extract all crossing points for the 2D lines in the survey folder and
build a color-coded spreadsheet showing all 2D lines and their corresponding intersecting lines. The table
looks very similar to the Survey manager, but it has additional functionality to work with mis-tie values. It
is called Mis-tie manager.
The Mis-tie manager is opened by right-clicking on the Mis-tie set object on the tree and selecting Mis-
tie manager.
There are three different ways through which we can compute the mis-tie between 2D lines:
 Gain - Enable mis-tie corrections for gain (amplitude balancing).
 Vertical - Enable vertical mis-tie corrections. By Default this is enabled.
 Phase - Enable mis-tie phase corrections.
We can compute mis-tie values from the seismic data and calculate mis- tie corrections from the mis-tie
manager.
Figure :- 2D lines intersection before the application of mis-tie analysis
Figure :- Mis-Tie Analysis After Application of Gain (Amplitude
Balancing)
Figure :- Mis-Tie analysis after phase balancing
Similarly, we can apply the gain and phase methods of mis-tie analysis simultaneously and analyse the
results.
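The quantities being corrected can be illustrated with two traces taken at a line intersection; the sketch below shows the vertical and gain mis-ties only, not the weighted network adjustment that Petrel applies across the whole survey:

import numpy as np

def vertical_mistie(trace_a, trace_b, dt=0.004):
    """Vertical mis-tie (s) between two traces at a 2D line intersection,
    taken as the lag of the peak of their cross-correlation."""
    a = trace_a - trace_a.mean()
    b = trace_b - trace_b.mean()
    xcorr = np.correlate(a, b, mode="full")
    lag = np.argmax(xcorr) - (len(b) - 1)
    return lag * dt

def gain_mistie(trace_a, trace_b):
    """Amplitude (gain) mis-tie expressed as an RMS amplitude ratio."""
    return np.sqrt(np.mean(trace_a ** 2) / np.mean(trace_b ** 2))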
2. Bulk Shift for 3D:
In Petrel we have the facility to shift the whole volume of seismic data up or down the
time scale by some amount, which can be useful while matching a seismic volume cube
with some other seismic volume adjacent to it. This process can be performed by the
following method:
a) Go to the settings of the seismic volume to which we want to apply the bulk shift.
b) After choosing 'Geometry', select the vertical geometry to perform the task.
c) Give the desired shift value. To bring the seismic back to its original position,
give a shift value that is the negative of the one introduced earlier.
Figure :- Seismic volume before introducing bulk shift
Figure :- Seismic volume after the introduction of a bulk shift of 50.0 msec
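In sample terms, a bulk shift simply moves the whole volume along the time axis. A toy numpy equivalent is sketched below (Petrel itself only edits the vertical geometry origin; it does not resample):

import numpy as np

cube = np.random.rand(50, 60, 250).astype(np.float32)   # stand-in volume, 4 ms sampling assumed
shift_ms, dt_ms = 50.0, 4.0
n = int(round(shift_ms / dt_ms))                         # shift expressed in samples

# shift the volume down in time: pad at the top, drop samples at the bottom
shifted = np.concatenate([np.zeros_like(cube[..., :n]), cube[..., :-n]], axis=-1)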
(v)FAULT AND HORIZON INTERPRETATION
All interpretations in a Petrel project are stored in one or more interpretation folders. Any given
interpretation folder can contain both horizon and fault interpretations. Furthermore, interpretation
objects can be moved (drag and drop) from one folder to the other.
Fault Interpretation
Before marking the horizons we have to mark the faults, to capture the discontinuities in the horizons.
Additionally, while creating an autopicked horizon there is a provision to stop the horizons at
faults. We can interpret faults on inlines, crosslines and time slices as well as on random lines. We can
interpret several types of faults in 2D as well as in 3D data in Petrel (2010.2), which gives a
complete picture of the fault orientation in the whole volume. An advantage of Petrel is that
we can interpret faults in the 3D as well as the 2D window, which makes our interpretation
more accurate. The workflow of fault interpretation is given below :-
1:- Insert a fault within the interpretation folder.
2:- Select the fault just created, then select the Seismic interpretation option under the Geophysics
module in the Processes pane.
3:- Select the Interpret fault option in the Function bar.
4:- Start picking the faults in the section.
Figure:- fault interpretation in three different windows
Figure :- fault interpretation in 3D and 2D windows
(B)Horizon interpretation
Horizon marking has significant importance for delineating the geology and prospectivity
of an area. Horizons can be interpreted on any type of intersection: inline, crossline, random line,
composite line, curved vertical intersections, etc., and in a 3D or a traditional 2D interpretation
window. Several different methods are available in Petrel for interpreting seismic
horizons:
 Manual interpretation (drawing) :- In manual interpretation we can pick horizons
manually with the help of well tops or the synthetic seismogram imported or generated in the
earlier steps. Manual interpretation is a rather accurate way of interpreting the horizons by
closely viewing a set of inlines/crosslines at a certain interval, but it can be
cumbersome since it is very time consuming.
The most common way of interpreting seismic data is to use autotracking. The user selects one or
more seed points as a basis for the interpretation and Petrel searches outwards from these seed
points, tracking the seed point features to select adjacent points which match the tracking criteria.
Seed points can be prepared in advance or picked interactively from a seismic intersection (a
minimal sketch of the seed-tracking idea follows the list below). There are 5 types of autotracking
available in Petrel:
 Guided autotracking - We select two points and the tracking will find the best route
from one to the other. This gives you a high degree of control over how the interpretation
will develop.
 Seeded 2D autotracking:- This is a simpler way of tracking a horizon along a
section. Just by clicking on a point (seed), the entire horizon can be autotracked
using criteria such as amplitude matching, i.e. points will be tracked in the direction of
the selected line intersection.
Figure :- Seeded 2D horizon autotracking
 Seeded 3D autotracking :- Points will be tracked outwards from the seed points in
all directions. When the reflectors are of good quality this method can be a very efficient
way of interpreting through the seismic cube. If we hold down SHIFT before starting, the
seeded autotracking will display the tracking on-screen as it progresses. The figure is shown
below :-
 Paintbrush autotracking - Points will be tracked outwards from the seedpoints,
limited by the cursor box. Dragging the cursor while clicking the left mouse button will
'paint' interpretation following the cursor path. Works in both 2D and 3D windows.
Figure:- paint brush autotracking
 Active box autotracking - Points will be tracked outwards from the seed points,
limited by the extent of the box dragged over the area. Works only in a 2D window.
Figure :- active box autotracking
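A minimal sketch of the seed-tracking idea referred to above: starting from one pick, snap to the nearest amplitude peak on each neighbouring trace. Real autotracking adds amplitude-similarity criteria, quality thresholds and fault stops; the section array here is a stand-in.

import numpy as np

def track_horizon(section, seed_trace, seed_sample, search=5):
    """Follow an amplitude peak across a 2D section (traces x samples)
    outwards from a seed pick - a toy version of seeded autotracking."""
    n_traces, n_samples = section.shape
    picks = np.full(n_traces, -1, dtype=int)
    picks[seed_trace] = seed_sample
    # march right of the seed, then left, snapping to the local maximum near the previous pick
    for direction in (range(seed_trace + 1, n_traces), range(seed_trace - 1, -1, -1)):
        prev = seed_sample
        for i in direction:
            lo = max(prev - search, 0)
            hi = min(prev + search + 1, n_samples)
            prev = lo + int(np.argmax(section[i, lo:hi]))
            picks[i] = prev
    return picks

section = np.random.randn(200, 500)          # stand-in inline section
horizon = track_horizon(section, seed_trace=100, seed_sample=250)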
(vi) MAP GENERATION AND PLOTTING:
Having identified some horizons that are significant for understanding the geology and
prospectivity of an area, the next task is to map them across the survey.
In Petrel surface maps of the horizons can be generated from the process utilities. The Utilities
process folder stores all the general process tools that can be used in Petrel independent of a main
process workflow (like seismic interpretation or 3D grid modeling). At any stage in your main
process workflow, these tools can be used to create temporary data like trend surfaces, polygon
boundaries, etc.
Make/edit surfaces:
This process generates grid surfaces based on point data, line data, polygons, surfaces, bitmaps
and well tops and allows them to be edited interactively. A surface created with the Make Surface
process is a regular 2D grid.
It is important to remember that there is a distinct difference between a Surface and a Horizon in
Petrel. Data for a surface is held within a flat, regular 2D grid, whereas data for a horizon is held
in Petrel's 3D grid. Make/edit surface is therefore ideal for rapidly gridding all types of data, whereas
Make horizons is a more complex operation performed after pillar gridding to generate
geological layers in the 3D model.
Examples of the Use of Surfaces in Petrel:
1. Input to the Make Horizon process
2. Input to the Make zones process (isochores)
3. Input to the Make Contacts process (as a contact)
4. To assign values directly in property modeling (facies modeling or petrophysical
modeling)
5. As a 2D trend in data analysis and property modeling
6. As a probability map for facies objects
7. As a surface defining variation in the size and shape of facies objects
Once a surface has been constructed using the make surface operation, it can be updated, taking
account of any changes in input data, boundaries, trends, etc. by selecting regenerate from the right
mouse button options.
Process of making surfaces:
The process dialog for the Make/edit surface process contains an Execute section and six sub-
menus. Input and output settings for the 2D surfaces are entered in the Execute section.
Additional settings for the process such as geometry, method settings, pre/post-processing and
well adjustment are defined in the sub-menus. The process is given below:
1. As the main input we have to select the horizon from which we want
to make the map; we can also assign a name to the surface map we are generating.
2. After this we have to give the values of Xmin and Xmax; if we want to take the corresponding
values from the seismic horizon, choose the option 'Get limits from selected'.
3. Now go to the algorithm and choose a method for generating the map. The most common
method is 'convergent interpolation'.
4. Once the map has been generated, remove the inline and crossline and view the map
only. If only a single colour is shown, adjust the colour according to height by clicking 'Adjust color
table on selected' from the toolbar.
5. We can smooth the map by going to the settings of the surface.
6. By opening a 'New Map Window', we can see the created surface in a better way, along
with the locations of the wells which lie in that area on the surface.
Figure :-Surface map showing the well location in a map window
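Outside Petrel, the same idea (scattered horizon picks gridded onto a regular 2D surface) can be sketched with scipy; cubic interpolation below is only a stand-in for Petrel's convergent interpolation, and the picks are synthetic:

import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 5000.0, 400)                            # pick coordinates (m), synthetic
y = rng.uniform(0.0, 5000.0, 400)
t = 1200.0 + 0.05 * x - 0.03 * y + 20.0 * np.sin(x / 800.0)  # two-way time of the picks (ms)

xi = np.linspace(x.min(), x.max(), 101)                      # regular grid covering the picks
yi = np.linspace(y.min(), y.max(), 101)
XI, YI = np.meshgrid(xi, yi)

surface = griddata((x, y), t, (XI, YI), method="cubic")      # gridded time-structure surface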
(vii) SEISMIC ATTRIBUTE ANALYSIS:
A. SEISMIC ATTRIBUTES AND THEIR IMPORTANCE:
A seismic attribute is any quantity derived from seismic data using measured time,
amplitude, frequency, attenuation or any combination of these. It is intended to output a subset
of the data that quantifies rock and fluid properties and/or allows the recognition of
geological patterns and features. Almost all seismic attributes are post-stack, but there are a
few pre-stack ones. They can be measured along a single seismic trace or across
several seismic traces. The advantages of applying seismic attributes to the data set are given
below:
1. Signatures of subsurface features that were hidden in the seismic data set may
become prominent after the application of certain attributes. Seismic
attributes therefore enhance different features within the seismic data.
2. Detection of a discontinuity, such as a fault, is easier after the application of certain
attributes (e.g. variance, ant tracking etc.). Application of these attributes makes the
discontinuity more prominent and hence easier to pick.
3. Genetic inversion is a special attribute in Petrel with the help of which we can create
a whole volume of a particular property, which helps in interpretation.
B. SEISMIC ATTRIBUTE GENERATION IN PETREL:-
Attribute generation in Petrel is split into two separate processes, the Volume attributes and
Surface attributes processes. They are similar in the sense that they both contain a library of
different seismic attribute classes for display and use with the seismic interpretation workflow in
Petrel. Seismic attributes help to enhance information that might be subtle in conventional
seismic, leading to a better understanding and interpretation of the data.
a)-Volume Attributes:
Volume attributes are generated based on various properties of the analytic signal, producing
virtual or realized (physical) volumes from the input seismic. The Volume attributes process is
opened by double-clicking on it in the Processes pane. The dialog needs to be populated with input
data, attribute selection and parameters from scratch. If the process is initiated by right-clicking on
a seismic volume and selecting Volume attributes from the menu that appears, this volume is
automatically instantiated as the Input under the Input/Output tab.
Some volume attributes available in Petrel:
Some of the important volume attributes which are available in Petrel are described in brief
below:
 ENVELOPE:
The total instantaneous energy of the analytic signal (the complex trace), independent of phase.
Also known as 'Instantaneous Amplitude', 'Magnitude' or 'Reflection strength'. The envelope is
computed as E(t) = sqrt(f(t)^2 + g(t)^2), where f and g are the "real" and "imaginary" components
of the seismic trace: f is the real part, which is just the original seismic trace samples, and g is the
Hilbert transform (also called the quadrature trace) of the seismic trace. A window length parameter is
available (default: 33).
The envelope attribute is important for detecting bright spots caused by gas accumulations, and for
detecting major lithological changes that are caused by strong energy reflections and sequence
boundaries.
The attribute clearly shows subtle lithological changes that may not be apparent on the seismic
data.
Envelope can be used to help recognize phase differences between seismic versions. The peak
energy should align independent of the phase of the data.
Figure :- volume after the application of the 'Envelope' attribute
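With f taken from the trace and g from its Hilbert transform, the envelope is a two-line computation; a sketch on a stand-in trace:

import numpy as np
from scipy.signal import hilbert

trace = np.random.randn(501)          # stand-in for one seismic trace (f)
analytic = hilbert(trace)             # f + i*g, where g is the Hilbert transform of f
envelope = np.abs(analytic)           # sqrt(f**2 + g**2): instantaneous amplitude / reflection strength

For a whole volume, the same call can be applied along the time axis, e.g. hilbert(cube, axis=-1).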
 SWEETNESS:
Sweetness is the implementation of two combined attributes (Envelope and Instantaneous
Frequency) and is used for the identification of features where the overall energy signatures
change in the seismic data.
Sweetness is defined by the formula: Sweetness=Envelope/SQRT(Inst. Frequency)
Figure :-Seismic volume after the application of „Sweetness‟ attribute
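Continuing the envelope sketch, instantaneous frequency comes from the derivative of the instantaneous phase, and sweetness follows from the formula above; the sample interval and trace are again stand-ins:

import numpy as np
from scipy.signal import hilbert

dt = 0.004                                                       # sample interval (s), assumed
trace = np.random.randn(501)                                     # stand-in trace
analytic = hilbert(trace)
envelope = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))                            # instantaneous phase (radians)
inst_freq = np.abs(np.diff(phase)) / (2.0 * np.pi * dt)          # instantaneous frequency (Hz)
sweetness = envelope[1:] / np.sqrt(np.maximum(inst_freq, 1.0))   # guard against division by ~0 Hz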
 VARIANCE (Edge Detection):
The estimation of local variance in the signal. You can apply optional vertical smoothing
for noise reduction. It is useful for edge detection.
Figure:- Seismic volume before applying „variance‟ attribute
Figure :-Seismic Section after the application of „variance‟ attribute
Figure :-Fault tracking using „Variance‟ Attribute
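A simplified version of the variance attribute is the local variance of the amplitudes in a small spatial/vertical window; Petrel's implementation differs in detail, and the window size and volume below are arbitrary:

import numpy as np
from scipy.ndimage import uniform_filter

cube = np.random.randn(50, 60, 200).astype(np.float32)      # stand-in inline x crossline x time volume
win = (3, 3, 15)                                             # 3x3 traces, 15 samples vertically

local_mean = uniform_filter(cube, size=win)
variance = uniform_filter(cube ** 2, size=win) - local_mean ** 2   # high values follow discontinuities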
 SECOND DERIVATIVE:
The second time derivative of the input seismic volume. The combination of the original
amplitude, first derivative, and second derivative allows you to express the seismic interpretation
in relation to maxima, minima, greatest descents, and descent polarity. No
parameters need to be specified for this attribute.
Figure :- Second derivative of inline 433
The second derivative can be used to help guide the pick by providing continuity in areas where
reflections are poorly resolved on the raw amplitude. Lateral amplitude variations are visibly
diminished, which will make auto-tracking regional events more difficult.
b) Surface Attributes:
Surface attributes extract seismic properties from volumes based on analysis of the
signal shape. They are surface based, that is, the extraction is performed at a given level or between
two levels and output as a surface.
A surface attribute, also referred to as seismic attribute maps, is an extraction of data
from a seismic volume across a surface, within an interval or where interpretation intersects the
volume. To create a surface attribute, double-click on the Surface attributes process under
Geophysics.
The surface attribute will be created as an attribute associated with a horizon
interpretation or a surface. Any number of attributes can be assigned to the horizon
interpretation/surface as additional attributes.
Surface Attribute based on a single horizon:
1. Prepare a surface at which the seismic is to be sampled.
2. Double-click on the Surface attribute process to open the dialog.
3. Check Add to new surface. Type in an appropriate name.
4. Drop the seismic cube into the drop box.
5. Choose attribute to use (e.g. Maximum amplitude).
6. Use Single horizon as Window specification.
7. Drop in the surface as First horizon. Use 0 (zero) for search window and horizon offset.
8. Set the appropriate parameters for To event (e.g. Largest trough, below and 16 ms search
window).
9. Press OK
Figure :- 'Sweetness' applied as a surface attribute
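The "single horizon" extraction above amounts to reading, at each trace, the amplitudes in a window centred on the horizon time and reducing them to one value. A plain numpy sketch (maximum amplitude in a +/-16 ms window; all array shapes are illustrative):

import numpy as np

def max_amplitude_map(cube, horizon_twt, t0, dt, half_win_ms=16.0):
    """Maximum amplitude in a window centred on a horizon.
    cube: inline x crossline x sample volume; horizon_twt: inline x crossline grid of TWT (ms)."""
    half = int(round(half_win_ms / dt))
    k_map = np.round((horizon_twt - t0) / dt).astype(int)
    out = np.full(horizon_twt.shape, np.nan)
    n_il, n_xl, n_t = cube.shape
    for i in range(n_il):
        for j in range(n_xl):
            k = k_map[i, j]
            if k - half >= 0 and k + half < n_t:
                out[i, j] = cube[i, j, k - half:k + half + 1].max()
    return out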
C) ANT-TRACKING AND GENETIC INVERSION WORKFLOW:
a) ANT-TRACKING:
Ant tracking is used to extract faults from a pre-processed seismic volume. The pre-
processing could be variance or chaos combined with structural smoothing. Currently, only
realized volumes can be calculated.
Ant Tracking algorithm automatically extracts fault surfaces from fault attributes. The
algorithm uses the principles from ant colony systems to extract surfaces appearing like trends in
very noisy data. Intelligent software agents ("ants") will try to extract features in the attribute
corresponding to the expectations about the behavior of the faults. True fault information in the
attribute should fulfill these expectations and be extracted by many ants, whereas noise and
remains of reflectors should be extracted by no ants or by only single ants (in which case they will
be deleted). The approach is fully 3D and can take advantage of surface information in the
surrounding voxels. This makes it possible to derive detailed information from the attribute. By
writing the extracted surfaces back to a volume, we get what is referred to as an enhanced
attribute, or ant track cube. This cube contains only what is likely to be true fault information.
Ant Tracking workflow:
The process can be divided into four main activities: (1) seismic conditioning, (2) edge detection,
(3) edge enhancement, and (4) interactive interpretation (surface extraction). A collection of
surface segments, fault patches, can be extracted after the generation of the ant-track attribute.
This is a volume of fault surface "pieces" having a high confidence of connectedness, which can
be interactively merged into complete fault surfaces using the Automatic Fault Extraction process.
The above process is given stepwise below :-
1. First we apply the 'Structural smoothing' attribute to the realized seismic cube.
2. Apply the 'Variance' attribute to this structurally smoothed volume.
3. Now apply the 'Ant tracking' attribute to this volume.
Ant-Tracking Attribute Parameters:
Parameters for the Ant tracking are shown in the figure below:
Initial ant boundary (number of voxels):
The Initial ant boundary parameter defines the initial distribution of agents, being a territorial
radius around each ant, defined in number of voxels, see figure below. This parameter has a huge
impact on the execution time of the algorithm, as it defines the total number of agents that are
propagated through the volume.
Initial Ant Boundary
Ant-Track Deviation:
Ant track deviation controls the maximum allowed deviation from a local maximum while
tracking. The agents assume a planar shape, and can only deviate 15 degrees from the initial
orientation. The method allows the agent to accept local maximum one voxel on either side of the
predicted position as legal. If the maximum is outside this ant track step range, the track deviation
parameter comes into play.
Ant step size (number of voxels):
This parameter defines the amount of voxels an Ant agent advances for each increment within its
searching step. Increasing this value will allow an Ant agent to search further, but it will lower the
resolution of the result.
Parameter settings:
 Minimum: 2
 Maximum: 10
 Default: 3
Illegal steps allowed:
Illegal steps allowed defines how far (measured in steps) an agent's track can continue
without finding an acceptable edge value. (An acceptable edge value means that a local maximum
is found.) See figure below.
Legal steps required:
Legal steps required controls how "connected" a detected edge must be to help distinguish an
edge from un-oriented noise. This parameter is used in combination with the Illegal step allowed
parameter. It is expressed in terms of the number of steps that must contain a valid edge value for
the agent to continue. See figure below.
Stop criteria:
The Ant track Stop criteria parameter can terminate an agent when the number of Illegal steps
contained in the total path exceeds the defined percentage.
Figure :-Application of Ant-Tracking on the seismic volume
b) Genetic Inversion:
A new approach to deriving an acoustic impedance inversion volume is proposed in Petrel. Multi-
layer neural networks and a genetic algorithm are combined in order to provide a
robust and straightforward seismic inversion.
In the case of Genetic Inversion, the required inputs are limited to the seismic amplitude and the
acoustic impedance well logs used as training data. No unique wavelet or
initial property model is needed as input prior to running the inversion. A genetic algorithm
back-propagates the error in order to update the weights of the neural networks.
The advantage of this new method of generating a property estimate is that the genetic
algorithm constrains the convergence of the inversion in such a way that the chance of achieving a
global minimum error is much greater than in previous neural-network-based inversions.
Thus, success is almost guaranteed. In addition, another advantage of this process is that it is not
restricted to conventional acoustic/elastic impedance inversion, but can be extended to any
kind of petrophysical attribute/parameter which is linked in a meaningful and straightforward
way to the seismic amplitude or derived attribute data. To be more explicit, all the parameters
contained in the wave equation are possible candidates (e.g. velocity, density, porosity, bulk
modulus...).
Genetic Inversion Parameters:
The Genetic Inversion module is located within the Volume Attribute library under the
"Stratigraphic methods" class. As for the other volume attributes, the Input/output tab defines
what input volume is used and how the result is stored. The Parameters tab defines how the neural
network and genetic algorithm learns and handles the result for each iteration.
The latter tab is divided into three sub-titles (see figure given below):
1. Learning inputs
2. Settings
3. Advanced options
1.Learning inputs:
All the inputs for the Genetic inversion will be located in the Input pane (cf. Ctrl+T) of Petrel.
 Seismic cube: We have to drop in the 3D volume we want to use for the learning step, as
well as for the inversion itself. We can choose all types of 3D cubes as input (e.g. cropped
volume, seismic attributes, SEG-Y or ZGY format...). For performance reasons, it is
recommended to use bricked volumes (ZGY format).
 Well folder: Select the global well folder or any sub-folder, containing the wells which
will be used for the learning process.
 Global well log: Select one of the logs listed within the "Global well log" folder. It must
be continuous, and have some explicit (linear or not) relationship with the Seismic cube.
 QC well folder: Select the global well folder or any sub-folder containing the wells that will be used as "blind" wells. The relationship determined by the neural network during the learning step will be evaluated at these wells so that the computed property can be cross-validated against the observed one.
2.Settings:
 Vertical range: vertical extension of the seismic sub-volume. Set to 50 by default (depends on the resolution of the seismic).
 Inline half-range: horizontal half-extension of the seismic sub-volume in the inline direction. Set to 1 (i.e. one inline interval) by default (depends on the lateral continuity of the structures along the inline direction).
 Crossline half-range: horizontal half-extension of the seismic sub-volume in the crossline direction. Set to 1 (i.e. one crossline interval) by default (depends on the lateral continuity of the structures along the crossline direction).
 Resample parameter: defines the sample increment within the seismic around the well sample used to build the input vector of seismic amplitudes on which the learning process is computed. Set to 3 by default (depends on the sampling rate and the resolution of the seismic). The higher the Resample parameter, the higher the concentration of samples per volume unit (this parameter is driven by the frequency content of the seismic); see the sketch after this list.
 Top surface/marker: select a "regular surface" for the upper limit where the learning
process is computed. We can also use well-top markers.
 Bottom surface/marker: select a "regular surface" for the lower limit where the learning
process is computed.
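As an illustration of how these settings could be combined, the hedged Python sketch below builds, for one well sample, the vector of seismic amplitudes taken from the surrounding sub-volume defined by the vertical range and the inline/crossline half-ranges, with the Resample parameter treated as a vertical interpolation factor. This treatment, the array layout and the function name are assumptions based on the description above, not Petrel's internal behaviour.

import numpy as np

def extraction_window(seismic, il, xl, k, vertical_range=50,
                      inline_half_range=1, crossline_half_range=1, resample=3):
    # seismic: 3D array indexed as [inline, crossline, sample]; (il, xl, k) is the
    # trace/sample position of one well sample. Returns the flattened vector of
    # amplitudes used as one learning input. 'resample' densifies the vertical axis
    # by linear interpolation (higher value -> more samples per volume unit).
    half_v = vertical_range // 2
    sub = seismic[il - inline_half_range: il + inline_half_range + 1,
                  xl - crossline_half_range: xl + crossline_half_range + 1,
                  k - half_v: k + half_v + 1]
    n = sub.shape[2]
    fine = np.linspace(0, n - 1, (n - 1) * resample + 1)       # finer vertical sampling
    coarse = np.arange(n)
    dense = np.apply_along_axis(lambda tr: np.interp(fine, coarse, tr), 2, sub)
    return dense.ravel()

cube = np.random.rand(100, 100, 500)                 # stand-in for a seismic cube
vec = extraction_window(cube, il=50, xl=50, k=250)
print(vec.shape)                                     # length of one learning input vector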
Figure :-A porosity log in a 3D volume after genetic inversion
(viii) Velocity modeling
It is possible to build the structural model directly in time, based on the seismic data. We can then use our original seismic interpretation, prior to any depth conversion, to create a full 3D corner-point grid in time, which reduces the uncertainty of the geophysical work. After the initial velocity model has been created, the depth conversion process converts the corner-point grid on a node-by-node basis; the whole model is converted, including all the grid pillars and faults. This also makes it possible to analyze the uncertainty in the velocities by using different velocity setups, and, by reversing the process, a time grid can be built from a depth model. Velocity modeling is therefore mainly used for time-to-depth conversion of seismic data. Domain conversion allows us to take data from one domain, typically seismic data in time, and convert it to another, typically depth, in order to correlate it with well data and perform volume calculations. This can be done at any time in the workflow (before or during model building); exactly when will depend on the particular issues in the project. The uncertainties connected with interpolating velocities far from well control make domain conversion a critical step in the modeling process that should be investigated thoroughly. The workflow of converting data between domains within Petrel is split into two processes:
 Make velocity model: Defines how the velocity varies in space.
 Depth conversion: Uses the velocity model to move data between domains.
Domain conversion can be used to move data from time to depth or the reverse, but also to move data between two versions of the same domain. Examples would be stretching and squeezing overlapping seismic cubes so that they match (4D seismic), or matching a depth-migrated attribute cube to an updated model with more well control. All domain conversion in Petrel follows a layer-cake model, that is, data is shifted in the vertical direction only.
Within this process, we define the zones in space where the velocity can be described in a common manner, and then describe the velocity model to use in each zone, for example a set of surfaces with constant velocities between each pair. We can also apply corrections at this stage to match well data or surfaces and incorporate them into the velocity model. The velocity model requires four sets of input:
 A zone description. For example, a set of surfaces in two-way time.
 A definition of the velocity model for each zone. For example, V=Vint.
 Input parameters for the velocity model. For example, a surface of Vint.
 Correction data, if required. For example, well tops for the specified zones.
Defining zones:- Velocity zones can be defined using:
 Constant values: Quick to define for a first test.
 Surfaces: Easy to build, even in the early stages of a project, and covers most
requirements. Useful for defining layers above the 3D grid such as the seabed.
 Horizons (3D grid): Requires a 3D grid, but can describe complex geometries, such as
reverse faulting and Y-truncations.
A mixture of these can be used in the same model.
Available velocity models:-
Each zone in the velocity model must have a definition of the velocities within that zone. Velocity
models available in Petrel include:
 V=Vint: At each XY location the velocity is constant through the zone.
 V=Vo+kZ: At each XY location, the velocity changes in the vertical direction by a factor of k. Vo represents the velocity at datum, and Z the distance (in length units, not time) of the point from datum. Note that Vo is the velocity at Z=0, not at the top of the zone, and will therefore be much lower than the velocities seen in the layer, possibly even negative in extreme cases. Since time and depth decrease downwards in Petrel, a negative value of k results in velocities which increase with depth. Typical values for k are between 0 and -0.2. This velocity model is also referred to as Linvel. A worked time-to-depth conversion for this model is sketched after this list.
 V=Vo+k(Z-Zo): As above, however, here the values are measured relative to the top of the
zone. For example, Vo represents the velocity at the top of the zone and (Z-Zo) represents
the distance between the point and the top of the zone. Again, a negative value of k will
result in velocities which increase downwards. Typical values for k are between 0 and -
0.2. This velocity model is also referred to as Adlinvel.
 V=Vo+kT: This is the same as V=Vo+kZ except that it is used for conversion to the time domain.
 Avg. cubes: These should have an attribute representing the average velocity between the
point in the cube and the datum. Cubes of instantaneous velocity, or time/depth, can be
converted to average velocity in the attribute generation process.
 Avg. property: If the 3D grid is used to define the zone, and includes a property
representing average velocity (down to the center of the grid cell), then this can be used to
depth convert the interval. Such grids can be created by sampling data into the grid or
using Data analysis and Petrophysical modeling to extrapolate from well data. This
method is useful for complex structures, such as reverse faults.
 Stack cube: This uses the stacking velocities of the data.
 Same as above: This option can be used for thin zones where extrapolating data from within the zone alone might cause problems. It is particularly useful when interpolating Vo or k from the wells' time-depth relationships.
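As a worked example of the linear velocity models above (here V = V0 + kZ), integrating dZ/dt = V0 + kZ with respect to one-way time t gives Z(t) = (V0/k)(exp(kt) - 1). The short Python sketch below applies this relation with assumed values of V0 and k; for simplicity it treats depth, time and k as positive downwards, whereas Petrel stores Z and time as negative downwards (hence the negative typical k values quoted above).

import math

def depth_from_twt(twt_s, v0=1800.0, k=0.3):
    # Depth below datum (m) for a two-way time twt_s (s), assuming V = V0 + k*Z
    # with depth positive downwards. V0 and k are illustrative values only.
    t = twt_s / 2.0                            # one-way time
    return (v0 / k) * (math.exp(k * t) - 1.0)

for twt in (0.5, 1.0, 2.0):
    print(twt, "s TWT ->", round(depth_from_twt(twt), 1), "m")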
Defining velocity input:-
The methods available for defining the velocity input are dependent upon the velocity model that
has been used. When using V=Vint, V=Vo+kZ or V=Vo+k(Z-Zo), Vint or Vo and k can be entered
using a variety of methods:
 Constant: a constant value.
 Surface: a surface defining the value at each XY location. The surface must cover the
whole area of the velocity zone.
 Well TDR constant: The value will be estimated using the time-depth relationship (TDR) through the zone for each well, and a single constant value is used. Petrel employs a minimum depth error method to estimate this value. See Velocity modeling algorithms for a detailed description of the algorithm. The well TDR is defined on the Time tab of the well settings dialog, see Time tab (Well). (A simple illustration of this fitting idea is sketched after this list.)
 Well TDR surface: The value will be estimated using the time depth relationship (TDR)
through the zone for each well and interpolated to give a surface describing the variation
of the value across the model. Controls for the creation of this surface are found on the
Advanced tab. Petrel employs a minimum depth error method to estimate this value. See
Velocity modeling algorithms for a detailed description of the algorithm.
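One way to picture the minimum depth error idea mentioned for the Well TDR options above is to search for the V0 and k that make depths predicted from the zone's time-depth pairs best match the observed depths. The Python sketch below does this with a coarse grid search using the Z(t) relation from the previous sketch; the fitting strategy and the example checkshot values are assumptions for illustration, not the algorithm documented under Velocity modeling algorithms.

import numpy as np

def fit_v0_k(owt, depth, v0_range=(1500.0, 3000.0), k_range=(0.0, 1.0)):
    # Find V0 and k minimizing the summed squared depth error for
    # Z(t) = (V0/k)(exp(k*t) - 1), given one-way times (s) and depths (m).
    best = (None, None, np.inf)
    for v0 in np.linspace(*v0_range, 151):
        for k in np.linspace(*k_range, 101)[1:]:          # skip k = 0
            err = np.sum(((v0 / k) * (np.exp(k * owt) - 1.0) - depth) ** 2)
            if err < best[2]:
                best = (v0, k, err)
    return best[:2]

# Assumed time-depth pairs from a well checkshot (one-way time in s, depth in m):
owt = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
depth = np.array([380.0, 820.0, 1320.0, 1890.0, 2540.0])
print(fit_v0_k(owt, depth))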
Here we discuss the stacking velocity method for computing the velocity model:-
Stacking velocities can be loaded into Petrel as “Petrel points with attributes (ASCII) (*.*)”,
provided the files contain X, Y coordinate information. Additionally, a few common formats have
been added to the loader to support ASCII-Stacking Velocity files where there is only information
related to 2D and 3D surveys and the X, Y coordinates are not defined. The corresponding 2D or
3D surveys must be defined in Petrel before attempting to load these types of files. Petrel will try
to match the name of the survey and/or the names of the 2D lines. Formats supported are
“ESSOV2”, “STATOILH2” and “DISKOS”. Stacking velocities must first be converted using the Dix formula: import the velocities using the appropriate format, then create average velocities using the Dix conversion found under Points operations on the Operations tab. As a result, new attributes are created for the point data set, among them interval and average velocity attributes positioned at the midpoint between the existing values. These can then be used for various purposes in Petrel.
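For reference, the Dix formula used in this conversion relates the interval velocity between two reflectors to their stacking (RMS) velocities and zero-offset two-way times. The small Python sketch below applies it to an assumed stacking-velocity function at one location; it illustrates the arithmetic only and is not the Petrel Points operation.

# Dix formula: Vint^2 = (V2^2*t2 - V1^2*t1) / (t2 - t1), where V1, V2 are the
# stacking (RMS) velocities to the top and base of the interval and t1, t2 the
# corresponding two-way times.
def dix_interval_velocities(twt, vrms):
    vint = []
    for i in range(1, len(twt)):
        num = vrms[i] ** 2 * twt[i] - vrms[i - 1] ** 2 * twt[i - 1]
        vint.append((num / (twt[i] - twt[i - 1])) ** 0.5)
    return vint

# Assumed stacking-velocity function at one CMP location:
twt = [0.5, 1.0, 1.5, 2.0]                    # two-way times (s)
vrms = [1800.0, 2000.0, 2200.0, 2400.0]       # stacking velocities (m/s)
print(dix_interval_velocities(twt, vrms))     # interval velocities (m/s)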
In the Operations Tab, it is possible to extract surface-consistent interval velocity points between a
top and a bottom layer. These scatter values can then be edited, visualized and mapped to derive
Interval velocity maps. The Interval velocity maps can be assigned in the Make velocity model
process to create a velocity model. Alternatively, these points can be sampled into a 3D grid (see
Creating a grid for velocity modeling) and further the velocity interpolated using data analysis and
petrophysical modeling processes. Once velocities have been created in the 3D grid, they can be
used directly as input to the velocity modeling process or average maps created from the velocity
properties settings and used as input. Z vs. t or Z vs. V plots can be made directly from the
stacking velocities in a function window. A function can then be created to pass through these
points and describe the relationship. If a straight line is used, then V0 and k will be reported on the
function's settings dialog on the Function tab.
How to extract interval velocity points from stacking velocities for depth conversion:-
1. Import the stacking velocities as points with attributes or any of the predefined formats.
2. Use the Dix conversion under seismic operations on the Operations tab of the points
settings dialog to create a set of average velocity points.
3. Open the settings for the point data set and go to the Operations tab.
4. Expand the Velocity conversion folder and select Interval velocity calculation
(surface,surface).
5. Make sure the average velocity attribute is used and enter the bordering surfaces as top and base elevation. Run the operation (the arithmetic involved is sketched after these steps).
6. The new datapoint set can be used to grid a surface using the Make/edit surface process.
Make sure to use the Interval velocity attribute for gridding.
7. The resulting surface can be used in the Make velocity model process.
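The interval-velocity calculation between two surfaces used in steps 4 and 5 amounts to a simple relation between the average velocities and times at the top and base surfaces, sketched below in Python with assumed values. This is an illustration of the arithmetic only, not the Petrel operation itself.

def interval_velocity(v_avg_top, t_top, v_avg_base, t_base):
    # Interval velocity between two surfaces from the average velocities
    # (datum to surface) and the corresponding two-way times.
    return (v_avg_base * t_base - v_avg_top * t_top) / (t_base - t_top)

# Assumed example values at one X, Y location:
print(interval_velocity(v_avg_top=2000.0, t_top=1.2,
                        v_avg_base=2200.0, t_base=1.8))   # ~2600 m/s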
Figure :-stacking velocity cube
Figure :- interval velocity surfaces between the surfaces
Figure:- the velocity model computed from stacking velocity
Once a velocity model has been created, it can be used to depth convert objects.
Figure :- time surfaces and their corresponding depth-converted surfaces
(ix) ADVANCE SEISMIC INTERPRETATION
Geobody Interpretation:-
A geobody is a 3D object extracted from a seismic volume. A geobody can contain several sub
elements (Geoblobs). We can build up our 3D object representation by repeatedly extracting many
sub elements. The geobody can be used as input to a property model, surfaces or as a voxel mask
for further seismic visualization.
Petrel Geobody interpretation (PGI) comes as a package installed on top of Petrel. It uses state-of-the-art technology to identify, isolate, extract and make discrete an object seen in seismic data.
The extracted object is called a geobody. The geobody can then be used as input to property
modeling, surface generation, or further seismic visualization tasks.
Geobody extraction:-
Normally a geobody (e.g. a buried river channel) is not seen very clearly in the seismic section; therefore several types of attributes are applied to the seismic to visualize the zone of interest clearly. After the zone of interest has been identified, the geobody is extracted from the seismic volume.
Figure :- time slice of seismic data
Note from the figure above that the signature of the buried river channel is not very clear. If we apply some attributes to it, the signature of the buried river channel becomes clear, which makes the extraction of the geobody much easier. The application of some attributes is shown in the figures below:-
Figure:- application of the envelope attribute
Figure:- blending of two attributes (sweetness and envelope attributes)
A geobody is extracted based on its opacity threshold value. By default, the threshold value is set
to 20%. If a cell has an opacity threshold of less than 20%, it will not be included in the geobody.
This value can be edited by double-clicking on the Geobody interpretation process and changing
the Voxel connectivity opacity threshold value. This threshold is also visible as a dashed green
line in the opacity settings histogram. The steps of geobody extraction are given below:-
1:- After visualizing the zone of interest clearly using several types of attributes, we insert a box probe. The box probe is a rectangular volume that can be resized and tilted.
Figure:- the box probe
2:- Then, using the opacity option in the settings of the box probe, we visualize only the zone of interest and make the remaining volume transparent.
3:- Now, after converting the box probe into a seismic cube, we choose the clipping polygon option from the function bar and extract the geobody (a schematic sketch of this thresholded extraction follows the figure below).
Figure :- Extracted Geobody
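The role of the Voxel connectivity opacity threshold described above can be sketched as a simple connected-component extraction: voxels whose opacity exceeds the threshold are kept, and the geobody is grown from a seed voxel through face-connected neighbours. The Python below is a schematic illustration with assumed inputs; it is not Petrel's geobody engine.

from collections import deque
import numpy as np

def extract_geobody(opacity, seed, threshold=0.2):
    # opacity: 3D array of voxel opacities in [0, 1]; seed: (i, j, k) start voxel.
    # Returns a boolean mask of the face-connected geobody above the threshold.
    mask = np.zeros(opacity.shape, dtype=bool)
    if opacity[seed] < threshold:
        return mask
    queue = deque([seed])
    mask[seed] = True
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        i, j, k = queue.popleft()
        for di, dj, dk in offsets:
            n = (i + di, j + dj, k + dk)
            if all(0 <= n[a] < opacity.shape[a] for a in range(3)) \
                    and not mask[n] and opacity[n] >= threshold:
                mask[n] = True
                queue.append(n)
    return mask

# Assumed example: random opacity cube, seed placed inside the channel anomaly.
body = extract_geobody(np.random.rand(50, 50, 50), seed=(25, 25, 25))
print(body.sum(), "voxels in the geobody")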
Horizon/Surface probe
The horizon probe is an irregular probe that follows one or two horizon interpretations/surfaces. It
is also referred to as "sculpting" in the industry. After interpreting a horizon, the corresponding horizon can be converted into a horizon/surface probe.
Figure:- horizon whose corresponding horizon probe is shown below
Figure:- Horizon Probe
Well probe
The well probe is a cylindrical volume following a well path that can be resized and prolonged.
Creating a well probe along a proposed well path can be useful for several purposes. The well probe below uses an ant-tracking volume to check whether there are any fault or fracture zones along the proposed well path. The steps for creating a well probe are given below:-
1:- First, we choose the well along which the well probe is to be created.
Figure:- selecting wells for making well probe
2:- Now, by inserting the well probe and changing its opacity, we can examine the behavior of the medium surrounding the well, i.e. whether there are potential zones or fault planes along the well path.
Figure:- well probe
Figure:- well probe before applying the opacity
Figure :- well probe after applying the opacity
5:- RESULTS AND DISCUSSION:
 Petrel allows us to analyze and visualize different types of data set (e.g. well data, check-shot data, seismic data, well tops (stratigraphic markers) and velocity data) in a single interface.
 The seismic well-tie process helps us correlate the stratigraphic markers with the seismic section.
 After synthetic correlation, several regional stratigraphic markers (Caracas, Kobe, Paris, etc.) have been interpreted.
 Major structures (mainly faults and a salt dome) have also been interpreted to define the structural framework of the area.
 Most of the faults are oriented in the east-west direction.
 While interpreting the horizons, a sharp boundary (a salt dome) was identified; the major reflectors terminate against this salt dome, which is visible from the amplitude contrast.
 During the interpretation, we have generated several structural and stratigraphic attributes to characterize the reservoir properties.
 The Variance and Ant-tracking attributes help us detect the structural framework of the area, mainly faults and other discontinuities.
 Similarly, stratigraphic attributes such as sweetness, relative acoustic impedance and genetic inversion help us identify some key prospective zones.
 The same features can be confirmed by generating surface attributes along the surface of interest.
 With the available velocity data (SEG-Y velocity cube, stacking velocities, check-shots, etc.), all the time interpretations have been converted to depth, and structural maps of the time and depth surfaces have been prepared.
 The signature of the river channel in the time slice has been enhanced and visualized in a 3D view by applying several volume attributes and generating the geobody of the river channel.
References:-
1:- Bacon, M., Simm, R. and Redshaw, T., 2003, 3-D Seismic Interpretation, Cambridge University Press.
2:- Coffeen, J. A., 1984, Interpreting Seismic Data, PennWell, Tulsa.
3:- Brouwer, J. and Helbig, K., 1998, Shallow High-Resolution Reflection Seismics, Handbook of Geophysical Exploration, Volume 19, Elsevier Science, Amsterdam.
4:- Dobrin, M. B. and Savit, C. H., 1988, Introduction to Geophysical Prospecting, McGraw-Hill Inc., New York, USA, pp. 25-430.
5:- Robinson, E. S., 1996, Peacock and Treitel, 1969, Basic Exploration Geophysics.
6:- Sheriff, R. E. and Geldart, L. P., 1982, Exploration Seismology, Cambridge University Press.
7:- Ricker, N., 1953, Wavelet contraction, wavelet expansion, and the control of seismic resolution: Geophysics, 18, 769-792.
Mais conteúdo relacionado

Mais procurados

Introduction to seismic interpretation
Introduction to seismic interpretationIntroduction to seismic interpretation
Introduction to seismic interpretationAmir I. Abdelaziz
 
Multicomponent Seismic Data API
Multicomponent Seismic Data APIMulticomponent Seismic Data API
Multicomponent Seismic Data APIBablu Nonia
 
Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]
Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]
Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]muhammad ali
 
Seismic data processing
Seismic data processingSeismic data processing
Seismic data processingShah Naseer
 
Seismic Attributes
Seismic AttributesSeismic Attributes
Seismic AttributesDalia Hassan
 
Geophysical Methods of Hydrocarbon Exploration
Geophysical Methods of Hydrocarbon ExplorationGeophysical Methods of Hydrocarbon Exploration
Geophysical Methods of Hydrocarbon ExplorationM.T.H Group
 
Evaluation of low resistivity zones
Evaluation of low resistivity zonesEvaluation of low resistivity zones
Evaluation of low resistivity zonesShahnawaz Mustafa
 
Reservoir Geophysics
Reservoir GeophysicsReservoir Geophysics
Reservoir GeophysicsJacob13012
 
2 D 3D_ seismic survey
2 D 3D_ seismic survey2 D 3D_ seismic survey
2 D 3D_ seismic surveyShah Naseer
 
Role of Seismic Attributes in Petroleum Exploration_30May22.pptx
Role of Seismic Attributes in Petroleum Exploration_30May22.pptxRole of Seismic Attributes in Petroleum Exploration_30May22.pptx
Role of Seismic Attributes in Petroleum Exploration_30May22.pptxNagaLakshmiVasa
 

Mais procurados (20)

3D Facies Modeling
3D Facies Modeling3D Facies Modeling
3D Facies Modeling
 
Introduction to seismic interpretation
Introduction to seismic interpretationIntroduction to seismic interpretation
Introduction to seismic interpretation
 
Multicomponent Seismic Data API
Multicomponent Seismic Data APIMulticomponent Seismic Data API
Multicomponent Seismic Data API
 
Seismic data processing
Seismic data processingSeismic data processing
Seismic data processing
 
Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]
Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]
Petrophysical analysis of reservoir rock of kadanwari gas [autosaved]
 
Reservoir mapping
Reservoir mappingReservoir mapping
Reservoir mapping
 
Reservoir Characterization
Reservoir CharacterizationReservoir Characterization
Reservoir Characterization
 
Seismic data processing
Seismic data processingSeismic data processing
Seismic data processing
 
Seismic Attributes
Seismic AttributesSeismic Attributes
Seismic Attributes
 
Geophysical Methods of Hydrocarbon Exploration
Geophysical Methods of Hydrocarbon ExplorationGeophysical Methods of Hydrocarbon Exploration
Geophysical Methods of Hydrocarbon Exploration
 
Petrophysic cont
Petrophysic contPetrophysic cont
Petrophysic cont
 
Image logs
Image logsImage logs
Image logs
 
Rock Physics: Definitions
Rock Physics: DefinitionsRock Physics: Definitions
Rock Physics: Definitions
 
Evaluation of low resistivity zones
Evaluation of low resistivity zonesEvaluation of low resistivity zones
Evaluation of low resistivity zones
 
Reservoir Geophysics
Reservoir GeophysicsReservoir Geophysics
Reservoir Geophysics
 
2 D 3D_ seismic survey
2 D 3D_ seismic survey2 D 3D_ seismic survey
2 D 3D_ seismic survey
 
Neutron log
Neutron logNeutron log
Neutron log
 
Reservoir modeling and characterization
Reservoir modeling and characterizationReservoir modeling and characterization
Reservoir modeling and characterization
 
Well logging
Well loggingWell logging
Well logging
 
Role of Seismic Attributes in Petroleum Exploration_30May22.pptx
Role of Seismic Attributes in Petroleum Exploration_30May22.pptxRole of Seismic Attributes in Petroleum Exploration_30May22.pptx
Role of Seismic Attributes in Petroleum Exploration_30May22.pptx
 

Destaque

Reservoir Modeling with Petrel
Reservoir Modeling with PetrelReservoir Modeling with Petrel
Reservoir Modeling with Petreladeeba setu
 
ICWES15 - Undergraduate Research Initiative at a Community College. Presented...
ICWES15 - Undergraduate Research Initiative at a Community College. Presented...ICWES15 - Undergraduate Research Initiative at a Community College. Presented...
ICWES15 - Undergraduate Research Initiative at a Community College. Presented...Engineers Australia
 
Blc2010 Undergrad Research Heitz
Blc2010 Undergrad Research HeitzBlc2010 Undergrad Research Heitz
Blc2010 Undergrad Research HeitzRobin Heyden
 
Petroleum Seismology Notes
Petroleum Seismology NotesPetroleum Seismology Notes
Petroleum Seismology NotesAli Osman Öncel
 
Sustainable soil management
Sustainable soil managementSustainable soil management
Sustainable soil managementAndre Benedito
 
Group 4 project students
Group 4 project studentsGroup 4 project students
Group 4 project studentsChris Hamper
 
Mozambique Coal 2016_Geology
Mozambique Coal 2016_GeologyMozambique Coal 2016_Geology
Mozambique Coal 2016_GeologyAndy Lloyd
 
A Seismic Oil and Gas Primer
A Seismic Oil and Gas PrimerA Seismic Oil and Gas Primer
A Seismic Oil and Gas PrimerKevin Hill
 
Exploration and analysis of oil and gas field ( 3D seismic survey)
Exploration and analysis of oil and gas field ( 3D seismic survey)Exploration and analysis of oil and gas field ( 3D seismic survey)
Exploration and analysis of oil and gas field ( 3D seismic survey)Apurva Mittal
 
Soil management
Soil managementSoil management
Soil managementAnnKoenig
 
Recruitment and selection(dimension group)
Recruitment and selection(dimension group)Recruitment and selection(dimension group)
Recruitment and selection(dimension group)Praveen Bhardwaj
 
Reservoir modeling work flow chart
Reservoir modeling work flow chartReservoir modeling work flow chart
Reservoir modeling work flow chartDr. Arzu Javadova
 
AEC Excellence Awards 2016 Finalists
AEC Excellence Awards 2016 FinalistsAEC Excellence Awards 2016 Finalists
AEC Excellence Awards 2016 FinalistsAutodesk AEC
 

Destaque (20)

Reservoir Modeling with Petrel
Reservoir Modeling with PetrelReservoir Modeling with Petrel
Reservoir Modeling with Petrel
 
ICWES15 - Undergraduate Research Initiative at a Community College. Presented...
ICWES15 - Undergraduate Research Initiative at a Community College. Presented...ICWES15 - Undergraduate Research Initiative at a Community College. Presented...
ICWES15 - Undergraduate Research Initiative at a Community College. Presented...
 
DEGAN_CV_4p1000
DEGAN_CV_4p1000DEGAN_CV_4p1000
DEGAN_CV_4p1000
 
Blc2010 Undergrad Research Heitz
Blc2010 Undergrad Research HeitzBlc2010 Undergrad Research Heitz
Blc2010 Undergrad Research Heitz
 
Paradigm online university_dec-2015
Paradigm online university_dec-2015Paradigm online university_dec-2015
Paradigm online university_dec-2015
 
Petroleum Seismology Notes
Petroleum Seismology NotesPetroleum Seismology Notes
Petroleum Seismology Notes
 
Sustainable soil management
Sustainable soil managementSustainable soil management
Sustainable soil management
 
Group 4 project students
Group 4 project studentsGroup 4 project students
Group 4 project students
 
Mozambique Coal 2016_Geology
Mozambique Coal 2016_GeologyMozambique Coal 2016_Geology
Mozambique Coal 2016_Geology
 
Soil management
Soil managementSoil management
Soil management
 
Recruitment Project
Recruitment ProjectRecruitment Project
Recruitment Project
 
A Seismic Oil and Gas Primer
A Seismic Oil and Gas PrimerA Seismic Oil and Gas Primer
A Seismic Oil and Gas Primer
 
Group 4 project
Group 4 projectGroup 4 project
Group 4 project
 
Seismic Surveys in Oil and Gas Exploration
Seismic Surveys in Oil and Gas ExplorationSeismic Surveys in Oil and Gas Exploration
Seismic Surveys in Oil and Gas Exploration
 
Exploration and analysis of oil and gas field ( 3D seismic survey)
Exploration and analysis of oil and gas field ( 3D seismic survey)Exploration and analysis of oil and gas field ( 3D seismic survey)
Exploration and analysis of oil and gas field ( 3D seismic survey)
 
Soil management
Soil managementSoil management
Soil management
 
Recruitment and selection(dimension group)
Recruitment and selection(dimension group)Recruitment and selection(dimension group)
Recruitment and selection(dimension group)
 
Reservoir modeling work flow chart
Reservoir modeling work flow chartReservoir modeling work flow chart
Reservoir modeling work flow chart
 
AEC Excellence Awards 2016 Finalists
AEC Excellence Awards 2016 FinalistsAEC Excellence Awards 2016 Finalists
AEC Excellence Awards 2016 Finalists
 
Well logging
Well loggingWell logging
Well logging
 

Semelhante a Summer training report_Samiur Rahman Khan_Geologist

FINAL COMPENDIUM.compressed (2)
FINAL COMPENDIUM.compressed (2)FINAL COMPENDIUM.compressed (2)
FINAL COMPENDIUM.compressed (2)Narasimha Sharma
 
Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892
Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892
Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892MarkGirouard3
 
SyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir Study
SyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir StudySyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir Study
SyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir StudySyQwest Inc.
 
TECHNICAL APPENDIX TO THE RESUME
TECHNICAL APPENDIX TO THE RESUMETECHNICAL APPENDIX TO THE RESUME
TECHNICAL APPENDIX TO THE RESUMERouma Dupeuple
 
Anju Mary Raju (cv) - (2)
Anju Mary Raju (cv) - (2)Anju Mary Raju (cv) - (2)
Anju Mary Raju (cv) - (2)Anju Mary Raju
 
Resume Anuj Dixit
Resume Anuj DixitResume Anuj Dixit
Resume Anuj DixitAnuj Dixit
 
Andrew_Sitek_RESUME_FINAL
Andrew_Sitek_RESUME_FINALAndrew_Sitek_RESUME_FINAL
Andrew_Sitek_RESUME_FINALAndrew Sitek
 
Saurabh Gupta Ctran Consulting Pvt. Ltd
Saurabh Gupta Ctran Consulting Pvt. LtdSaurabh Gupta Ctran Consulting Pvt. Ltd
Saurabh Gupta Ctran Consulting Pvt. LtdIIT Kharagpur
 
weather radar Report
weather radar Reportweather radar Report
weather radar ReportPawankumarm65
 
Final reprt at ntpc vindhyanagar , singrauli
Final  reprt at ntpc vindhyanagar , singrauliFinal  reprt at ntpc vindhyanagar , singrauli
Final reprt at ntpc vindhyanagar , singrauliDevanshu Yadav
 

Semelhante a Summer training report_Samiur Rahman Khan_Geologist (20)

DGH Report
DGH ReportDGH Report
DGH Report
 
FINAL COMPENDIUM.compressed (2)
FINAL COMPENDIUM.compressed (2)FINAL COMPENDIUM.compressed (2)
FINAL COMPENDIUM.compressed (2)
 
CV of JC Saha for Power Plant 290516
CV of JC Saha for Power Plant 290516CV of JC Saha for Power Plant 290516
CV of JC Saha for Power Plant 290516
 
Souvenir
SouvenirSouvenir
Souvenir
 
soil manual
soil manualsoil manual
soil manual
 
DR_R_MANIVANAN_CV
DR_R_MANIVANAN_CVDR_R_MANIVANAN_CV
DR_R_MANIVANAN_CV
 
KMac_CV_November 2016
KMac_CV_November 2016KMac_CV_November 2016
KMac_CV_November 2016
 
legtc_resume
legtc_resumelegtc_resume
legtc_resume
 
Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892
Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892
Syqwestsinglebeamechosounderstudy 150528174118-lva1-app6892
 
SyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir Study
SyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir StudySyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir Study
SyQwest Bathy-2010 Sub Bottom Profiler used in Tarbela Reservoir Study
 
TECHNICAL APPENDIX TO THE RESUME
TECHNICAL APPENDIX TO THE RESUMETECHNICAL APPENDIX TO THE RESUME
TECHNICAL APPENDIX TO THE RESUME
 
Anju Mary Raju (cv) - (2)
Anju Mary Raju (cv) - (2)Anju Mary Raju (cv) - (2)
Anju Mary Raju (cv) - (2)
 
Resume Anuj Dixit
Resume Anuj DixitResume Anuj Dixit
Resume Anuj Dixit
 
Mine closure planning ppt_Atun
Mine closure planning ppt_AtunMine closure planning ppt_Atun
Mine closure planning ppt_Atun
 
Andrew_Sitek_RESUME_FINAL
Andrew_Sitek_RESUME_FINALAndrew_Sitek_RESUME_FINAL
Andrew_Sitek_RESUME_FINAL
 
Saurabh Gupta Ctran Consulting Pvt. Ltd
Saurabh Gupta Ctran Consulting Pvt. LtdSaurabh Gupta Ctran Consulting Pvt. Ltd
Saurabh Gupta Ctran Consulting Pvt. Ltd
 
weather radar Report
weather radar Reportweather radar Report
weather radar Report
 
saurabh resume
saurabh resumesaurabh resume
saurabh resume
 
Erakv news2
Erakv news2Erakv news2
Erakv news2
 
Final reprt at ntpc vindhyanagar , singrauli
Final  reprt at ntpc vindhyanagar , singrauliFinal  reprt at ntpc vindhyanagar , singrauli
Final reprt at ntpc vindhyanagar , singrauli
 

Summer training report_Samiur Rahman Khan_Geologist

  • 1. OIL AND NATURAL GAS LIMITED Project Report on Interpretation of Geological and Geophysical data In Schlumberger’s Petrel Software Submitted TO:- Submitted BY:- Mr. S.K.DAS Sumit Kumar Bhattacharya Executive Director-HOI M.Sc.(tech.) Geophysics, Final Year GEOPIC Banaras Hindu University,Varanasi ONGC , Dehradun Umang Sundriyal M.Sc.-tech. (Applied Geophysics),Final Year, Indian School of Mines, Dhanbad. Samiur Rahman Khan M.Sc. Petroleum Geosciences, Final Year, Banaras Hindu University,Varanasi. Mohammad Sarim M.Sc. Petroleum Geosciences, Final Year, Banaras Hindu University,Varanasi.
  • 2. CERTIFICATE This is to certify that the summer training report entitled “Interpretation of Geological and Geophysical data in Schlumberger‟s Petrel Software” submitted to GEOPIC , ONGC, Dehradun is a record of bonafied work carried out by Mr. Sumit Kumar Bhattacharya, final year student of M.sc.(tech.) Geophysics, Department of Geophysics, Banaras Hindu University, Varanasi, Mr. Umang Sundriyal, final year student of M.sc.-tech. (Applied Geophysics), Indian School of Mines, Dhanbad, Mr. Samiur Rahman Khan, final year student of M.sc.Petroleum Geosciences, Department of Geology, Banaras Hindu University and Mr. Mohmmad Sarim, final year student of M.sc.Petroleum Geosciences, Department of Geology, Banaras Hindu University, Varanasi, U.P, under my supervision. Supervisor :- Mr. S.K.DAS Executive Director HOI, GEOPIC ONGC , Dehradun
  • 3. ACKNOWLEDGEMENT We are thankful to ONGC for providing us with the opportunity for our Industrial Training in the academic year 2010-2011. We are sincerely thankful to Mr. S.K.DAS (Executive Director-HOI-GEOPIC, ONGC, DEHRADUN) for providing us with administrative supervision and sustained guidance, inspiration and careful supervision during the project and allowing us to complete the Project work in the ONGC Innovation Centre at GEOPIC Extension We are deeply indebted to Schlumberger for technical assistance, training and guidance during the course of the project work. We express our thanks to Mr. Vijay Raj (Geophysicist, Schlumberger), Mr. Mahabir Prasad Mahapatra (Petrel Deployment Manager, Schlumberger) & Mr. Jagannath Mukherjee (Information Management Geoscientist) and Mr. Pushpal Banerjee (Technical Sales) for the healthy cooperation. Sumit Kumar Bhattacharya M.Sc.(tech) Geophysics, Department of Geophysics Banaras Hindu University, Varanasi,U.P. Umang Sundriyal M.Sc.-tech. Applied Geophysics Indian School of Mines, Dhanbad. Samiur Rahman Khan M.sc.Petroleum Geosciences, Department of Geology, Banaras Hindu University, Varanasi,U.P. Mohmmad Sarim, M.sc.Petroleum Geosciences, Department of Geology, Banaras Hindu University, Varanasi, U.P
  • 4. Contents:- 1. Introduction 2. About Petrel 3. Petrel Geophysics 4. Seismic Data Interpretation in Petrel A. Data Description B. Process of Data Loading I. Seismic Data visualization in Petrel II. Synthetic Generation and Seismic Well Tie Processes III. Mis-tie Analysis for 2D and Bulk Shift for 3D IV. Fault & Horizon Interpretation in Petrel : A. Fault Interpretation B. Horizon Interpretation V. Map Generation and Plotting VI. Seismic Attribute Analysis: A. Seismic Attributes and its Importance B. Seismic Attributes Generation: a. Volume Attributes b. Surface Attributes C. Ant Tracking and Genetic Inversion Workflow VII. Velocity Modelling: VIII. Advance Seismic Interpretation: A. Geobody Interpretation B. Horizon/Surface& Well Probe 5. Results and Discussion 6. References
  • 5. INTRODUCTION About ONGC:- Today, ONGC is the flagship company of India; and making this possible is a dedicated team of nearly 33,000 professionals who toil round the clock. It is this toil which amply reflects in the aspirations and performance figures of ONGC. The company has adopted progressive policies in scientific planning, acquisition, utilization, training and motivation of the team. At ONGC, everybody matters, every soul counts. It was incorporated as Oil And Natural Gas Corporation limited under the companies Act, 1956 on 23rd June 1993 after conversion from Oil and Natural Gas Commission, a statutory body established on August 14th , 1956 under ONGC Act 1956. ONGC has scripted India‟s hydrocarbon saga by discovering 6 of the presently 7 producing Basins of India , in last 50 years, discovering over 6.5 billions tonnes of In-place Oil & Gas. It has recovered reserves of over 1 billion tonnes of Oil and Gas and produces more than 1.1 million Barrels of oil and Oil Equivalent Gas (BOE) per day, meeting around 80% of India‟s domestic production of Oil & Gas. All the installation of ONGC are certified for Quality, Health, Safety and Environment Management(QHSE) standards. ONGC is the first Indian company to adopt „Integrity Pact‟ to institutionalize transparency in its procurement and public transactions. The development activity of ONGC has been planned on sound ecological principle and incorporates appropriate environmental safeguards. It is the most valuable public sector company of highest profit making and highest dividend pairing in India. GLOBAL RANKING:  No.1 E & P company in the world among leading global energy major as per plats 2010.  No. 24 in PFC ENERGY 50. Ranked 24th among the global publicly – listed energy company as per “PFC ENERGY 50” list ( January, 2010).  ONGC ranked at 155th position in FORBES global 2000 list for 2010. NATIONAL RANKING:  ONGC ranked at second position in FE500 list 2010.  Business Today Ranks ONGC as Best Company to work for in Core Sector. VISION:- To be a global leader in integrated energy business through sustainable growth, knowledge excellence and exemplary governance practice.
  • 6. Geodata Processing and Interpretation Centre (GEOPIC):- Geodata Processing and Interpretation Centre was established in 1987 at ONGC, Dehradun with a specific mandate of providing solutions to complex E&P problems by integrating seismics with other geoscientific data. The centre has world class state-of-the-art infra-structure backed by specialists in the field of seismic data processing, geoscientific data interpretation and supported by software programmers and hardware engineers. Land and marine seismic data of ONGC of Indian & Foreign basins is processed and interpreted synergistically at this centre. Over the years, GEOPIC has processed more than 18, 00,000 LKM of seismic data including about 12,00,000 LKM of 3D data and 1,10,000 LKM in depth domain and interpreted about 220 numbers of 3D prospects including deep waters. Since inception, GEOPIC transformed itself into a Centre for Excellence and today is one of the premier institutes of ONGC. GEOPIC carries a charter of Value Creation/Additions through specialized skills, expertise and superior Technology and has the role: (1) Providing Time and Depth Domain Solutions to Subsurface Geology by Seismic Data Processing, (2) Strati-Structural Earth Modelling and Hydrocarbon Play definition in Domestic and Foreign Basins. In line with its charter & role, GEOPIC has formulated a long term strategic planning by inducting new technologies, developing expertise & disseminating the technologies to other work centers. The state-of-the-art technologies inducted & absorbed over the years such as depth domain processing, stratigraphic inversion, advanced volume based interpretation tools, stochastic lithofacies modelling using neural network, spectral decomposition, geostatistics, modelling, etc. has given GEOPIC, a capability to provide solutions to complex E&P problems using large 3D data volumes in regular processing & interpretation cycles. CORE STRENGTH: • Seismic Imaging • 3D Visualization • Integrated Interpretation of Geoscientific data for E&P Solutions TECHNOLOGICAL EDGE: • Large volume 3D pre-stack time migration • 3D pre-stack depth migration • Anisotropic pre-stack depth migration • AVO analysis • Pre/post stack multi-volume integration • VSP data processing • Stratigraphic inversion modelling • Volume based interpretation • Geobody detection and mapping • Optical stacking aided fault detection • Seismic facies classification • Geostatistical reservoir property mapping • State-of-art basic and advance IIWS interpretation of seismic 2D/3D processed data
  • 7. ABOUT ONGC INNOVATION CENTRE(OIC) ONGC INNOVATION CENTRE(OIC) is a unique joint initiative between ONGC and Schlumberger ,which was inaugurated on 5th February, 2011 by Ex.Director of Exploration Shri D.K.Pandey. It is a centre which contain latest Schlumberger‟s software and an environment for innovating latest ideas in the area of Exploration and Production. ABOUT SCHLUMBERGER Schlumberger was founded in 1926 by French brothers Conrad and Marcel Schlumberger as the Electric Prospecting Company. The company recorded the first-ever electrical resistivity well log in Merkwiller-Pechelbronn, France in 1927. Today Schlumberger supplies the petroleum industry with services such as seismic acquisition and processing, formation evaluation, well testing and directional drilling, well cementing and stimulation, artificial lift, well completions and consulting, and reservoir software and information management. The company is also involved in the groundwater extraction and carbon capture and storage industries. Schlumberger Limited is world's largest oilfield services company. Schlumberger employs over 108,000 people of more than 140 nationalities working in approximately 80 countries. Its corporate offices are in Houston, Paris, and The Hague.
  • 8. 2:-ABOUT PETREL(2010.2): SEISMIC TO SIMULATION SOFTWARE Petrel is a shared earth modelling tool bringing reservoir disciplines together through a common data model. Geophysical, geological, petrophysical and production data can be imported, manipulated, generated and visualized in one single platform. Optimize Exploration and Development Operations:- Petrel seismic to simulation software helps increase reservoir performance by improving asset team productivity. Geophysicists, geologists, and reservoir engineers can develop collaborative workflows and integrate operations to streamline processes. All the modules present in Petrel 2010.2 are listed below :- Benefits:
  • 9.  Unify workflows for E&P teams- Eliminate the gaps in traditional systems that require handoffs from one technical domain to the next using Petrel model-centric workflows in a shared earth model.  Manage risk and uncertainty- Easily test multiple scenarios, analyze risk and uncertainty, capture data relationships and parameters to perform rapid updates as new data arrives, and perform detailed simulation history matching.  Enable knowledge management and best practices- Reduce workflow learning curves by capturing best practices via the Workflow Editor, providing quick access to preferred workflows, and increasing ease of use through intuitive and repeatable workflows.  Open framework- Seamlessly integrate your intellectual property into the Petrel workflow through the open Ocean. This environment leverages .NET tools and offers stable, user- friendly interfaces for efficient development allowing focus on innovation rather than infrastructure. 3:-PETREL GEOPHYSICS:- Petrel is a unified seismic interpretation software, Fully integrate with the geological and engineering tools, Petrel geophysical software allows for rapid 2D and 3D seismic interpretation. Sample your seismic data directly into a 3D reservoir model to predict pay, and bias reservoir property distribution using a geostatistical approach. An extensive library of seismic attributes and volume rendering techniques can help us to identify hydrocarbon indicators and fracture patterns. Different modules present within the Geophysics process are listed below: Petrel Seismic Interpretation:- Petrel seismic interpretation software seamlessly combines the rigorous workflows of interpreting in 2D with the visual and performance benefits that only 3D volume interpretation can provide. You also gain the unique advantage of an interpretation environment unified with geology, reservoir modeling, and reservoir engineering domains, giving the ability to rapidly interpret seismic data and compare the results with other data in your project. Effortlessly moving from interpretation to structural model building to property modeling and back eliminates the gaps and inevitable knowledge and data loss of traditional systems that require handoffs from one technical domain to the next. Benefits
  • 10.  Combines visual and performance advantages of 3D seismic interpretation with the traditional 2D  views for accuracy and detailed event picking, enabling advanced 3D visualization at every desktop  Enables rapid, detailed event interpretation using accurate manual and powerful automated trackingwith a wide range of control and editing options  Offers the unique advantage of interpreting seismic in a truly unified environment  Provides scalable desktop interpretation from regional basin study workflows down to reservoir detail.
  • 11. 4:-SEISMIC DATA INTERPRETATION IN PETREL(2010.2) (i)DATA LOADING:- (A).DATA DESCRIPTION:- Introduction :- The first dataset is located in the Gulf of Mexico and contains salt induced tectonic and stratigraphic traps towards the salt. The data consists of seismic data, well data, checkshots, a velocity cube and stacking velocities. This dataset will be used for most of the exercises. Geology of Gulf of Mexico :- The Gulf of Mexico is a small oceanic basin surrounded by continental land masses. Due to their physical structure, the Gulf and the Caribbean Sea are sometimes combined and referred to as the 'American Mediterranean'. Uchupi (1975) divides the Gulf into two distinct geographical provinces (Terrigenous and Carbonate) while Antoine (1972) recognizes seven. The scheme proposed by Antoine is presented here, with additional information derived from other sources. Gulf of Mexico Basin :- This portion of the Gulf of Mexico contains the Sigsbee Deep and can be further divided into the continental rise, the Sigsbee Abyssal Plain, and the Mississippi Cone. Located between the Sigsbee escarpment and the Sigsbee Abyssal Plain, the continental rise is composed of sediments transported to the area from the north. The Sigsbee Abyssal Plain is a deep, flat portion of the Gulf bottom located northwest of Campeche Bank. In this relatively uniform area of the Gulf bottom, the Sigsbee Knolls and other small diapiric (salt) domes represent the only major topographical features. The Mississippi Cone is composed of soft sediment and extends southeast from the Mississippi Trough, eventually merging with other sediments of the central basin. The cone is bordered by the DeSoto Canyon to the east and the Mississippi Trough to the west. The second dataset from Angola contains a high sinuosity channel systems developed in drainage areas dominated by low gradient slopes. This dataset will be used for Geobody interpretation. Geology of Angola :- The geology of Angola is dominated by Proterozoic rocks. Neoproterozoic rocks of the Bembe Group occur in the western part of the country. The „Schisto-Calcaire,‟ which is part of the Neoproterozoic West Congolian Supergroup that outcrops mainly in the northwest of the country. The lower part of the „Schisto-Calcaire‟ is made up of stromatolitic and oolitic limestones and rests on tillites. In the coastal basin, a 4,000 m thick sequence of Cretaceous to Tertiary and Quaternary sediments rests unconformably on the Precambrian. Kalahari sands cover the eastern part of the country.
  • 12. (B).PROCESS OF DATA LOADING:-At first to load a data we have to create a project then give the coordinate system which is belongs to our field area. The coordinate system belongs to our project is ED30UTM-31. Now we make a folder and create a survey within it and importing the data within it.
  • 13. 1:-For seismic data we have to make a seismic folder and creating a seismic survey. The file type of seismic data is in SEGY format.
  • 14.
  • 15. The SEGY headers from first file area of the dialog gives you access to the ASCII header, binary header and trace header information to identify byte location parameters. The file can be scanned to check inline and cross line numbers as well as x and y coordinates. A specified number of traces can be scanned. For well data we have to create a well folder and import the well data. First of all we import the well header data which is in ascii format, then the sequence of data loading is given below:- Well data File type 1- Deviated/ well path data .dev (ascii) 2- Well log data .las, .lis, .dlis 3- Well tops data ascii
  • 16. 4- Well check shot/VSP data ascii (ii) Seismic Data visualization in Petrel:- After we have imported the seismic data, it is easy to display; either as a cube or through intersections. For volumes, the classic seismic sections are in line, cross line, time slices, and random line. In addition, Petrel allows specific seismic sections to be extracted along any given polygon or along any well path. Volumes can also be rendered using filters for extracting the preferred attribute values. This makes a very good visualization tool, as you can play with the transparency effects and search for objects of interest with known attribute values. There are two settings in particular in Petrel which allow you to optimize the way the data is displayed, depending on the task at hand. The data can be stored in a
  • 17. bricked format called ZGY for speedy 3D rendering (Realization), and it can be cropped so that only the area of interest is displayed (Cropping).  Realization :- Realization is the process of creating a physical copy of any seismic volume or 2D line. The original seismic can be an imported SEG-Y or ZGY file, a virtual file (for example, a cropped volume or a cropped 2D line) or even a previously realized volume or 2D line. The output from the realization process is seismic in ZGY format (3D) or a Petrel raw format (2D). Realization serves the following important purposes: Purpose 1:- Realization creates a physical representation Purpose 2:- Realization can change value resolution Purpose 3:- Realization changes the data format to ZGY bricked format  Cropping :- Cropping is the operation of cutting away parts of the volume, leaving a smaller volume that is usually faster and more convenient to work with. Cropping can also
  • 18. be thought of as defining a Region Of Interest (ROI). The cropping can be performed on both 3D and 2D seismic. Figure :- selecting the area to be cropped
  • 19. Figure:- cropped 3D volume Volume visualization:- The Style tab contains a sub-tab called Volume visualization, which contains the settings describing what is to be displayed, that is; Volume walls and Volume render. The color filter used for extracting the preferred amplitude values can be edited under the Colors tab. The Style tab contains a sub-tab called Volume visualization, which contains the settings describing what is to be displayed, that is; Volume walls and Volume render. The color filter used for extracting the preferred amplitude values can be edited under the Colors tab. Volume Visualization enable us extract 3D objects from seismic to improve reservoir understanding, detect anomalies, and define facies. Petrel software enables us to interactively blend multiple seismic volumes, isolate areas of interest, and then instantly extract what is visualized into a 3D object called a geobody. As the geobody is extracted, the interpreter can assign a geological template to the geobody, providing the body with instant geological meaning. Geobodies can be included directly in the 3D geological model, bridging the gap between geophysics and geology. There are two options are there in volume walls :- Volume Walls-Normal :- The visible sides of a seismic ZGY volume will be visualized and the volume will appear as a box. Volume Walls-Inside :-
  • 20. The view will be like looking inside a box, with the outer sides stripped off. This option displays only the distal walls and will not show any objects hidden inside the box. This can, for instance, be combined with volume rendering which is another option of volume visualization.
  • 21. Figure :- showing normal cube (volume walls)
  • 22. Figure :- showing inner faces of cube Volume Rendering:- The Volume rendering process is, unlike volume walls, a display of all data within a seismic volume at the same time. By rendering a seismic volume, making it partly opaque (high amplitudes) and partly transparent (crossover amplitudes), you can identify hidden structural or depositional features. Only seismic in ZGY format can be volume rendered, implying that input SEG-Y volumes must be realized to ZGY format before the data can be volume rendered. The seismic volume can also be cropped down prior to realization to focus on more local (smaller) features. Figure below is shows use of opacity in whole seismic volume.
  • 23. Figure :- application of volume rendering after changing the opacity Bump mapping :- This is another option in volume visualization which is highlight mainly the faults in section. Figure below shows the enable bump mapping.
  • 24. Figure :- seismic cube after applying the Bump mapping There are other options are also there as given below :- Arbitrary line :- It is used for single survey that is either for 2D or 3D survey Figure :- Arbitrary line Random line:- This is only applicable to 3D seismic survey. It is use mainly to generate a line rather than inline x line and time slice.
  • 25. Figure :- Random line Composite line :- It is possible to create composites within a seismic survey or between crossing seismic surveys in Petrel. This means that you can create composites on 3D lines within a 3D survey, lines that are spanning across multiple 3D surveys, lines spanning over 2D and 3D lines, but also along 2D lines only. Composite lines can also be created using any type of seismic (any attribute) that rests on the Petrel project.
  • 26. Figure :- Composite line section 2D visualization:- we can show here wiggles instead of the variable density or variable area, as shown below :- Figure :- seismic section in wiggle representation
  • 27. (iii)SYNTHETIC GENERATION AND SEISMIC WELL TIE PROCESSES:- One of the first step in interpreting a seismic dataset is to establish the relationship between seismic reflections and stratigraphy. For structural mapping, it may be sufficient to establish approximate relationship (e.g. „reflection X is near Caracas‟), although for more detail work on attributes, it is usually necessary to be more precise and establish exactly how the top of a reservoir is expressed on seismic section. The best source of stratigraphic information is well control. The Synthetic Seismogram: If wells have sonic and formation density logs, at least over the intervals of commercial interest, it is possible to construct a synthetic seismogram showing the expected seismic response for comparison with real seismic data. The first step in making the synthetic seismogram is the multiplication of sonic and density logs, which will give us acoustic impedance log. The next step is to convert the acoustic impedance log, calculated from log data recorded as function of depth, into a log as a function of (two way) travel time. This is easy if we know time-depth relationship for the well, which can be obtained by simply integrating the sonic log. Two problem may arise in this process. One of them is that errors (for example minor miscalibration of the sonic tool) tend to accumulate when the log is integrated over many thousands of feet. Another problem is that sonic log is hardly ever run in the shallowest part of the hole. For these reasons, it is usual to calibrate the T-Z curve by means of some direct observations of travel time from a surface source to downhole geophone (check shots), the integrated sonic is then adjust to match these control points. A reflectivity sequence is convolved with the wavelet thought to be present in the seismic data to generate the synthetic seismogram. Generation of Synthetic Seismogram in Petrel: Synthetic seismograms are the bridges between geological information (well data in Depth) and geophysical information (seismic in time). This essentially involves a two- step process. 1. Time converting the wells by means of check shot data and sonic logs, establishing time- depth relationships for the wells. 2. Generating synthetic seismograms from density logs, sonic logs and a seismic wavelet by calculating acoustic impedance and reflection coefficients, which are then convolved using a wavelet. Any changes to the time-depth relationship can be made and seismic horizons can be correlated with the stratigraphic boundaries identified in the well logs. Seismic Well-Tie:- The Seismic well tie process, found under Geophysics in the Processes pane, is one out of two approaches to generating synthetic seismograms in Petrel. First step here is to build a log set and add sonic (DT) and formation density (RHOB) log into it. The Seismic well tie process is a plug-in that comes with the Petrel installer. The process is accessed from the Geophysics folder on the Processes pane.
  • 28. Launching the process will open the Seismic well tie process dialog and give you access to the following workflows, each sorted under its own tab:  Sonic calibration  Wavelet extraction  Wavelet viewer  Wavelet builder Each tab contains a graphical explanation of the workflow and has a button to launch a new workflow window at the bottom. The same workflow windows can be accessed by using the available icons in the Function bar when the Seismic well tie process is active. Dependent on what window is open, a set of icons to handle the display is available. 1. Wavelet Builder: The wavelet builder workflow used in the Seismic well tie process, is a tool to build statistical wavelets. The wavelets can be constructed as Ricker, Ormsby or Tapered sync types. The amplitude polarity is specified here, either using the European or US standard, as well as the Phase polarity of the wavelet. The Wavelet builder window is accessed from the Seismic well tie process dialog, Wavelet builder tab by clicking on the „New Wavelet Builder‟ button. If the Sonic well tie window is not open, it can be directly accessed by using the Sonic calibration icon in the Function bar (given that the Seismic well tie process is active).
  • 29. Wavelet Builder:- In the Name field, you can enter the wavelet name. From the Type drop-down menu, you can select among the Ricker, Ormsby or Tapered sinc wavelet types. The wavelet-type-specific parameters will be activated based on the selection. The parameters are entered as numerical values and will not take effect until the Compute button is clicked. The Phase rotation can be set by using the slider or the increase/decrease value selector.
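The Ricker option mentioned above is a standard zero-phase wavelet. A minimal sketch of how such a wavelet, with an optional constant phase rotation, could be generated outside Petrel is given below; the function name and parameter choices are illustrative only and this is not the Petrel wavelet builder itself.

import numpy as np
from scipy.signal import hilbert

def ricker(f_dom_hz, dt_s=0.002, length_s=0.128, phase_deg=0.0):
    """Zero-phase Ricker wavelet with an optional constant phase rotation."""
    t = np.arange(-length_s / 2, length_s / 2 + dt_s, dt_s)
    a = (np.pi * f_dom_hz * t) ** 2
    w = (1.0 - 2.0 * a) * np.exp(-a)
    if phase_deg:
        # Constant phase rotation using the quadrature (Hilbert) trace.
        phi = np.deg2rad(phase_deg)
        w = np.cos(phi) * w - np.sin(phi) * np.imag(hilbert(w))
    return t, w

# Example: a 25 Hz Ricker wavelet rotated by 90 degrees.
t, w = ricker(25.0, phase_deg=90.0)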
  • 30. 2. Wavelet Viewer: The Wavelet viewer workflow used in the Seismic well tie process is a window designed to view any premade, loaded or already generated wavelets with the corresponding power and phase spectra. The Wavelet viewer window is accessed from the Seismic well tie process dialog, Wavelet viewer tab, by clicking on the 'New Wavelet Viewer' button. If the Seismic well tie window is not open, it can be accessed directly by using the Wavelet viewer icon in the Function bar (given that the Seismic well tie process is active). With this facility we can view together the different wavelets built in the wavelet builder, by choosing them from the Petrel Explorer pane. 3. Sonic Calibration:- The sonic calibration workflow used in the Seismic well tie process includes the ability to edit a knee curve based on time-depth information (typically check-shot data), interactively do sonic calibration and view the resulting calibrated sonic log while editing. It is also possible to redefine the datum (datuming) in the process and specify the output after calibration. You can access the Sonic calibration window from the Seismic well tie process dialog by clicking on the 'Sonic calibration' button on the Sonic calibration tab. If the Seismic well tie window is not open, it can be accessed directly by using the Sonic calibration icon in the Function bar (given that the Seismic well tie process is active).
  • 31. Figure :- Sonic calibration workflow When selecting an appropriate log set to use in the sonic calibration workflow, the window is populated with a sonic track and the check-shot/knee log track, as well as any selected auxiliary log tracks. The Sonic calibration variables and settings dialog will open up. Figure :- T-D relationship from the sonic log before knee picking
  • 32. Knee Picking:- Knee picking is the actual workflow where the sonic log is calibrated to the check shots for the well. This is obtained by fitting the drift curve (the knee log) to the check-shot points and/or well tops. The drift curve can be changed interactively to optimize the time-depth relationship or the calibrated sonic log. Figure :- T-D relationship from the sonic log after knee picking Sonic Calibration and Display Tab:- The log curve and data point appearance is handled from the Display tab. The density and residual drift can also be shown. Sonic calibration will display up to 6 logs that are part of the LogSet in the Input tab. Other logs that are part of the LogSet can be dropped into the "Other" fields.
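The drift correction behind knee picking can be illustrated with a minimal sketch. The assumed inputs are an integrated-sonic time-depth curve and a set of check shots; this is generic Python with a simple piecewise-linear drift curve, not the interactive Petrel algorithm.

import numpy as np

def calibrate_sonic(depth_m, twt_sonic_s, cs_depth_m, cs_twt_s):
    """Calibrate an integrated-sonic time-depth curve to check-shot times.

    depth_m, twt_sonic_s : TD curve obtained by integrating the sonic log.
    cs_depth_m, cs_twt_s : check-shot depths and observed two-way times.
    """
    # Drift = observed check-shot time minus integrated-sonic time at the same depth.
    drift_at_cs = cs_twt_s - np.interp(cs_depth_m, depth_m, twt_sonic_s)
    # Piecewise-linear drift curve (the 'knee' log) between check-shot depths.
    drift = np.interp(depth_m, cs_depth_m, drift_at_cs)
    # The calibrated time-depth relationship honours the check shots exactly.
    return twt_sonic_s + drift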
  • 33. Sonic Calibration Output Tab: Objects made in the Sonic calibration workflow are not accessible from the Input pane until the results are output from the Seismic well tie process. The LogSet, Knee LogSet and Depth/time relations can be obtained from the Output tab and will be stored on the Input pane under the Well logs folder. In addition, the Sonic log, Residual drift curve and Time reference log can be saved from here. A typical output from this workflow is the final time-depth relationship, which can be used for re-establishing the time log for the well used. 4. Wavelet Extraction:- The wavelet extraction workflow used in the Seismic well tie process is a tool for performing deterministic wavelet extraction by selecting the seismic volume and input logs of interest. The position of the extraction location can be changed interactively based on predictability to optimize the wavelet to use. Changing the extraction location automatically updates the extracted wavelet with its corresponding power and phase spectra, as well as the resulting synthetic trace. The Wavelet extraction window is accessed from the Seismic well tie process dialog, Wavelet extraction tab, by clicking on the 'New wavelet extraction window' button.
  • 34. If the Seismic well tie window is not open, it can be accessed directly by using the Wavelet extraction icon on the Function bar (given that the Seismic well tie process is active). Wavelet extraction workflow:- Open a Wavelet extraction window to perform the workflow given below. We can either use the icon from the Function bar or access it from the Seismic well tie process dialog. The window will be empty until a log set and a seismic volume are selected. The inputs are selected by ticking the check box or dragging them from the Input pane. Input 1 is the seismic volume and input 2 is the log set. By selecting the 'Position' option we can check the inline and crossline traces from which the wavelet has been extracted. Now we can extract the wavelet; we can also change the scan length and check the output. Figure :- Wavelet extraction and generation of synthetic seismogram Time Shift and Manual Adjustment:- The final synthetic trace made in the Seismic well tie process can be bulk shifted, or stretched and squeezed, to align it to the real seismic data. The alignment points can be set on any strong event that needs to be adjusted, and any updates are applied as soon as they are made. The adjusted synthetic trace can be output back to the Input pane of Petrel with a different name than the original unadjusted trace, for comparison and use. Time shift and manual adjustment workflow: The Wavelet extraction window is used for manual adjustment. The manually adjusted synthetic trace is updated interactively and is saved back to the Input pane for further use. Open the Wavelet extraction window to perform this workflow. You can either use the icon from the Function bar, access it from the
  • 35. Seismic well tie process dialog, or from the Windows pane if Petrel has not been closed since the wavelet extraction workflow was done. Variable time shift alignment:- Controlling the variable time shift alignment is done from the Wavelet extraction variables and settings dialog, Time shift tab. (iv) MIS-TIE ANALYSIS FOR 2D AND BULK SHIFT FOR 3D: Sometimes, due to errors (e.g. an error in selecting the datum), there is a mismatch of signatures at the intersection of two 2D lines. In order to interpret correctly, we carry out mis-tie analysis and match the signatures of those 2D lines at the intersection. In the case of a 3D volume, Petrel provides the facility of shifting the 3D volume up or down. These processes are described separately below: 1. Mis-Tie Analysis for 2D:- Mis-tie analysis allows us to extract and apply mis-tie value corrections for 2D seismic surveys. All lines contained inside a Seismic survey folder or a sub-folder will be used when a Mis-tie set is created. A simple but effective algorithm, based upon weighting value assignments using a variance criterion, has been used. It assumes mis-tie values to be random variables. The algorithm satisfies the following requirements:-  The mis-ties should be reduced to a minimum after the error adjustment.  The method should be applicable to any survey configuration.  There should be a way to assign a weighting factor to good lines (reference lines) in comparison to lines showing data of poor quality.  The method should be easily manageable within the current workflow.  The algorithm should be fast and require minimal iteration. Mis-Tie Manager:- The Mis-tie manager is an interactive tool for managing the mis-ties in Petrel. You can calculate or specify corrections, select reference lines, deselect lines not to be used for mis-tie analysis, and also toggle the display of 2D lines in the active window. To generate a Mis-tie set, right-click on a Seismic survey folder (in this case called "2D lines for MIS-TIE") or sub-folder and choose Insert mis-tie set from the menu. The new mis-tie set is located in a new folder called "Mis-ties", under the survey folder that it was created from.
  • 36. When a mis-tie set is created, Petrel will extract all crossing points for the 2D lines in the survey folder and build a color-coded spreadsheet showing all 2D lines and their corresponding intersecting lines. The table looks very similar to the Survey manager, but it has additional functionality for working with mis-tie values. It is called the Mis-tie manager. The Mis-tie manager is opened by right-clicking on the Mis-tie set object on the tree and selecting Mis-tie manager. There are three different ways through which we can compute the mis-tie between 2D lines:  Gain - Enable mis-tie corrections for gain (amplitude balancing).  Vertical - Enable vertical mis-tie corrections. By default this is enabled.  Phase - Enable mis-tie phase corrections. We can compute mis-tie values from the seismic data and calculate mis-tie corrections from the Mis-tie manager.
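The idea of a vertical and gain mis-tie at a line intersection can be sketched as below. This is a simplified illustration (cross-correlation for the time shift, RMS ratio for the gain), not the weighted variance-based adjustment that Petrel actually applies across the whole survey.

import numpy as np

def intersection_mistie(trace_a, trace_b, dt_ms=4.0, max_lag=25):
    """Estimate vertical (time) and gain mis-ties between the two traces
    recorded at the crossing point of two 2D lines."""
    # Vertical mis-tie: lag of the cross-correlation maximum.
    # (np.roll wraps around, which is acceptable for a short sketch.)
    lags = np.arange(-max_lag, max_lag + 1)
    xcorr = [np.dot(trace_a, np.roll(trace_b, k)) for k in lags]
    time_shift_ms = lags[int(np.argmax(xcorr))] * dt_ms
    # Gain mis-tie: ratio of RMS amplitudes of the two traces.
    gain = np.sqrt(np.mean(trace_a ** 2) / np.mean(trace_b ** 2))
    return time_shift_ms, gain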
  • 37. Figure :- 2D lines intersection before the application of mis-tie analysis Figure :- Mis-Tie Analysis After Application of Gain (Amplitude Balancing)
  • 38. Figure :- Mis-Tie analysis after phase balancing Similarly, we can apply the gain and phase methods of mis-tie analysis simultaneously and analyse the results. 2. Bulk Shift for 3D: In Petrel we have the facility to shift the whole volume of seismic data up or down the time scale by some amount, which can be useful when matching a seismic volume with another seismic volume adjacent to it. This process can be performed by the following method: a) Go to the settings of the seismic volume to which we want to apply the bulk shift. b) After choosing 'Geometry', select vertical geometry to perform the task. c) Give the desired shift value. To bring the seismic back to its original position, apply a shift equal to the negative of the value introduced earlier.
  • 39. Figure :- Seismic volume before introducing bulk shift Figure :- Seismic volume after the introduction of a bulk shift of 50.0 msec (v) FAULT AND HORIZON INTERPRETATION
  • 40. All interpretations in a Petrel project are stored in one or more interpretation folders. Any given interpretation folder can contain both horizon and fault interpretations. Furthermore, interpretation objects can be moved (drag and drop) from one folder to the other. Fault Interpretation Before marking the horizons we have to mark faults to show the discontinuities in the horizons. Additionally, while creating an autopicked horizon there is a provision to stop the horizons at faults. We can interpret faults on inlines, crosslines and time slices as well as on random lines. We can interpret several types of faults in 2D as well as 3D data in Petrel (2010.2), which gives a complete picture of the orientation of a fault in the whole volume. In Petrel there is the advantage that we can interpret faults in a 3D as well as a 2D window, which makes our interpretation more accurate. The workflow of fault interpretation is as given below :- 1:- Insert a fault within the interpretation folder 2:- Select the fault that was created and select the seismic interpretation option in the Geophysics module in the Processes pane 3:- Select the interpret fault option in the Function bar 4:- Start picking the faults in the section
  • 41. Figure:- fault interpretation in three different windows
  • 42. Figure :- fault interpretation in 3D and 2D windows (B) Horizon interpretation Horizon marking is of significant importance for delineating the geology and prospectivity of an area. Horizons can be interpreted on any type of intersection: inline, crossline, random line, composed line, curved vertical intersections, etc., and in a 3D or a traditional 2D interpretation window. Several different methods are available in Petrel for interpreting seismic horizons:  Manual interpretation (drawing):- In manual interpretation we pick horizons manually with the help of well tops or the synthetic seismogram imported or generated in the earlier steps. Manual interpretation is a rather accurate way of interpreting the horizons, by closely viewing a set of inlines/crosslines at a certain interval, but it can be cumbersome since it is very time consuming. The most common way of interpreting seismic data is to use autotracking. The user selects one or more seed points as a basis for the interpretation and Petrel searches outwards from these seed points, tracking the seed point features to select adjacent points which match the tracking criteria. Seed points can be prepared in advance or picked interactively from a seismic intersection. There are 5 types of autotracking available in Petrel:  Guided autotracking - We select two points and the tracking will find the best route from one to the other. This gives a high degree of control over how the interpretation develops.  Seeded 2D autotracking:- This is a simpler way of tracking a horizon along a section. Just by clicking on a point (seed), the entire horizon can be autotracked by criteria such as amplitude matching, i.e. points will be tracked in the direction of the selected line intersection.
  • 43. Figure :- Seeded 2D horizon autotracking  Seeded 3D autotracking :- Points will be tracked outwards from the seed points in all directions. When the reflectors are of good quality this method can be a very efficient way of interpreting through the seismic cube. If we hold down SHIFT before starting, the seeded autotracking will display the tracking on-screen as it progresses. The figure is shown below :-
  • 44.  Paintbrush autotracking - Points will be tracked outwards from the seed points, limited by the cursor box. Dragging the cursor while clicking the left mouse button will 'paint' the interpretation following the cursor path. Works in both 2D and 3D windows. Figure :- paintbrush autotracking
  • 45.  Active box autotracking - Points will be tracked outwards from the seed points, limited by the extent of the box dragged over the area. Works only in a 2D window. Figure :- active box autotracking (vi) MAP GENERATION AND PLOTTING: Having identified some horizons that are significant for understanding the geology and prospectivity of an area, the next task is to map them across the survey. In Petrel, surface maps of the horizons can be generated from the process utilities. The Utilities process folder stores all the general process tools that can be used in Petrel independently of a main process workflow (like seismic interpretation or 3D grid modeling). At any stage in your main process workflow, these tools can be used to create temporary data like trend surfaces, polygon boundaries, etc. Make/edit surfaces: This process generates grid surfaces based on point data, line data, polygons, surfaces, bitmaps and well tops and allows them to be edited interactively. A surface created with the Make surface process is a regular 2D grid. It is important to remember that there is a distinct difference between a Surface and a Horizon in Petrel. Data for a surface is held within a flat, regular 2D grid, whereas data for a horizon is held in Petrel's 3D grid. Make/edit surface is therefore ideal for rapidly gridding all types of data, whereas Make horizons is a more complex operation performed after pillar gridding to generate geological layers in the 3D model. Examples of the Use of Surfaces in Petrel: 1. Input to the Make Horizon process
  • 46. 2. Input to the Make zones process (isochores) 3. Input to the Make Contacts process (as a contact) 4. To assign values directly in property modeling (facies modeling or petrophysical modeling) 5. As a 2D trend in data analysis and property modeling 6. As a probability map for facies objects 7. As a surface defining variation in the size and shape of facies objects Once a surface has been constructed using the Make surface operation, it can be updated, taking account of any changes in input data, boundaries, trends, etc., by selecting regenerate from the right mouse button options. Process of making surfaces: The process dialog for the Make/edit surface process contains an Execute section and six sub-menus. Input and output settings for the 2D surfaces are entered in the Execute section. Additional settings for the process such as geometry, method settings, pre/post-processing and well adjustment are defined in the sub-menus. The process is given below: 1. As the main input we have to select the horizon from which we want to make the map; we can also assign a name to the surface map being generated. 2. After this we have to give the values of Xmin and Xmax; if we want to take the corresponding limits of the seismic horizon, choose the option 'Get limits from selected'. 3. Now go to the algorithm and choose a method for generating the map. The most common method is 'convergent interpolation'.
  • 47. 4. Once the map has been generated, remove the inline and crossline and view the map only. If it is displayed in a single color, adjust the color scale according to height by clicking 'Adjust color table on selected' from the toolbar.
  • 48. 5. We can smooth the map by going to the settings of the surface. 6. By opening a 'New Map Window', we can see the created surface in a better way, along with the locations of the wells that lie in that area on the surface. Figure :- Surface map showing the well locations in a map window
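The gridding idea behind Make/edit surface can be illustrated roughly as below. Petrel's convergent interpolation algorithm is proprietary, so this sketch simply grids scattered horizon picks with a standard linear interpolator; grid dimensions and names are illustrative.

import numpy as np
from scipy.interpolate import griddata

def grid_surface(x, y, twt, nx=200, ny=200):
    """Grid scattered horizon picks (x, y, two-way time) onto a regular 2D surface."""
    xi = np.linspace(x.min(), x.max(), nx)
    yi = np.linspace(y.min(), y.max(), ny)
    gx, gy = np.meshgrid(xi, yi)
    # Linear interpolation inside the data hull; convergent interpolation in
    # Petrel is more elaborate and also extrapolates beyond the data.
    surface = griddata((x, y), twt, (gx, gy), method="linear")
    return gx, gy, surface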
  • 49. (vii) SEISMIC ATTRIBUTE ANALYSIS: A. SEISMIC ATTRIBUTES AND ITS IMPORTANCE: A seismic attribute is any quantity derived from seismic data using measured time, amplitude, frequency, attenuation or any combination of these. It is intended to output a subset of the data that quantifies rock and fluid properties and/or allows the recognition of geological patterns and features. Almost all seismic attributes are post-stack, but there are a few pre-stack ones. They can be measured along a single seismic trace or across several seismic traces. The advantages of applying seismic attributes to the data set are given below: 1. Signatures of subsurface features which were hidden in the seismic data may become prominent after the application of certain attributes; seismic attributes therefore enhance different features within the seismic data. 2. Detection of a discontinuity, such as a fault, is easier after the application of certain attributes (e.g. variance, ant tracking etc.). Application of these attributes makes the discontinuity more prominent and hence easier to pick. 3. Genetic inversion is a special attribute in Petrel with the help of which we can create a whole volume of a particular property, which helps in interpretation. B. SEISMIC ATTRIBUTE GENERATION IN PETREL:- Attribute generation in Petrel is split into two separate processes, the Volume attributes and Surface attributes processes. They are similar in the sense that they both contain a library of different seismic attribute classes for display and use with the seismic interpretation workflow in Petrel. Seismic attributes help to enhance information that might be subtle in conventional seismic, leading to a better understanding and interpretation of the data. a) Volume Attributes: Volume attributes are generated based on various properties of the analytic signal; the process makes virtual or realized (physical) volumes of the input seismic. The Volume attributes process is opened by double-clicking on it in the Processes pane. The dialog needs to be populated with input data, attribute selection and parameters from scratch. If the process is initiated by right-clicking on a seismic volume and selecting Volume attributes from the appearing menu, this volume is automatically instantiated as the Input under the Input/Output tab. Some volume attributes available in Petrel: Some of the important volume attributes available in Petrel are described briefly below:  ENVELOPE: The total instantaneous energy of the analytic signal (the complex trace), independent of phase. Also known as 'Instantaneous Amplitude', 'Magnitude' or 'Reflection strength'. The envelope is computed as E(t) = sqrt(f(t)^2 + g(t)^2), where f and g are the "real" and "imaginary" components of the seismic trace. So, if f is the real part, which is just the original seismic trace samples, g will be the samples from the Hilbert
  • 50. transform (also called the quadrature amplitude) of the seismic trace. A window length parameter is available (default: 33). The envelope attribute is important for detecting bright spots caused by gas accumulations, and for detecting major lithological changes caused by strong energy reflections, as well as sequence boundaries. The attribute clearly shows subtle lithological changes that may not be apparent in the seismic data. Envelope can be used to help recognize phase differences between seismic versions; the peak energy should align independently of the phase of the data. Figure :- volume after the application of the 'Envelope' attribute  SWEETNESS: Sweetness is the implementation of two combined attributes (Envelope and Instantaneous Frequency) and is used for the identification of features where the overall energy signature changes in the seismic data. Sweetness is defined by the formula: Sweetness = Envelope / sqrt(Instantaneous Frequency)
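Both quantities follow directly from the analytic signal of a trace; a minimal single-trace sketch (generic Python, not the Petrel implementation, with an assumed 4 ms sample interval) is:

import numpy as np
from scipy.signal import hilbert

def sweetness(trace, dt_s=0.004, eps=1e-6):
    """Envelope, instantaneous frequency and sweetness of a single trace."""
    analytic = hilbert(trace)            # f + i*g (g is the quadrature trace)
    envelope = np.abs(analytic)          # sqrt(f**2 + g**2)
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.abs(np.gradient(phase, dt_s)) / (2.0 * np.pi)  # Hz
    return envelope / np.sqrt(inst_freq + eps)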
  • 51. Figure :-Seismic volume after the application of „Sweetness‟ attribute  VARIANCE (Edge Detection): The estimation of local variance in the signal. You can apply optional vertical smoothing for noise reduction. It is useful for edge detection.
  • 52. Figure:- Seismic volume before applying „variance‟ attribute Figure :-Seismic Section after the application of „variance‟ attribute Figure :-Fault tracking using „Variance‟ Attribute
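The edge-detection idea behind the variance attribute shown above can be illustrated roughly as follows: the local variance is computed in a small moving window (inline x crossline x time), and high values mark lateral discontinuities such as faults. The window size used here is purely illustrative.

import numpy as np
from scipy.ndimage import uniform_filter

def variance_attribute(cube, window=(3, 3, 9)):
    """Local variance of a seismic cube over a moving window.
    High values highlight discontinuities such as faults."""
    mean = uniform_filter(cube, size=window)
    mean_sq = uniform_filter(cube ** 2, size=window)
    return np.maximum(mean_sq - mean ** 2, 0.0)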
  • 53.  SECOND DERIVATIVE: The second time derivative of the input seismic volume. The combination of the original amplitude, first derivative and second derivative allows you to express the seismic interpretation in relation to maxima, minima, steepest descents and descent polarity. No parameters need to be specified for this attribute. Figure :- Second derivative of inline 433 The second derivative can be used to help guide a pick by providing continuity in areas where reflections are poorly resolved on the raw amplitude. Lateral amplitude variations are visibly diminished, which will make auto-tracking regional events more difficult. b) Surface Attributes: Surface attributes extract seismic properties out of volumes based on analysis of the signal shape. They are surface based; that is, extraction is performed at a given level or between two levels and output as a surface. A surface attribute, also referred to as a seismic attribute map, is an extraction of data from a seismic volume across a surface, within an interval or where an interpretation intersects the volume. To create a surface attribute, double-click on the Surface attributes process under Geophysics. The surface attribute will be created as an attribute associated with a horizon interpretation or a surface. Any number of attributes can be assigned to the horizon interpretation/surface as additional attributes. Surface Attribute based on a single horizon: 1. Prepare a surface at which the seismic is to be sampled. 2. Double-click on the Surface attributes process to open the dialog. 3. Check Add to new surface. Type in an appropriate name. 4. Drop the seismic cube into the drop box. 5. Choose the attribute to use (e.g. Maximum amplitude). 6. Use Single horizon as Window specification. 7. Drop in the surface as First horizon. Use 0 (zero) for the search window and horizon offset.
  • 54. 8. Set the appropriate parameters for To event (e.g. Largest trough, below, and a 16 ms search window). 9. Press OK. Figure :- 'Sweetness' applied as a surface attribute C. ANT TRACKING AND GENETIC INVERSION WORKFLOW: a) ANT-TRACKING: Ant tracking is used to extract faults from a pre-processed seismic volume. The pre-processing could be variance or chaos combined with structural smoothing. Currently, only realized volumes can be calculated. The Ant tracking algorithm automatically extracts fault surfaces from fault attributes. The algorithm uses the principles of ant colony systems to extract surfaces appearing as trends in very noisy data. Intelligent software agents ("ants") will try to extract features in the attribute corresponding to the expectations about the behavior of the faults. True fault information in the attribute should fulfill these expectations and be extracted by many ants, whereas noise and remains of reflectors should be extracted by no ants or by only single ants (in which case they will be deleted). The approach is fully 3D and can take advantage of surface information in the surrounding voxels. This makes it possible to derive detailed information from the attribute. By writing the extracted surfaces back to a volume, we get what is referred to as an enhanced attribute, or ant track cube. This cube contains only what is likely to be true fault information.
  • 55. Ant Tracking workflow: The process can be divided into four main activities: (1) seismic conditioning, (2) edge detection, (3) edge enhancement, and (4) interactive interpretation (surface extraction). A collection of surface segments, fault patches, can be extracted after the generation of the ant-track attribute. This is a volume of fault surface "pieces" having a high confidence of connectedness, which can be interactively merged into complete fault surfaces using the Automatic Fault Extraction process. The above process is given stepwise below:- 1. First we apply the 'Structural smoothing' attribute to the realized seismic cube. 2. Apply the 'Variance' attribute to this structurally smoothed volume. 3. Now apply the 'Ant tracking' attribute to this volume. Ant-Tracking Attribute Parameters: Parameters for the Ant tracking are shown in the figure below:
  • 56. Initial ant boundary (number of voxels): The Initial ant boundary parameter defines the initial distribution of agents, being a territorial radius around each ant, defined in number of voxels; see figure below. This parameter has a huge impact on the execution time of the algorithm, as it defines the total number of agents that are propagated through the volume. Initial Ant Boundary Ant-Track Deviation: Ant track deviation controls the maximum allowed deviation from a local maximum while tracking. The agents assume a planar shape, and can only deviate 15 degrees from the initial orientation. The method allows the agent to accept a local maximum one voxel on either side of the
  • 57. predicted position as legal. If the maximum is outside this ant track step range, the track deviation parameter comes into play. Ant step size (number of voxels): This parameter defines the number of voxels an ant agent advances for each increment within its search step. Increasing this value will allow an ant agent to search further, but it will lower the resolution of the result. Parameter settings:  Minimum: 2  Maximum: 10  Default: 3 Illegal steps allowed: Illegal steps allowed defines how far (measured in steps) an agent's track can continue without finding an acceptable edge value. (An acceptable edge value means that a local maximum is found.) See figure below.
  • 58. Legal steps required: Legal steps required controls how "connected" a detected edge must be, to help distinguish an edge from un-oriented noise. This parameter is used in combination with the Illegal steps allowed parameter. It is expressed in terms of the number of steps that must contain a valid edge value for the agent to continue. See figure below. Stop criteria: The Ant track Stop criteria parameter can terminate an agent when the number of illegal steps contained in the total path exceeds the defined percentage.
  • 59. Figure :- Application of Ant-Tracking on the seismic volume b) Genetic Inversion: A new approach to deriving an acoustic impedance inversion volume is proposed in Petrel. Multi-layer neural networks and a genetic algorithm are combined in order to provide a robust and straightforward seismic inversion. In the case of Genetic Inversion, the required inputs are limited to the seismic amplitude and the acoustic impedance well logs used as training data. No single unique wavelet and no initial property model are needed as inputs prior to running the inversion. A genetic algorithm back-propagates the error in order to update the weights of the neural networks. The advantage of this new method of generating a property estimation is that the genetic algorithm constrains the convergence of the inversion in such a way that the chance of achieving a global minimum error is much greater than in previous neural-network-based inversions; thus, success is nearly always achieved. In addition, another advantage of this process is that it is not restricted to conventional acoustic/elastic impedance inversion, but can be extended to any kind of petrophysical attribute/parameter which is linked in a meaningful and straightforward way to the seismic amplitude or derived attribute data. To be more explicit, all the parameters contained in the wave equation are possible candidates (e.g. velocity, density, porosity, bulk modulus...). Genetic Inversion Parameters: The Genetic Inversion module is located within the Volume Attribute library under the "Stratigraphic methods" class. As for the other volume attributes, the Input/output tab defines what input volume is used and how the result is stored. The Parameters tab defines how the neural network and genetic algorithm learn and handle the result for each iteration. The latter tab is divided into three sub-titles (see figure given below): 1. Learning inputs 2. Settings
  • 60. 3. Advanced options 1. Learning inputs: All the inputs for the Genetic inversion are located in the Input pane (cf. Ctrl+T) of Petrel.  Seismic cube: Drop in the 3D volume to be used for the learning step, as well as for the inversion itself. All types of 3D cube can be chosen as input (e.g. cropped volume, seismic attributes, SEG-Y or ZGY format...). For performance reasons, it is recommended to use bricked volumes (ZGY format).  Well folder: Select the global well folder or any sub-folder containing the wells which will be used for the learning process.  Global well log: Select one of the logs listed within the "Global well log" folder. It must be continuous, and have some explicit (linear or not) relationship with the seismic cube.  QC well folder: Select the global well folder or any sub-folder containing the wells which will be used as the "blind" wells. The relationship determined by the neural network during the learning step will be computed at those wells so you can cross-validate the computed property against the observed one. 2. Settings:  Vertical range: vertical extension of the seismic sub-volume (see figure 5). Set to 50 by default (depends on the resolution of the seismic).  Inline half-range: horizontal half-extension of the seismic sub-volume with respect to the inline direction. Set to 1 (cf. number of inline intervals) by default (depends on the lateral continuity of the structures with respect to the inline direction).  Crossline half-range: horizontal half-extension of the seismic sub-volume with respect to the crossline direction. Set to 1 (cf. number of crossline intervals) by default (depends on the lateral continuity of the structures with respect to the crossline direction).
  • 61.  Resample parameter: defines the sample increment within the seismic around the well sample, used to create the input vector containing the seismic amplitudes on which the learning process is computed. Set to 3 by default (depends on the sampling rate and the resolution of the seismic). In short, the higher the Resample parameter, the higher the concentration of samples per volume unit (this parameter is driven by the frequency content of the seismic). Figure 2 sums up this option schematically.  Top surface/marker: select a "regular surface" for the upper limit where the learning process is computed. We can also use well-top markers.  Bottom surface/marker: select a "regular surface" for the lower limit where the learning process is computed. Figure :- A porosity log in a 3D volume after genetic inversion (viii) Velocity modeling It is possible to build the structural model directly in time, based on the seismic data. We can then use our original seismic interpretation prior to any depth conversion and create a full 3D corner point grid in time. This will reduce the uncertainty of the geophysical work. After creating the initial velocity model, the depth conversion process converts the corner point grid on a node-by-node basis. The model is converted, including all the grid pillars and faults. This process facilitates the possibility to analyze the uncertainty in the velocities by using different velocity setups. By reversing the process, a time grid can be built from a depth model. Thus velocity modeling is mainly used for time-to-depth conversion of seismic data. Domain conversion allows us to take data from one domain, typically seismic data in time, and convert it to another, typically
  • 62. depth, to correlate it with well data and perform volume calculations. This action can be performed at any time in the workflow (before or during model building) and exactly when it is done will depend on the particular issues in the project. The uncertainties connected to interpolating velocities far from well control make domain conversion a critical step in the modeling process that should be investigated thoroughly. The workflow of converting data between domains within Petrel is split into two processes:  Make velocity model: Defines how the velocity varies in space.  Depth conversion: Uses the velocity model to move data between domains. Domain conversion can be used to move data from time to depth or the reverse, but also to move data between two versions of the same domain. Examples would be to stretch and squeeze overlapping seismic cubes so that they match (4D seismic) or to match a depth-migrated attribute cube to an updated model with more well control. All the domain conversion in Petrel follows a layer-cake model, that is, data is shifted in the vertical direction only. Within this process, we define the zones in space where the velocity can be described in a common manner, and then describe the velocity model to use in each zone. For example, a set of surfaces with constant velocities between each pair. We can also make corrections to match well data or surfaces at this stage and incorporate them into the velocity model. The velocity model requires 4 sets of input:  A zone description. For example, a set of surfaces in two-way time.  A definition of the velocity model for each zone. For example, V=Vint.  Input parameters for the velocity model. For example, a surface of Vint.  Correction data, if required. For example, well tops for the specified zones. Defining zones:- Velocity zones can be defined using:  Constant values: Quick to define for a first test.  Surfaces: Easy to build, even in the early stages of a project, and cover most requirements. Useful for defining layers above the 3D grid such as the seabed.  Horizons (3D grid): Requires a 3D grid, but can describe complex geometries, such as reverse faulting and Y-truncations. A mixture of these can be used in the same model. Available velocity models:- Each zone in the velocity model must have a definition of the velocities within that zone. Velocity models available in Petrel include:  V=Vint: At each XY location the velocity is constant through the zone.  V=Vo+kZ: At each XY location, the velocity changes in the vertical direction at a rate k. Vo represents the velocity at datum, and Z the distance (in length units, not time) of the point from datum. NB: Vo is the velocity at Z=0, not at the top of the zone, and will therefore be much lower than the velocities seen in the layer, possibly even negative in extreme cases. As time and depth decrease downwards, a negative value of k results in velocities which increase with depth. Typical values for k are between 0 and -0.2. This velocity model is also referred to as Linvel.  V=Vo+k(Z-Zo): As above; however, here the values are measured relative to the top of the zone. For example, Vo represents the velocity at the top of the zone and (Z-Zo) represents
  • 63. the distance between the point and the top of the zone. Again, a negative value of k will result in velocities which increase downwards. Typical values for k are between 0 and -0.2. This velocity model is also referred to as Adlinvel.  V=Vo+kT: This is the same as V=Vo+k*Z except that it is for conversion to the time domain.  Avg. cubes: These should have an attribute representing the average velocity between the point in the cube and the datum. Cubes of instantaneous velocity, or time/depth, can be converted to average velocity in the attribute generation process.  Avg. property: If the 3D grid is used to define the zone, and includes a property representing average velocity (down to the center of the grid cell), then this can be used to depth convert the interval. Such grids can be created by sampling data into the grid or using Data analysis and Petrophysical modeling to extrapolate from well data. This method is useful for complex structures, such as reverse faults.  Stack cube: This uses the stacking velocities of the data.  Same as above: This option can be used for thin zones where extrapolating data from within the zone alone might cause problems. It is particularly useful when interpolating V0 or k from the wells' time-depth relationships. Defining velocity input:- The methods available for defining the velocity input depend upon the velocity model that has been used. When using V=Vint, V=Vo+kZ or V=Vo+k(Z-Zo), Vint or Vo and k can be entered using a variety of methods:  Constant: a constant value.  Surface: a surface defining the value at each XY location. The surface must cover the whole area of the velocity zone.  Well TDR constant: The value will be estimated using the time-depth relationship (TDR) through the zone for each well and a single constant value used. Petrel employs a minimum depth error method to estimate this value. See Velocity modeling algorithms for a detailed description of the algorithm. The well TDR is defined on the Time tab of the well settings dialog, see Time tab (Well).  Well TDR surface: The value will be estimated using the time-depth relationship (TDR) through the zone for each well and interpolated to give a surface describing the variation of the value across the model. Controls for the creation of this surface are found on the Advanced tab. Petrel employs a minimum depth error method to estimate this value. See Velocity modeling algorithms for a detailed description of the algorithm. Here we are going to discuss the stacking velocity method for computing the velocity model:- Stacking velocities can be loaded into Petrel as "Petrel points with attributes (ASCII) (*.*)", provided the files contain X, Y coordinate information. Additionally, a few common formats have been added to the loader to support ASCII stacking-velocity files where there is only information related to 2D and 3D surveys and the X, Y coordinates are not defined. The corresponding 2D or 3D surveys must be defined in Petrel before attempting to load these types of files. Petrel will try to match the name of the survey and/or the names of the 2D lines. The formats supported are "ESSOV2", "STATOILH2" and "DISKOS". Stacking velocities must first be converted using the Dix formula. Initially, import the velocities using the appropriate format, then create average velocities using the Dix conversion found under Points operations in the Operations tab. As a result, new attributes are created for the point data set.
Interval and average velocity attributes positioned at the midpoints between existing values are among them. These can now be used for various purposes in Petrel.
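The Dix conversion mentioned above turns stacking (approximately RMS) velocities into interval velocities. A minimal sketch of the calculation at a single location is given below; the function name and inputs are illustrative, not the Petrel point operation itself.

import numpy as np

def dix_interval_velocities(t0_s, v_stack):
    """Dix conversion of stacking (approx. RMS) velocities to interval velocities.

    t0_s    : zero-offset two-way times of the velocity picks (increasing).
    v_stack : stacking velocities at those times.
    Returns interval velocities positioned at the midpoints between picks.
    """
    t = np.asarray(t0_s, float)
    v = np.asarray(v_stack, float)
    num = v[1:] ** 2 * t[1:] - v[:-1] ** 2 * t[:-1]
    den = t[1:] - t[:-1]
    v_int = np.sqrt(np.maximum(num / den, 0.0))
    t_mid = 0.5 * (t[1:] + t[:-1])
    return t_mid, v_int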
  • 64. In the Operations Tab, it is possible to extract surface-consistent interval velocity points between a top and a bottom layer. These scatter values can then be edited, visualized and mapped to derive Interval velocity maps. The Interval velocity maps can be assigned in the Make velocity model process to create a velocity model. Alternatively, these points can be sampled into a 3D grid (see Creating a grid for velocity modeling) and further the velocity interpolated using data analysis and petrophysical modeling processes. Once velocities have been created in the 3D grid, they can be used directly as input to the velocity modeling process or average maps created from the velocity properties settings and used as input. Z vs. t or Z vs. V plots can be made directly from the stacking velocities in a function window. A function can then be created to pass through these points and describe the relationship. If a straight line is used, then V0 and k will be reported on the function's settings dialog on the Function tab. How to extract interval velocity points from stacking velocities for depth conversion:- 1. Import the stacking velocities as points with attributes or any of the predefined formats. 2. Use the Dix conversion under seismic operations on the Operations tab of the points settings dialog to create a set of average velocity points. 3. Open the settings for the point data set and go to the Operations tab. 4. Expand the Velocity conversion folder and select Interval velocity calculation (surface,surface). 5. Make sure the average velocity attribute is used and enter the bordering surfaces as top and base elevation. Run the operation. 6. The new datapoint set can be used to grid a surface using the Make/edit surface process. Make sure to use the Interval velocity attribute for gridding. 7. The resulting surface can be used in the Make velocity model process.
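As a worked example of the V=Vo+kZ model listed above: if the instantaneous velocity increases linearly with depth, the one-way time satisfies t = (1/k) * ln(1 + k*Z/Vo), so depth can be recovered from time as Z = (Vo/k) * (exp(k*t) - 1). A small sketch, written with depth positive downwards and positive k (the opposite sign convention to Petrel's) and with illustrative default values, is:

import numpy as np

def depth_from_twt(twt_s, v0_m_s=1800.0, k_per_s=0.5):
    """Depth conversion for the linear velocity model V = V0 + k*Z.

    Depth is taken positive downwards here; Petrel's convention uses
    negative Z (elevation) and hence negative k.
    One-way time t = (1/k)*ln(1 + k*Z/V0)  =>  Z = (V0/k)*(exp(k*t) - 1).
    """
    t_oneway = np.asarray(twt_s, float) / 2.0
    return (v0_m_s / k_per_s) * np.expm1(k_per_s * t_oneway)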
  • 66. Figure :- Interval velocity surfaces between the bounding surfaces Figure :- The velocity model computed from stacking velocities Once a velocity model has been created, it can be used to depth convert objects.
  • 67. Figure :- Time surfaces and their corresponding depth-converted surfaces (ix) ADVANCED SEISMIC INTERPRETATION Geobody Interpretation:- A geobody is a 3D object extracted from a seismic volume. A geobody can contain several sub-elements (geoblobs). We can build up our 3D object representation by repeatedly extracting many sub-elements. The geobody can be used as input to a property model, to surfaces, or as a voxel mask for further seismic visualization. Petrel Geobody interpretation (PGI) comes as a package installed on top of Petrel. It uses state-of-the-art technology to identify, isolate, extract and make discrete an object seen in seismic data. The extracted object is called a geobody. The geobody can then be used as input to property modeling, surface generation, or further seismic visualization tasks. Geobody extraction:- Normally geobodies (e.g. a buried river channel) are not seen very clearly in a seismic section; therefore several types of attributes are applied to the seismic section to visualize the zone of interest clearly. After identifying the zone of interest, the geobody is extracted from the seismic section.
  • 68. Figure :- time slice of seismic data Note from the above figure that the signature of the buried river channel is not very clear. If we apply some attributes to it, the signature of the buried river channel becomes clear, which makes the extraction of the geobody much easier. The application of some attributes is shown in the figures below:-
  • 69. Figure :- application of the envelope attribute Figure :- blending of two attributes (sweetness and envelope) A geobody is extracted based on its opacity threshold value. By default, the threshold value is set to 20%. If a cell has an opacity value of less than 20%, it will not be included in the geobody. This value can be edited by double-clicking on the Geobody interpretation process and changing the Voxel connectivity opacity threshold value. This threshold is also visible as a dashed green line in the opacity settings histogram. The steps of geobody extraction are given below:- 1:- After visualizing the zone of interest clearly using several types of attributes, we insert a box probe. The box probe is a rectangular volume that can be resized and tilted.
  • 70. Figure :- the box probe 2:- Then, using the opacity option in the settings of the box probe, we can visualize only the zone of interest and hide the remaining zone. 3:- Now, after converting the box probe into a seismic cube, we choose the clipping polygon option from the Function bar and extract the geobody.
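Conceptually, this extraction keeps the connected voxels that pass the opacity cut-off. A toy illustration of that idea (thresholding followed by 3D connected-component labelling) is given below; it is not the PGI algorithm, and the threshold and minimum-size values are arbitrary.

import numpy as np
from scipy.ndimage import label

def extract_geobodies(attribute_cube, opacity_threshold, min_voxels=500):
    """Toy geobody extraction: keep voxels above an attribute threshold
    (analogous to the 20% opacity cut-off) and group them into connected bodies."""
    mask = attribute_cube >= opacity_threshold
    labels, n_bodies = label(mask)          # 3D voxel connectivity
    geobodies = []
    for body_id in range(1, n_bodies + 1):
        voxels = np.argwhere(labels == body_id)
        if len(voxels) >= min_voxels:       # drop tiny, noise-like blobs
            geobodies.append(voxels)
    return geobodies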
  • 72. Figure :- Extracted Geobody Horizon/Surface probe The horizon probe is an irregular probe that follows one or two horizon interpretations/surfaces. It is also referred to as "sculpting" in the industry. After making a horizon we can turn the corresponding horizon into a horizon/surface probe. Figure :- horizon whose corresponding horizon probe is shown below
  • 73. Figure :- Horizon Probe Well probe The well probe is a cylindrical volume following a well path that can be resized and prolonged. Creating a well probe along a proposed well path can be useful for several purposes. The well probe below uses an Ant Tracking volume to see whether there are any fault or fracture zones along the proposed well path. The steps for creating a well probe are given below:- 1:- First we choose the well along which we want to create the well probe
  • 74. Figure :- selecting wells for making the well probe 2:- Now, by inserting the well probe and changing its opacity, we can understand the behaviour of the medium surrounding the well, i.e. whether there are potential zones or fault planes along the well path. Figure :- well probe
  • 75. Figure:- well probe before applying the opacity Figure :- well probe after applying the opacity
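Geometrically, such a probe is simply a cylinder of voxels around the well path. A rough sketch for a near-vertical well, indexed in inline/crossline bins, is given below; the names and radius are purely illustrative.

import numpy as np

def well_probe_mask(cube_shape, well_il, well_xl, radius_bins=5):
    """Voxel mask for a cylindrical probe around a near-vertical well,
    given its inline/crossline position in bin units."""
    n_il, n_xl, n_t = cube_shape
    il, xl = np.meshgrid(np.arange(n_il), np.arange(n_xl), indexing="ij")
    inside = (il - well_il) ** 2 + (xl - well_xl) ** 2 <= radius_bins ** 2
    # Extend the 2D disc down the full time axis of the cube.
    return np.repeat(inside[:, :, np.newaxis], n_t, axis=2)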
  • 76. 6:- RESULTS AND DISCUSSION:  Petrel allows us to analyze and visualize different types of data (e.g. well data, check-shot data, seismic data, well tops (stratigraphic markers) and velocity data) in a single interface.  The seismic well-tie process helps us to correlate the stratigraphic markers with the seismic section.  After synthetic correlation, several regional stratigraphic markers (Caracas, Kobe, Paris, etc.) have been interpreted.  Major structures (mainly faults and a salt dome) have also been interpreted to define the structural framework of the area.  Most of the faults are oriented in the east-west direction.  While interpreting the horizons, a sharp boundary (a salt dome) was identified; the major reflectors terminate against this salt dome, which is visible from the amplitude contrast.  During the interpretation we generated several structural and stratigraphic attributes to characterize the reservoir properties.  The Variance and Ant-Track attributes help us to detect the structural framework of the area, mainly faults and other discontinuities.  Similarly, stratigraphic attributes such as sweetness, relative acoustic impedance and genetic inversion help us to identify some key prospective zones.  The same features can be confirmed by generating surface attributes along the surface of interest.  With the available velocity data (SEG-Y cube, stacking velocities, check shots etc.) all the time interpretations have been converted to depth, and structural maps of the time and depth surfaces have been prepared.  The signature of the river channel in the time slice has been enhanced and visualized in 3D by applying several volume attributes and generating the geobody of the river channel.
  • 77. References:- 1:- Bacon, M., Simm, R. and Redshaw, T., 2003, 3-D Seismic Interpretation, Cambridge University Press. 2:- Coffeen, J. A., 1984, Interpreting Seismic Data, PennWell, Tulsa. 3:- Brouwer, J. and Helbig, K., 1998, Shallow High-Resolution Reflection Seismics, Handbook of Geophysical Exploration, Volume 19, Elsevier Science, Amsterdam. 4:- Dobrin, M. B. and Savit, C. H., 1988, Introduction to Geophysical Prospecting, McGraw-Hill Inc., New York, USA, pp. 25-430. 5:- Robinson, E. S., 1996; Peacock and Treitel, 1969, Basic Exploration Geophysics. 6:- Sheriff, R. E. and Geldart, L. P., 1982, Exploration Seismology, Cambridge University Press, Cambridge. 7:- Ricker, N., 1953, Wavelet contraction, wavelet expansion, and the control of seismic resolution: Geophysics, 18, 769-792.