2. Introduction to seismic inversion
• Most seismic and petrophysical inversion techniques are model-based inversions.
• They start with an initial guess model.
• They then compute the synthetic response for the guess model.
• The synthetic is compared with the observed data to evaluate the misfit between the observed and synthetic responses; different models give different misfits.
• This is repeated until a model is found that gives the least misfit.
• Thus, these inversion algorithms are mainly search algorithms: they search for the minimum-misfit model.
3. Simulated Annealing is one of the search algorithms…!
[Figure: misfit vs. model, showing a starting point, the descent direction, a local minimum, the global minimum, and the barrier that blocks local search.]
• Local search techniques, such as the steepest descent method, are very good at finding local minima.
• However, difficulties arise when the global minimum is different from the local minimum that is found.
• Since all the immediate neighboring points around a local minimum have worse misfit than it, local search cannot proceed once it is trapped at a local minimum.
• We need some mechanism that can help us escape the trap of local minima, and simulated annealing is one such method.
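To make the trap concrete, here is a minimal sketch (the two-valley misfit function and the starting points are my own illustration, not from the slides): plain steepest descent started on the wrong side of the barrier settles into the higher valley and never reaches the global minimum.

```python
import math

# Toy 1-D misfit with two valleys: the global minimum near x = -2.1 and a
# higher local minimum near x = 2.9 (function chosen for illustration only).
def misfit(x):
    return 0.5 * (x - 3.0) ** 2 * (x + 2.0) ** 2 + 2.0 * x

def grad(x):
    # Central finite difference keeps the sketch short.
    h = 1e-6
    return (misfit(x + h) - misfit(x - h)) / (2 * h)

def steepest_descent(x0, step=1e-3, iters=20000):
    # Pure downhill moves: the search can only slide into the nearest valley.
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

# Started right of the barrier, descent is trapped at the local minimum;
# started left of it, the same method finds the (lower) global minimum.
x_local = steepest_descent(5.0)
x_global = steepest_descent(-4.0)
```

The two runs differ only in the starting point, which is exactly the dependence simulated annealing is meant to remove.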
4. Simulated Annealing Process
• The name "simulated annealing" originates from the simulation of the annealing process of heated solids.
• Annealing denotes a physical process in which a solid in a heat bath is heated by raising the temperature of the bath to a maximum value, at which all particles of the solid arrange themselves randomly in the liquid phase, followed by cooling through slowly lowering the temperature of the heat bath.
• In this way, all particles arrange themselves in the low-energy ground state of the corresponding lattice.
• In global optimization problems, we make an analogy to this process.
• The basic idea is that by occasionally allowing the search to proceed in an unfavorable (uphill) direction, we might escape the trap of a local minimum and reach the global minimum.
5. Simulated Annealing Algorithm
1. ‘Misfit’ = (Observed − Synthetic)²
2. Let ‘Misfit’ be an evaluating function that measures the quality of a model.
3. The lesser the misfit, the better the model.
4. Evaluate the misfit for the starting model.
5. Now choose a neighboring point (model).
6. Evaluate the misfit for this neighboring model.
7. Accept this model with some probability (P).
8. The expression for the probability (P) of accepting a solution is derived from the annealing process.
[Figure: the misfit curve from before; the probability expression from annealing helps the search jump the barrier near the local minimum.]
This probabilistic approach allows us to accept a neighboring ‘bad’ model (i.e., one with greater misfit) and hence escape a valley containing a local minimum.
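The numbered steps above can be sketched as a short loop. The two-valley misfit function, neighbor step size, and cooling factor are illustrative assumptions, not values from the slides; the acceptance rule is the heat-bath form P = 1/(1 + exp(∆/T)) described in the Algorithm-Details slide.

```python
import math
import random

def misfit(x):
    # Toy 1-D misfit: global minimum near x = -2.1, higher local minimum near x = 2.9.
    return 0.5 * (x - 3.0) ** 2 * (x + 2.0) ** 2 + 2.0 * x

def simulated_annealing(x0, t0=1e10, cooling=0.95, iters=3000, step=0.5, seed=7):
    rng = random.Random(seed)
    x = x0
    fx = misfit(x)                              # step 4: misfit of the starting model
    best_x, best_f = x, fx
    t = t0                                      # T starts very large (here 10^10)
    for _ in range(iters):
        x_new = x + rng.uniform(-step, step)    # step 5: choose a neighboring model
        f_new = misfit(x_new)                   # step 6: misfit of the neighbor
        delta = f_new - fx                      # change in misfit (analogous to energy)
        # steps 7-8: accept with probability P = 1 / (1 + exp(delta / T));
        # the exponent is clamped to avoid overflow as T -> 0.
        z = max(min(delta / t, 700.0), -700.0)
        if rng.random() < 1.0 / (1.0 + math.exp(z)):
            x, fx = x_new, f_new
        if fx < best_f:                         # remember the best model seen so far
            best_x, best_f = x, fx
        t *= cooling                            # slowly reduce T toward 0
    return best_x, best_f
```

Early on, with T huge, roughly half of all proposals are accepted regardless of misfit, so the search wanders across valleys; once T is small, only improving moves survive and the search settles into one valley.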
6. Algorithm-Details
• The probability of accepting a neighboring model is given by: P = 1 / (1 + exp(∆/T))
   – ∆ is the change in misfit as we go from the current model to the neighboring model, analogous to the change in energy in annealing.
   – T is a control parameter, analogous to ‘temperature’ in the thermodynamic description of annealing.
• The value of T is changed with each iteration.
• Initially, T is very large, say 10^10.
• Cooling is then done, i.e., the value of T is reduced slowly toward 0.
When T is very large:
• The exponential term becomes exp(∆/∞) = exp(0) = 1.
• Thus, P = ½.
• Hence, there is a probability (of ½) that a neighboring ‘bad’ model giving a higher misfit is accepted.
• This enables us to escape the ‘local barrier’ discussed earlier.
When T reaches 0:
• If the new misfit > the old misfit, then ∆ > 0.
• The exponential term exp(∆/T) → ∞ if ∆ > 0 and → 0 if ∆ < 0 (the exponent goes to −∞).
• Thus, P = 0 when ∆ > 0 and P = 1 when ∆ < 0.
• Therefore, only ‘good’ models with lesser misfit are accepted now.
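The two limiting cases above can be checked numerically. This is a minimal sketch assuming the heat-bath form P = 1/(1 + exp(∆/T)) given above:

```python
import math

def accept_prob(delta, t):
    # P = 1 / (1 + exp(delta / T)); the exponent is clamped so exp() cannot
    # overflow as T -> 0.
    z = max(min(delta / t, 700.0), -700.0)
    return 1.0 / (1.0 + math.exp(z))

hot = accept_prob(5.0, 1e10)        # T very large: P ~ 1/2 even for a worse model
cold_bad = accept_prob(5.0, 1e-12)  # T -> 0, delta > 0: P -> 0 (reject bad models)
cold_good = accept_prob(-5.0, 1e-12)  # T -> 0, delta < 0: P -> 1 (accept good models)
```

The values confirm the slide's limits: about 0.5 when hot, and essentially 0 or 1 when cold depending on the sign of ∆.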
7. Algorithm details…
• This algorithm helps search for the model giving the global minimum misfit.
• It initially searches for the global minimum by jumping between valleys, but later (when T ≈ 0) it becomes trapped in a valley, ideally the one containing the global minimum.
• But this is a double-edged sword:
   – Desired effect: it helps escape local minima.
   – Adverse effect: it might pass the global minimum after reaching it.
8. Algorithm- Advantages and Disadvantages
A. Strengths
• can deal with highly nonlinear models, chaotic and noisy data, and many constraints.
• is a robust and general technique.
• main advantages over other local search methods are its flexibility and its ability to approach global optimality.
• is quite versatile since it does not rely on any restrictive properties of the model
B. Weaknesses:
• a lot of choices are required to turn it into an actual algorithm.
• there is a clear tradeoff between the quality of the solutions and the time required to compute them.
• delicate tailoring work is required to account for different classes of constraints and to fine-tune the parameters of
the algorithm.
• the precision of the numbers used in implementation can have a significant effect upon the quality of the outcome.