Self-Adaptive Simulated Binary Crossover
     for Real-Parameter Optimization



   Kalyanmoy Deb*, S. Karthik*, Tatsuya Okabe**
                     GECCO 2007

         *Indian Institute of Technology Kanpur, India
        **Honda Research Institute, Japan



                       Reviewed by

                   Paskorn Champrasert
                   paskorn@cs.umb.edu
                   http://dssg.cs.umb.edu
Outline
• Simulated Binary Crossover (SBX) (revisit)
• Problem in SBX
• Self-Adaptive SBX
• Simulation Results
   – Self-Adaptive SBX for single-objective optimization problems
   – Self-Adaptive SBX for multi-objective optimization problems
• Conclusion


 November 7, 08        DSSG Group Meeting       2/31
SBX
            Simulated Binary Crossover
• Simulated Binary Crossover (SBX) was proposed in
  1995 by Deb and Agrawal.

• SBX was designed with respect to the one-point
  crossover properties in binary-coded GA.
    – Average Property:
      The average of the decoded parameter values is the same
      before and after the crossover operation.
    – Spread Factor Property:
      A spread factor β ≈ 1 is more likely to occur than any other
      β value.


Important Property of One-Point Crossover
Spread factor:
  The spread factor β is defined as the ratio of the spread of the
  offspring points to that of the parent points:

                 β = | (c1 − c2) / (p1 − p2) |

- Contracting crossover (β < 1):
    The offspring points are enclosed by the parent points.

- Expanding crossover (β > 1):
    The offspring points enclose the parent points.

- Stationary crossover (β = 1):
    The offspring points are the same as the parent points.
Spread Factor (β) in SBX
• The probability distribution of β in SBX should be similar in shape
  to the probability distribution of β in binary-coded one-point
  crossover.

• The probability distribution function is defined as:

      c(β) = 0.5(nc + 1) β^nc,          β ≤ 1
      c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

  [Figure: probability distribution c(β) for nc = 2, peaking at β = 1]
A large value of nc gives a higher probability of creating
'near-parent' solutions.

  contracting:   c(β) = 0.5(nc + 1) β^nc,          β ≤ 1

  expanding:     c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

Mostly, nc = 2 to 5.
Spread Factor (β) as a random number
• β is a random number that follows the proposed probability
  distribution function. To get a β value:
  1) A uniform random number u is generated in [0,1].
  2) The β value that makes the area under the curve equal to u is
     taken.

  [Figure: c(β) for nc = 2; u is the area under the curve up to β]

  Offspring:
      c1 = x̄ − 0.5 β (p2 − p1)
      c2 = x̄ + 0.5 β (p2 − p1)
  where x̄ is the mean of the two parents.
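The two steps above amount to inverse-CDF sampling of β. A minimal sketch in Python (the closed-form inverses follow from integrating c(β) in each branch; the function name and signature are mine):

```python
import random

def sbx_offspring(p1, p2, nc=2.0, rng=random):
    """SBX: sample beta by inverting the CDF of c(beta), then create
    two offspring placed symmetrically about the parent mean."""
    u = rng.random()
    if u <= 0.5:
        # contracting case: 0.5 * beta^(nc+1) = u
        beta = (2.0 * u) ** (1.0 / (nc + 1.0))
    else:
        # expanding case: 1 - 0.5 * beta^-(nc+1) = u
        beta = (2.0 * (1.0 - u)) ** (-1.0 / (nc + 1.0))
    mean = 0.5 * (p1 + p2)
    half_spread = 0.5 * beta * (p2 - p1)
    return mean - half_spread, mean + half_spread
```

Note that the average property holds by construction: c1 + c2 = p1 + p2 for every sampled β.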
SBX Summary
• Simulated Binary Crossover (SBX) is a real-parameter recombination
  operator that is commonly used in evolutionary algorithms (EAs) for
  optimization problems.

    – The SBX operator uses two parents and creates two offspring.

    – The SBX operator uses a parameter called the distribution index
      (nc), which is kept fixed (e.g., at 2 or 5) throughout a
      simulation run.
          • If nc is large, the offspring solutions are close to the parent solutions.
          • If nc is small, the offspring solutions are far from the parent solutions.

    – The distribution index (nc) has a direct effect in controlling
      the spread of offspring solutions.
Research Issues
•     Setting the distribution index (nc) to an appropriate value is
      an important task.
    –      The distribution index (nc) affects
          1) Convergence speed:
                if nc is very high, the offspring solutions are very close to the
                parent solutions, so the convergence speed is very low.
          2) Local/global optimum solutions:
             Past studies found that SBX-based GAs cannot work well on
             complicated problems (e.g., Rastrigin's function).
Self-Adaptive SBX
• This paper proposes a procedure for updating the distribution
  index (nc) adaptively to solve optimization problems.

    – The distribution index (nc) is updated every generation.

    – Whether the distribution index of the next generation (n̂c) is
      increased or decreased depends on whether the offspring
      outperforms (has a better fitness value than) its two parents.

• The next slides show the process of deriving an equation to update
  the distribution index.
Probability density per offspring
         Spread factor distribution

  (1)   c(β) = 0.5(nc + 1) β^nc,          β ≤ 1
  (2)   c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

  [Figure: c(β) for nc = 2; region (1) is β ≤ 1, region (2) is β > 1]

  An example: when p1 = 2, p2 = 5, and β̄ > 1.

  Offspring:
      c1 = x̄ − 0.5 β̄ (p2 − p1)
      c2 = x̄ + 0.5 β̄ (p2 − p1)
Self-Adaptive SBX (1)
Consider the case that β > 1:

         c1        p1        p2        c2

     β = (c2 − c1) / |p2 − p1|

     β = ((c2 − p2) + (p2 − p1) + (p1 − c1)) / (p2 − p1)

  Since the offspring are placed symmetrically about the parents,
     (c2 − p2) = (p1 − c1)

     β = 1 + 2(c2 − p2) / (p2 − p1)    --- (1)
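As a quick sanity check, identity (1) can be verified numerically against the offspring formulas from the earlier slide (the specific values p1 = 2, p2 = 5, β = 1.8 are arbitrary):

```python
# Verify identity (1): beta = 1 + 2(c2 - p2)/(p2 - p1) for beta > 1,
# using the SBX offspring formulas c1,2 = mean -/+ 0.5*beta*(p2 - p1).
p1, p2, beta = 2.0, 5.0, 1.8          # expanding case, beta > 1
mean = 0.5 * (p1 + p2)
c2 = mean + 0.5 * beta * (p2 - p1)
beta_from_eq1 = 1.0 + 2.0 * (c2 - p2) / (p2 - p1)
assert abs(beta_from_eq1 - beta) < 1e-12
```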
Self-Adaptive SBX (2)
  (1)   c(β) = 0.5(nc + 1) β^nc,          β ≤ 1
  (2)   c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

  [Figure: c(β); for the expanding case (2), the area under the curve
   from 1 to β equals u − 0.5]

     ∫₁^β 0.5(nc + 1) / β^(nc+2) dβ = u − 0.5    --- (2)
Self-Adaptive SBX (3)
   ∫₁^β 0.5(nc + 1) / β^(nc+2) dβ = u − 0.5    --- (2)

   ∫₁^β 0.5(nc + 1) β^(−(nc+2)) dβ = u − 0.5

   −0.5 β^(−nc−1) − (−0.5) = u − 0.5
   −0.5 β^(−nc−1) = u − 1
   log(β^(−nc−1)) = log(−(u − 1)/0.5)
   (−nc − 1) log β = log(2(1 − u))
   −(nc + 1) = log 2(1 − u) / log β

   nc = −(1 + log 2(1 − u) / log β)    --- (3)
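Equation (3) simply inverts the β-sampling relation, which can be checked numerically (the values of nc and u below are arbitrary): solving (2) for β gives β = (2(1 − u))^(−1/(nc+1)) for u > 0.5, and (3) recovers nc from the pair (u, β).

```python
import math

# Solve (2) for beta, then recover nc with equation (3).
nc, u = 2.0, 0.9
beta = (2.0 * (1.0 - u)) ** (-1.0 / (nc + 1.0))     # expanding case, beta > 1
nc_back = -(1.0 + math.log(2.0 * (1.0 - u)) / math.log(beta))
assert abs(nc_back - nc) < 1e-9
```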
Self-Adaptive SBX (4)
         c1        p1        p2        c2   ĉ2
  β > 1

• If c2 is better than both parent solutions,
    – the authors assume that the region in which the child
      solutions are created is better than the region in which
      the parent solutions are.
    – the authors intend to extend the child solution further away
      from the closest parent solution (p2) by a factor α (α > 1):

                 (ĉ2 − p2) = α(c2 − p2)    --- (4)
β Update
         c1        p1        p2        c2   ĉ2
  β > 1

     β = 1 + 2(c2 − p2)/(p2 − p1)    --- (1)
     (ĉ2 − p2) = α(c2 − p2)          --- (4)

from (1):
     β̂ = 1 + 2(ĉ2 − p2)/(p2 − p1)
       = 1 + 2((α(c2 − p2) + p2) − p2)/(p2 − p1)
       = 1 + 2α(c2 − p2)/(p2 − p1)

  and since 2(c2 − p2)/(p2 − p1) = β − 1,

     β̂ = 1 + α(β − 1)    --- (5)
Distribution Index (nc) Update
   nc = −(1 + log 2(1 − u) / log β)    --- (3)
   β̂ = 1 + α(β − 1)                    --- (5)

 The updated index n̂c is the one that produces the same u with the
 new spread factor β̂:

   n̂c = −(1 + log 2(1 − u) / log β̂)
   n̂c = −(1 + log 2(1 − u) / log(1 + α(β − 1)))

 From (3), log 2(1 − u) = −(nc + 1) log β, so

   n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))    --- (6)
Distribution Index (nc) Update
When c is better than both p1 and p2, the distribution index for the
next generation is calculated as (6). The distribution index will be
decreased in order to create an offspring (c') farther away from its
parent (p2) than the offspring (c) of the previous generation.

      n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))    --- (6)

 (n̂c is kept within [0, 50]; α is a constant, α > 1)
Distribution Index (nc) Update
When c is worse than both p1 and p2, the distribution index for the
next generation is calculated as (7). The distribution index will be
increased in order to create an offspring (c') closer to its
parent (p2) than the offspring (c) of the previous generation.

1/α is used instead of α:

      n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))    --- (6)

      n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α)   --- (7)

                                  α is a constant (α > 1)
Distribution Index (nc) Update
 So we are done with the expanding case (2).

 How about the contracting case (1)? The same process is applied
 for (1).

  (1)   c(β) = 0.5(nc + 1) β^nc,          β ≤ 1
  (2)   c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

  [Figure: c(β); for the contracting case (1), the area under the
   curve from 0 to β equals u]

     ∫₀^β 0.5(nc + 1) β^nc dβ = u    --- (x)
Distribution Index (nc) Update
                 for the contracting case
When c is better than both p1 and p2, the distribution index for the
next generation is calculated as (8). The distribution index will be
decreased in order to create an offspring (c') farther away from its
parent (p2) than the offspring (c) of the previous generation.

       n̂c = (1 + nc)/α − 1    --- (8)

 When c is worse than both p1 and p2, 1/α is used instead of α:

       n̂c = α(1 + nc) − 1     --- (9)

                                  α is a constant (α > 1)
Distribution Index Update Summary

                      Expanding case (u > 0.5)           Contracting case (u < 0.5)

c is better than parents:
                      n̂c = −1 + (nc + 1) log β           n̂c = (1 + nc)/α − 1
                               / log(1 + α(β − 1))

c is worse than parents:
                      n̂c = −1 + (nc + 1) log β           n̂c = α(1 + nc) − 1
                               / log(1 + (β − 1)/α)

When u = 0.5, n̂c = nc.
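The four update rules collapse into one function of u and a better/worse flag, since the worse case just replaces α with 1/α. A minimal Python sketch (the function name and the child_better flag are mine):

```python
import math

def update_nc(nc, beta, u, child_better, alpha=1.5):
    """Self-adaptive update of the SBX distribution index.

    Uses alpha when the child beats both parents (push the next
    offspring farther out) and 1/alpha when it is worse (pull it
    back in), matching the four cases in the summary table."""
    a = alpha if child_better else 1.0 / alpha
    if u > 0.5:    # expanding case, beta > 1
        return -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    if u < 0.5:    # contracting case, beta < 1
        return (1.0 + nc) / a - 1.0
    return nc      # u = 0.5: unchanged
```

For example, with nc = 2, β = 2, u = 0.8, and α = 1.5, a better child lowers the index (toward wider spreads) while a worse child raises it.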
Self-SBX Main Loop

  [Flowchart: initial population Pt → selection → Parent 1, Parent 2 →
   if rand < pc: crossover → Offspring C1, C2, and nc is updated;
   otherwise the parents pass through → mutation → population Pt+1]

   Parameters: N = 150, initial nc = 2, α = 1.5
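Under the assumption that selection is a binary tournament and mutation is omitted (the flowchart fixes only N, the initial nc, α, and pc; everything else below, including the small problem size, is a simplifying choice of mine), the loop might be sketched as:

```python
import math
import random

def sphere(x):
    """Sphere function: sum of squares (global minimum 0 at the origin)."""
    return sum(v * v for v in x)

def sbx_pair(p1, p2, nc, rng):
    """SBX on one variable: sample beta from c(beta) by inverse CDF."""
    u = rng.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (nc + 1.0))
    else:
        beta = (2.0 * (1.0 - u)) ** (-1.0 / (nc + 1.0))
    mean, half = 0.5 * (p1 + p2), 0.5 * beta * abs(p2 - p1)
    return mean - half, mean + half, u, beta

def update_nc(nc, beta, u, child_better, alpha):
    """Self-adaptive nc update (summary-table rules)."""
    a = alpha if child_better else 1.0 / alpha
    if u > 0.5:
        return -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    if u < 0.5:
        return (1.0 + nc) / a - 1.0
    return nc

def self_sbx(n_vars=5, pop_size=20, gens=50, pc=0.7, alpha=1.5, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(n_vars)] for _ in range(pop_size)]
    nc = 2.0
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            # binary tournament selection (an assumption; not on the slide)
            pa = min(rng.sample(pop, 2), key=sphere)
            pb = min(rng.sample(pop, 2), key=sphere)
            c1, c2 = pa[:], pb[:]
            if rng.random() < pc:
                for i in range(n_vars):
                    c1[i], c2[i], u, beta = sbx_pair(pa[i], pb[i], nc, rng)
                better = min(sphere(c1), sphere(c2)) < min(sphere(pa), sphere(pb))
                # simplification: update nc from the last variable's (u, beta),
                # clamped to [0, 50] as noted on the update slide
                nc = min(max(update_nc(nc, beta, u, better, alpha), 0.0), 50.0)
            nxt += [c1, c2]
        pop = nxt[:pop_size]
    return min(sphere(x) for x in pop)
```

The real experiments on the next slides use N = 150, n = 30, and pm = 1/30 with a mutation step; this sketch only illustrates how the nc update plugs into a generational loop.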
Simulation Results
Sphere Problem:

 SBX
 • Initial nc is 2.
 • Population size = 150
 • Pc = 0.9
 • The number of variables (n) = 30
 • Pm = 1/30
 • A run is terminated when a solution with a function value of 0.001
   is found.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5
Fitness Value

  [Figure: fitness value vs. function evaluations on the sphere
   problem]

• Self-SBX can find better solutions.
α Study
  [Figure: average number of function evaluations vs. α]

• α is varied in [1.05, 2].
• The effect of α is not significant, since the average number of
  function evaluations is similar across values.
Simulation Results
Rastrigin's Function:

 Many local optima, one global minimum (0).
 The number of variables = 20
 SBX
 • Initial nc is 2.
 • Population size = 100
 • Pc = 0.9
 • Pm = 1/30
 • A run is terminated when a solution with a function value of 0.001
   is found or the maximum of 40,000 generations is reached.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5
Fitness Value

  [Figure: best, median, and worst runs on Rastrigin's function]

• Self-SBX can find a solution very close to the global optimum.
α Study
  [Figure: average number of function evaluations vs. α]

• α is varied in [1.05, 10].
• The effect of α is not significant, since the average number of
  function evaluations is similar across values.
• α = 3 is the best.
Self-SBX in MOOP
  [Figure: hypervolume comparison]

• Self-SBX is applied in NSGA-II.
• A larger hypervolume is better.
Conclusion
• Self-SBX is proposed in this paper.
    – The distribution index is adjusted adaptively at run time.
• The simulation results show that Self-SBX can find better
  solutions in both single-objective and multi-objective
  optimization problems.

• However, a constant α must still be chosen to tune the
  distribution index.

A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024Results
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Servicegiselly40
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘RTylerCroy
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilV3cube
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 

Último (20)

Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of Brazil
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slides
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 

Self-Adaptive Simulated Binary Crossover for Real-Parameter Optimization

  • 1. Self-Adaptive Simulated Binary Crossover for Real-Parameter Optimization Kalyanmoy Deb*, S. Karthik*, Tatsuya Okabe** GECCO 2007 *Indian Institute, India **Honda Research Institute, Japan Reviewed by Paskorn Champrasert paskorn@cs.umb.edu http://dssg.cs.umb.edu
  • 2. Outline • Simulated Binary Crossover (SBX Crossover) (revisit) • Problem in SBX • Self-Adaptive SBX • Simulation Results – Self-Adaptive SBX for Single-OOP – Self-Adaptive SBX for Multi-OOP • Conclusion November 7, 08 DSSG Group Meeting 2/31
  • 3. SBX Simulated Binary Crossover • Simulated Binary Crossover (SBX) was proposed in 1995 by Deb and Agrawal. • SBX was designed with respect to the one-point crossover properties in binary-coded GA. – Average Property: The average of the decoded parameter values is the same before and after the crossover operation. – Spread Factor Property: A spread factor β ≈ 1 is more likely to occur than any other β value.
  • 4. Important Property of One-Point Crossover Spread factor: The spread factor β is defined as the ratio of the spread of the offspring points to that of the parent points: β = |c1 − c2| / |p1 − p2|. – Contracting crossover (β < 1): the offspring points are enclosed by the parent points. – Expanding crossover (β > 1): the offspring points enclose the parent points. – Stationary crossover (β = 1): the offspring points coincide with the parent points.
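As a concrete illustration of the three cases, here is a minimal sketch (my own example, not from the slides; `spread_factor` is a hypothetical helper name):

```python
def spread_factor(p1, p2, c1, c2):
    """beta = |c1 - c2| / |p1 - p2|: offspring spread over parent spread."""
    return abs(c1 - c2) / abs(p1 - p2)

print(spread_factor(2.0, 5.0, 3.0, 4.0))  # contracting: beta < 1
print(spread_factor(2.0, 5.0, 1.0, 6.0))  # expanding: beta > 1
print(spread_factor(2.0, 5.0, 2.0, 5.0))  # stationary: beta = 1
```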
  • 5. Spread Factor (β) in SBX • The probability distribution of β in SBX should be similar (e.g., same shape) to the probability distribution of β in binary-coded crossover. • The probability distribution function is defined as: c(β) = 0.5(nc + 1)β^nc for β ≤ 1, and c(β) = 0.5(nc + 1)/β^(nc+2) for β > 1. (Figure: probability distribution of β for nc = 2.)
  • 6. A large value of nc gives a higher probability of creating 'near-parent' solutions: c(β) = 0.5(nc + 1)β^nc for β ≤ 1 (contracting), c(β) = 0.5(nc + 1)/β^(nc+2) for β > 1 (expanding). Mostly, nc = 2 to 5.
  • 7. Spread Factor (β) as a random number • β is a random number that follows the proposed probability distribution function. To get a β value: 1) a uniform random number u is generated in [0,1]; 2) the β value that makes the area under the curve equal to u is taken. Offspring: c1 = x̄ − (1/2)β(p2 − p1), c2 = x̄ + (1/2)β(p2 − p1), where x̄ is the parent mean.
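The "area under the curve equals u" step has a closed form obtained by inverting the two branches of c(β). The following is a sketch of that standard SBX sampling step (my own illustration; `sbx_offspring` and its argument names are not from the paper):

```python
import random

def sbx_offspring(p1, p2, nc=2, rng=random.random):
    """One SBX crossover: draw u in [0,1], invert the beta distribution,
    and place two offspring symmetrically around the parent mean."""
    u = rng()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (nc + 1))                  # contracting branch
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1))  # expanding branch
    mean = 0.5 * (p1 + p2)
    half = 0.5 * beta * abs(p2 - p1)
    return mean - half, mean + half
```

Note that the offspring mean always equals the parent mean, which is exactly the average property the deck mentions on the next slide.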
  • 8. SBX Summary • Simulated Binary Crossover (SBX) is a real-parameter recombination operator commonly used in evolutionary algorithms (EAs) for optimization problems. – The SBX operator takes two parents and creates two offspring. – SBX uses a parameter called the distribution index (nc), which is kept fixed (e.g., 2 or 5) throughout a simulation run. • If nc is large, the offspring solutions are close to the parent solutions. • If nc is small, the offspring solutions are far from the parent solutions. – The distribution index (nc) thus directly controls the spread of the offspring solutions.
  • 9. Research Issues • Setting the distribution index (nc) to an appropriate value is an important task. – The distribution index (nc) affects 1) convergence speed: if nc is very high, the offspring solutions are very close to the parent solutions and convergence is very slow; 2) local/global optimum solutions: past studies found that SBX-based GAs do not work well on complicated problems (e.g., Rastrigin's function).
  • 10. Self-Adaptive SBX • This paper proposes a procedure for updating the distribution index (nc) adaptively to solve optimization problems. – The distribution index (nc) is updated every generation. – Whether the distribution index of the next generation (n̂c) is increased or decreased depends on whether the offspring outperforms (has a better fitness value than) its two parents. • The next slides derive the equation used to update the distribution index.
  • 11. Probability density per offspring • Spread factor distribution: (1) c(β) = 0.5(nc + 1)β^nc for β ≤ 1; (2) c(β) = 0.5(nc + 1)/β^(nc+2) for β > 1. An example with p1 = 2, p2 = 5 and β > 1. Offspring: c1 = x̄ − (1/2)β(p2 − p1), c2 = x̄ + (1/2)β(p2 − p1).
  • 12. Self-Adaptive SBX (1) Consider the case β > 1 (c1 < p1 < p2 < c2): β = (c2 − c1)/|p2 − p1| = ((c2 − p2) + (p2 − p1) + (p1 − c1))/(p2 − p1). Since the offspring are symmetric about the parent mean, (c2 − p2) = (p1 − c1), so β = 1 + 2(c2 − p2)/(p2 − p1) --- (1)
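A quick numerical check (my own illustration) that Eq. (1) agrees with the definition of β for a symmetric expanding pair:

```python
# Symmetric expanding crossover: c1 < p1 < p2 < c2 with c2 - p2 == p1 - c1
p1, p2 = 2.0, 5.0
c1, c2 = 1.0, 6.0

beta_def = (c2 - c1) / abs(p2 - p1)            # definition of the spread factor
beta_eq1 = 1.0 + 2.0 * (c2 - p2) / (p2 - p1)   # Eq. (1)
print(beta_def, beta_eq1)                      # both equal 5/3
```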
  • 13. Self-Adaptive SBX (2) With (1) c(β) = 0.5(nc + 1)β^nc for β ≤ 1 and (2) c(β) = 0.5(nc + 1)/β^(nc+2) for β > 1, the area under the expanding branch up to β must equal u − 0.5: ∫₁^β 0.5(nc + 1)/t^(nc+2) dt = u − 0.5 --- (2)
  • 14. Self-Adaptive SBX (3) Solving (2): ∫₁^β 0.5(nc + 1) t^(−(nc+2)) dt = u − 0.5 ⇒ −0.5β^(−nc−1) − (−0.5) = u − 0.5 ⇒ −0.5β^(−nc−1) = u − 1 ⇒ β^(−nc−1) = 2(1 − u) ⇒ (−nc − 1) log β = log 2(1 − u) ⇒ nc = −(1 + log 2(1 − u) / log β) --- (3)
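Eq. (3) can be sanity-checked numerically (my own illustration; `nc_from_beta` is a hypothetical helper): sample β from the expanding branch for a known nc, then recover that nc from β and u.

```python
import math

def nc_from_beta(beta, u):
    """Eq. (3): recover nc from an expanding spread factor (u > 0.5)."""
    return -(1.0 + math.log(2.0 * (1.0 - u)) / math.log(beta))

nc, u = 2.0, 0.8
# Inverse of the expanding branch's CDF: beta^-(nc+1) = 2(1 - u)
beta = (2.0 * (1.0 - u)) ** (-1.0 / (nc + 1.0))
print(nc_from_beta(beta, u))  # recovers nc = 2 up to rounding
```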
  • 15. Self-Adaptive SBX (4) • If c2 is better than both parent solutions, – the authors assume that the region in which the child solutions are created is better than the region in which the parent solutions are, and – they intend to extend the child solution farther away from the closest parent solution (p2) by a factor α (α > 1): (ĉ2 − p2) = α(c2 − p2) --- (4)
  • 16. β Update From (1), β = 1 + 2(c2 − p2)/(p2 − p1), and (4), (ĉ2 − p2) = α(c2 − p2): β̂ = 1 + 2(ĉ2 − p2)/(p2 − p1) = 1 + 2((α(c2 − p2) + p2) − p2)/(p2 − p1) = 1 + 2α(c2 − p2)/(p2 − p1) = 1 + α(β − 1) --- (5)
  • 17. Distribution Index (nc) Update From (3), nc = −(1 + log 2(1 − u)/log β), so log 2(1 − u) = −(nc + 1) log β. Substituting β̂ = 1 + α(β − 1) from (5): n̂c = −(1 + log 2(1 − u)/log β̂) = −(1 + log 2(1 − u)/log(1 + α(β − 1))) = −1 + (nc + 1) log β / log(1 + α(β − 1)) --- (6)
  • 18. Distribution Index (nc) Update When c is better than p1 and p2, the distribution index for the next generation is calculated by (6). The distribution index is decreased in order to create an offspring (c') farther away from its parent (p2) than the offspring (c) of the previous generation: n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1)) --- (6) n̂c is kept within [0, 50]; α is a constant (α > 1).
  • 19. Distribution Index (nc) Update When c is worse than p1 and p2, the distribution index for the next generation is calculated by (7). The distribution index is increased in order to create an offspring (c') closer to its parent (p2) than the offspring (c) of the previous generation; 1/α is used instead of α in (6): n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α) --- (7) α is a constant (α > 1).
  • 20. Distribution Index (nc) Update So the expanding case (2) is done. What about the contracting case (1)? The same process is applied to (1): with c(β) = 0.5(nc + 1)β^nc for β ≤ 1, ∫₀^β 0.5(nc + 1) t^nc dt = u --- (x)
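Integral (x) evaluates to 0.5 β^(nc+1) = u, which gives the contracting β in closed form. A small numerical check (my own illustration; `beta_contracting` is a hypothetical helper):

```python
def beta_contracting(u, nc):
    """Closed-form solution of (x): 0.5 * beta**(nc+1) = u, for u <= 0.5."""
    return (2.0 * u) ** (1.0 / (nc + 1.0))

u, nc = 0.3, 2.0
b = beta_contracting(u, nc)
print(0.5 * b ** (nc + 1.0))  # recovers u (= 0.3) up to rounding
```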
  • 21. Distribution Index (nc) Update for the contracting case When c is better than p1 and p2, the distribution index for the next generation is calculated by (8). The distribution index is decreased in order to create an offspring (c') farther away from its parent (p2) than the offspring (c) of the previous generation: n̂c = (1 + nc)/α − 1 --- (8) When c is worse than p1 and p2, 1/α is used instead of α: n̂c = α(1 + nc) − 1 --- (9) α is a constant (α > 1).
  • 22. Distribution Index Update Summary • Expanding case (u > 0.5): if c is better than its parents, n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1)); if c is worse, n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α). • Contracting case (u < 0.5): if c is better, n̂c = (1 + nc)/α − 1; if c is worse, n̂c = α(1 + nc) − 1. • If u = 0.5, n̂c = nc.
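The whole summary table fits in one small function. This is a sketch of the update rule as I read it from the slides (function and argument names are mine):

```python
import math

def update_nc(nc, beta, u, child_better, alpha=1.5):
    """One self-adaptive update of the distribution index (summary table).

    child_better: True if the offspring beats both parents. Assumes alpha > 1
    and beta consistent with u (beta > 1 iff u > 0.5).
    """
    if u == 0.5:                                  # stationary crossover: keep nc
        return nc
    a = alpha if child_better else 1.0 / alpha    # worse offspring -> use 1/alpha
    if u > 0.5:                                   # expanding case, Eqs. (6)/(7)
        return -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    return (1.0 + nc) / a - 1.0                   # contracting case, Eqs. (8)/(9)

# Better offspring push nc down (wider spread); worse push it up.
print(update_nc(2.0, 1.5, 0.8, True))    # < 2
print(update_nc(2.0, 0.5, 0.3, False))   # 3.5
```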
  • 23. Self-SBX Main Loop From population Pt (N = 150, initial nc = 2, α = 1.5), two parents are selected; if rand < pc, crossover creates offspring C1 and C2 and nc is updated, otherwise the parents are copied; mutation is then applied to form population Pt+1.
  • 24. Simulation Results Sphere Problem: SBX settings • Initial nc is 2 • Population size = 150 • pc = 0.9 • The number of variables (n) = 30 • pm = 1/30 • A run is terminated when a solution with function value 0.001 is found • 11 runs are performed. Self-SBX settings • pc = 0.7 • α = 1.5
  • 25. Fitness Value • Self-SBX can find better solutions
  • 26. α Study • α is varied in [1.05, 2] • The effect of α is not significant, since the average number of function evaluations is similar.
  • 27. Simulation Results Rastrigin's Function: many local optima, one global minimum (0). The number of variables = 20. SBX settings • Initial nc is 2 • Population size = 100 • pc = 0.9 • pm = 1/30 • A run is terminated when a solution with function value 0.001 is found or the maximum of 40,000 generations is reached • 11 runs are performed. Self-SBX settings • pc = 0.7 • α = 1.5
  • 28. Fitness Value (Best / Median / Worst) • Self-SBX can find solutions very close to the global optimum
  • 29. α Study • α is varied in [1.05, 10] • The effect of α is not significant, since the average number of function evaluations is similar • α = 3 is the best
  • 30. Self-SBX in MOOP • Self-SBX is applied in NSGA-II. • A larger hyper-volume is better
  • 31. Conclusion • Self-SBX is proposed in this paper. – The distribution index is adjusted adaptively at run time. • The simulation results show that Self-SBX can find better solutions in single-objective and multi-objective optimization problems. • However, the constant α must still be chosen to tune the distribution-index update.