Advanced Information Theory in CVPR "in a Nutshell"
CVPR Tutorial
June 13-18, 2010
San Francisco, CA
Shape Matching with I-Divergences

Anand Rangarajan

Groupwise Point-set Pattern Registration
Given N point-sets, denoted {X^p, p ∈ {1, ..., N}}, the task of multiple point pattern matching (point-set registration) is to recover the spatial transformations which yield the best alignment of all shapes.




Problem Visualization

[Figure: several unaligned point-set shapes to be registered.]
Group-wise Point-set Registration



Principal Technical Challenges
    Solving for nonrigid deformations between point-sets with
    unknown correspondence is a difficult problem.

    How do we align all the point-sets in a symmetric manner so
    that there is no bias toward any particular point-set?




From point-sets to density functions

[Figure: a point set and its corresponding density function.]
Group-wise Point-set Registration



From point-sets to density functions
    Point sets are represented by probability density functions.
    Intuitively, if these point sets are aligned properly, the
    corresponding density functions should be similar.
Question: How do we measure the similarity between multiple
density functions?




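The representation above can be sketched as a kernel density estimate: an equal-weight Gaussian mixture with one component per point. This is a minimal illustration, not the talk's exact estimator; the bandwidth `sigma` is an arbitrary choice.

```python
import numpy as np

def gmm_density(points, sigma=0.05):
    """Equal-weight isotropic Gaussian mixture with one component per
    point -- a kernel density estimate of the point-set's shape density."""
    points = np.atleast_2d(np.asarray(points, dtype=float))
    n, d = points.shape
    norm = (2.0 * np.pi * sigma**2) ** (d / 2.0)

    def p(x):
        x = np.atleast_2d(np.asarray(x, dtype=float))
        # squared distances between each query point and each mixture center
        d2 = ((x[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma**2)).sum(axis=1) / (n * norm)

    return p

# a small 2-D point set and its density
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
p = gmm_density(X)
```

If two point sets are well aligned, evaluating their two densities at shared locations yields similar values, which is what the divergences below quantify.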
Divergence Measures

Kullback-Leibler divergence

    D_{KL}(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx

where p(x) and q(x) are the probability density functions.
J divergence

Given two probability density functions p and q, the symmetric KL divergence is defined as:

    J(p, q) = \frac{1}{2} \left( D_{KL}(p \| q) + D_{KL}(q \| p) \right)
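As a quick illustration of both measures, here is a discrete-distribution sketch (an assumption: finite distributions on a shared support, with a small epsilon for numerical safety):

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D_KL(p || q)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def j_divergence(p, q):
    """Symmetric KL: J(p, q) = (D_KL(p || q) + D_KL(q || p)) / 2."""
    return 0.5 * (kl(p, q) + kl(q, p))

p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])
```

Unlike D_KL, J is symmetric in its arguments, which is exactly why it is preferred when no density should play a privileged role.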
Motivating the JS divergence

Modeling two shapes X and Y as equal-weight mixture densities:

    p(X | \theta^{(1)}) = \prod_{i=1}^{N_1} \sum_{a=1}^{K_1} \frac{1}{K_1} p(X_i | \theta_a^{(1)}),
    p(Y | \theta^{(2)}) = \prod_{j=1}^{N_2} \sum_{b=1}^{K_2} \frac{1}{K_2} p(Y_j | \theta_b^{(2)})
Modeling the overlay of two shapes with identity of origin:

    p(X \cup Y | \theta^{(1)}, \theta^{(2)}) = p(X | \theta^{(1)}) \, p(Y | \theta^{(2)})
Modeling the overlay of two shapes without identity of origin (the merged set Z):

    p(Z | \theta^{(1)}, \theta^{(2)}) = \frac{N_1}{N_1 + N_2} p(Z | \theta^{(1)}) + \frac{N_2}{N_1 + N_2} p(Z | \theta^{(2)})
Likelihood Ratio

Which generative model do you prefer: the union of disparate shapes where identity of origin is preserved, or one combined shape where identity of origin is suppressed?

Likelihood ratio:

    \log \Lambda = \log \frac{p(Z | \theta^{(1)}, \theta^{(2)})}{p(X \cup Y | \theta^{(1)}, \theta^{(2)})}
                 = \log \frac{\frac{N_1}{N_1 + N_2} p(Z | \theta^{(1)}) + \frac{N_2}{N_1 + N_2} p(Z | \theta^{(2)})}{p(X | \theta^{(1)}) \, p(Y | \theta^{(2)})}

Z is understood to arise from a convex combination of two mixture models p(Z | \theta^{(1)}) and p(Z | \theta^{(2)}), where the weight of each mixture is proportional to the number of points (N_1 and N_2) in each set. The weak law of large numbers leads to the Jensen-Shannon divergence.
JS Divergence for multiple shapes

JS-divergence of shape densities:

    JS_\pi(P_1, P_2, \ldots, P_n) = H\left( \sum_i \pi_i P_i \right) - \sum_i \pi_i H(P_i)    (1)

where \pi = \{\pi_1, \pi_2, \ldots, \pi_n \mid \pi_i > 0, \sum_i \pi_i = 1\} are the weights of the probability densities P_i and H(P_i) is the Shannon entropy.
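Equation (1) can be sketched directly for discrete distributions (an illustrative assumption: all densities live on one shared finite support):

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p + eps)))

def js_divergence(densities, weights):
    """JS_pi(P_1, ..., P_n) = H(sum_i pi_i P_i) - sum_i pi_i H(P_i)."""
    P = np.asarray(densities, dtype=float)   # rows are distributions
    w = np.asarray(weights, dtype=float)     # convex weights, sum to 1
    mixture = w @ P                          # the pi-weighted mixture density
    return entropy(mixture) - float(sum(wi * entropy(pi) for wi, pi in zip(w, P)))
```

By concavity of the entropy, the result is nonnegative and vanishes exactly when all densities coincide, which makes it a natural unbiased alignment score over any number of shapes.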
Atlas estimation

Formulation using JS-divergence:

    JS_\beta(P_1, P_2, \ldots, P_N) + \lambda \sum_{i=1}^{N} \|L f^i\|^2
    = H\left( \sum_i \beta_i P_i \right) - \sum_i \beta_i H(P_i) + \lambda \sum_{i=1}^{N} \|L f^i\|^2 .

f^i is the deformation function corresponding to point set X^i; P_i = p(f^i(X^i)) is the probability density of the deformed point-set.
Multiple shapes: JS divergence

JS divergence in a hypothesis testing framework:

    Construct a likelihood ratio between i.i.d. samples drawn from a mixture (\sum_a \pi_a P_a) and i.i.d. samples drawn from a heterogeneous collection of densities (P_1, P_2, \ldots, P_N).

    The likelihood ratio is then

        \Lambda = \frac{\prod_{k=1}^{M} \sum_{a=1}^{N} \pi_a P_a(x_k)}{\prod_{a=1}^{N} \prod_{k_a=1}^{N_a} P_a(x_{k_a})} .

    The weak law of large numbers gives us the JS-divergence.
Group-wise Registration Results
Experimental results on four 3D hippocampus point sets.




Shape matching via CDF I-divergences

    Model each point-set by a cumulative distribution function (CDF).
    Quantify the distance among CDFs via an information-theoretic measure [typically the cumulative residual entropy (CRE)].
    Minimize the dissimilarity measure over the space of coordinate transformation parameters.
Havrda-Charvát CRE

HC-CRE: Let X be a random vector in R^d. We define the HC-CRE of X by

    \mathcal{E}_H(X) = - \int_{\mathbb{R}_+^d} (\alpha - 1)^{-1} \left( P^\alpha(|X| > \lambda) - P(|X| > \lambda) \right) d\lambda

where X = \{x_1, x_2, \ldots, x_d\}, \lambda = \{\lambda_1, \lambda_2, \ldots, \lambda_d\}, |X| > \lambda means |x_i| > \lambda_i for all i, and \mathbb{R}_+^d = \{x \in \mathbb{R}^d ; x_i \geq 0 ; i \in \{1, 2, \ldots, d\}\}.
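A 1-D numerical sketch of the HC-CRE, under stated assumptions (empirical survival function on a uniform grid, left Riemann-sum integration; grid size is an arbitrary choice):

```python
import numpy as np

def hc_cre(samples, alpha=2.0, n_grid=2000):
    """HC-CRE of a 1-D sample: -(alpha - 1)^{-1} times the integral of
    (S(lam)^alpha - S(lam)) d lam, with S(lam) = empirical P(|X| > lam)."""
    x = np.abs(np.asarray(samples, dtype=float))
    grid = np.linspace(0.0, x.max(), n_grid)
    S = (x[None, :] > grid[:, None]).mean(axis=1)   # empirical survival function
    integrand = -(S**alpha - S) / (alpha - 1.0)
    return float(np.sum(integrand[:-1] * np.diff(grid)))  # left Riemann sum
```

For alpha = 2 the integrand reduces to S(1 - S) >= 0, so the estimate is nonnegative, and a degenerate (constant) sample has zero HC-CRE.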
CDF-HC Divergence

CDF-HC Divergence: Given N cumulative probability distributions P_k, k \in \{1, \ldots, N\}, the CDF-HC divergence of the set \{P_k\} is defined as

    HC(P_1, P_2, \ldots, P_N) = \mathcal{E}_H\left( \sum_k \pi_k P_k \right) - \sum_k \pi_k \mathcal{E}_H(P_k)

where 0 \leq \pi_k \leq 1, \sum_k \pi_k = 1, and \mathcal{E}_H is the HC-CRE.
Let P = \sum_k \pi_k P_k. Then

    HC(P_1, P_2, \ldots, P_N)
    = -(\alpha - 1)^{-1} \left( \int_{\mathbb{R}_+^d} P^\alpha(X > \lambda) \, d\lambda - \sum_k \pi_k \int_{\mathbb{R}_+^d} P_k^\alpha(X_k > \lambda) \, d\lambda \right)
    = \sum_k \pi_k \int_{\mathbb{R}_+^d} P_k^2(X_k > \lambda) \, d\lambda - \int_{\mathbb{R}_+^d} P^2(X > \lambda) \, d\lambda    (\alpha = 2)
Dirac Mixture Model

    P_k(X_k > \lambda) = \frac{1}{D_k} \sum_{i=1}^{D_k} H(x_i, \lambda)

where H(x_i, \lambda) is the Heaviside function (equal to 1 if all components of x_i are greater than \lambda).
[Figure: the empirical survival function P_k(X_k > \lambda) of a 2-D Dirac mixture, a descending staircase surface.]
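Combining the Dirac-mixture survival functions with the alpha = 2 form of the CDF-HC divergence gives a direct estimator. This is a grid-based 2-D sketch under stated assumptions (uniform grid on [0, 1]^2, Riemann-sum integration; extent and resolution are arbitrary choices):

```python
import numpy as np

def survival(points, grid):
    """Empirical P(X > lam): fraction of points whose every coordinate
    strictly exceeds the grid location lam (the Heaviside/Dirac mixture)."""
    pts = np.asarray(points, dtype=float)
    return (pts[None, :, :] > grid[:, None, :]).all(axis=-1).mean(axis=-1)

def cdf_hc2(point_sets, weights, lo=0.0, hi=1.0, n=60):
    """CDF-HC at alpha = 2: sum_k pi_k * int P_k^2 - int P^2,
    integrated by a Riemann sum over a uniform 2-D grid."""
    xs = np.linspace(lo, hi, n)
    grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)
    cell = (xs[1] - xs[0]) ** 2              # area of one grid cell
    S = np.stack([survival(ps, grid) for ps in point_sets])
    w = np.asarray(weights, dtype=float)
    pooled = w @ S                           # survival of the pi-weighted mixture
    return float(cell * (w @ (S**2).sum(axis=1) - (pooled**2).sum()))
```

Identical point sets give exactly zero; misaligned sets give a positive value, by Jensen's inequality applied pointwise to t -> t^2.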
CDF-JS, PDF-JS & CDF-HC

[Figure: two example shape ensembles before registration and after CDF-JS, PDF-JS, and CDF-HC registration.]
2D Point-set Registration for CC

[Figure: seven corpus callosum point sets, their overlay before registration, and the aligned overlay after registration.]
With outliers

[Figure: point sets with outliers before registration, after PDF-JS registration, and after CDF-HC registration.]
With different α values

[Figure: registration results for α = 1.1, 1.3, 1.5, 1.7, 1.9, 2, 3, 4, and 5, starting from the same initial configuration.]
3D Point-set Registration for Duck

[Figure: four 3D duck point sets, shown before and after registration.]
3D Registration of Hippocampi

[Figure: four 3D hippocampus point sets, shown before and after registration.]
Group-Wise Registration Assessment

The Kolmogorov-Smirnov (KS) statistic was computed to measure the difference between the CDFs.

    With ground truth:

        \frac{1}{N} \sum_{k=1}^{N} D(F_g, F_k)

    Without ground truth:

        K = \frac{1}{N^2} \sum_{k,s=1}^{N} D(F_k, F_s)
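For 1-D marginals, the two-sample KS statistic and the no-ground-truth groupwise average above can be sketched as follows (the talk compares multivariate CDFs; this 1-D version is only illustrative):

```python
import numpy as np

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: sup_x |F_a(x) - F_b(x)|."""
    a = np.sort(np.asarray(a, dtype=float))
    b = np.sort(np.asarray(b, dtype=float))
    xs = np.concatenate([a, b])             # candidate points for the supremum
    Fa = np.searchsorted(a, xs, side="right") / a.size
    Fb = np.searchsorted(b, xs, side="right") / b.size
    return float(np.abs(Fa - Fb).max())

def groupwise_ks(samples):
    """K = (1/N^2) * sum_{k,s} D(F_k, F_s), the no-ground-truth score."""
    N = len(samples)
    return sum(ks_stat(samples[k], samples[s])
               for k in range(N) for s in range(N)) / N**2
```

K is zero when all empirical CDFs coincide and grows as the registered sets disagree, matching its use as an alignment score in the tables that follow.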
KS statistic for comparison

Table: KS statistic

    KS-statistic          CDF-JS    PDF-JS    CDF-HC
    Olympic Logo          0.1103    0.1018    0.0324
    Fish with outliers    0.1314    0.1267    0.0722

Table: Average nearest neighbor distance

    ANN distance          CDF-JS    PDF-JS    CDF-HC
    Olympic Logo          0.0367    0.0307    0.0019
    Fish with outliers    0.0970    0.0610    0.0446
KS statistic for comparison (contd.)


Table: Non-rigid group-wise registration assessment without ground truth
using KS statistics


                                 Before Registration     After Registration
      Corpus Callosum                  0.3226                  0.0635
 Corpus Callosum with outlier          0.3180                  0.0742
        Olympic Logo                   0.1559                  0.0308
             Fish                      0.1102                  0.0544
        Hippocampus                    0.2620                  0.0770
            Duck                       0.2287                  0.0160


KS statistic for comparison (contd.)


Table: Non-rigid group-wise registration assessment without ground truth
using average nearest neighbor distance


                                 Before Registration     After Registration
      Corpus Callosum                  0.0291                  0.0029
 Corpus Callosum with outlier          0.0288                  0.0092
        Olympic Logo                   0.0825                  0.0022
             Fish                      0.1461                  0.0601
        Hippocampus                    13.7679                 3.1779
            Duck                       15.4725                 0.3280


Discussion

    I-divergences for shape matching avoid the correspondence problem.
    Symmetric, unbiased registration and atlas estimation.
    Shape densities are modeled as Gaussian mixtures; cumulatives are directly estimated.
    JS (pdf- and cdf-based) and HC divergences are used.
    The estimated atlas is useful in model-based segmentation.

Principal component analysis and matrix factorizations for learning (part 2) ...Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...
 
A general survey of previous works on action recognition
A general survey of previous works on action recognitionA general survey of previous works on action recognition
A general survey of previous works on action recognition
 
ECCV2010: distance function and metric learning part 2
ECCV2010: distance function and metric learning part 2ECCV2010: distance function and metric learning part 2
ECCV2010: distance function and metric learning part 2
 
Cvpr2010 open source vision software, intro and training part vii point cloud...
Cvpr2010 open source vision software, intro and training part vii point cloud...Cvpr2010 open source vision software, intro and training part vii point cloud...
Cvpr2010 open source vision software, intro and training part vii point cloud...
 
Power%20 point[1]
Power%20 point[1]Power%20 point[1]
Power%20 point[1]
 
Catalogueprofessionnel2011
Catalogueprofessionnel2011Catalogueprofessionnel2011
Catalogueprofessionnel2011
 
CVPR2010: Sparse Coding and Dictionary Learning for Image Analysis: Part 3: O...
CVPR2010: Sparse Coding and Dictionary Learning for Image Analysis: Part 3: O...CVPR2010: Sparse Coding and Dictionary Learning for Image Analysis: Part 3: O...
CVPR2010: Sparse Coding and Dictionary Learning for Image Analysis: Part 3: O...
 
Fcv rep todorovic
Fcv rep todorovicFcv rep todorovic
Fcv rep todorovic
 
ECCV2008: MAP Estimation Algorithms in Computer Vision - Part 2
ECCV2008: MAP Estimation Algorithms in Computer Vision - Part 2ECCV2008: MAP Estimation Algorithms in Computer Vision - Part 2
ECCV2008: MAP Estimation Algorithms in Computer Vision - Part 2
 

Semelhante a CVPR2010: Advanced ITinCVPR in a Nutshell: part 5: Shape, Matching and Divergences

CVPR2010: Advanced ITinCVPR in a Nutshell: part 6: Mixtures
CVPR2010: Advanced ITinCVPR in a Nutshell: part 6: MixturesCVPR2010: Advanced ITinCVPR in a Nutshell: part 6: Mixtures
CVPR2010: Advanced ITinCVPR in a Nutshell: part 6: Mixtures
zukun
 
Triangle counting handout
Triangle counting handoutTriangle counting handout
Triangle counting handout
csedays
 

Semelhante a CVPR2010: Advanced ITinCVPR in a Nutshell: part 5: Shape, Matching and Divergences (20)

CVPR2010: Advanced ITinCVPR in a Nutshell: part 6: Mixtures
CVPR2010: Advanced ITinCVPR in a Nutshell: part 6: MixturesCVPR2010: Advanced ITinCVPR in a Nutshell: part 6: Mixtures
CVPR2010: Advanced ITinCVPR in a Nutshell: part 6: Mixtures
 
Mcgill3
Mcgill3Mcgill3
Mcgill3
 
Basics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programmingBasics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programming
 
Divergence center-based clustering and their applications
Divergence center-based clustering and their applicationsDivergence center-based clustering and their applications
Divergence center-based clustering and their applications
 
Multitask learning for GGM
Multitask learning for GGMMultitask learning for GGM
Multitask learning for GGM
 
1 - Linear Regression
1 - Linear Regression1 - Linear Regression
1 - Linear Regression
 
Divergence clustering
Divergence clusteringDivergence clustering
Divergence clustering
 
Slides: The dual Voronoi diagrams with respect to representational Bregman di...
Slides: The dual Voronoi diagrams with respect to representational Bregman di...Slides: The dual Voronoi diagrams with respect to representational Bregman di...
Slides: The dual Voronoi diagrams with respect to representational Bregman di...
 
The Probability that a Matrix of Integers Is Diagonalizable
The Probability that a Matrix of Integers Is DiagonalizableThe Probability that a Matrix of Integers Is Diagonalizable
The Probability that a Matrix of Integers Is Diagonalizable
 
Slides: A glance at information-geometric signal processing
Slides: A glance at information-geometric signal processingSlides: A glance at information-geometric signal processing
Slides: A glance at information-geometric signal processing
 
www.ijerd.com
www.ijerd.comwww.ijerd.com
www.ijerd.com
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking components
 
Matrix Models of 2D String Theory in Non-trivial Backgrounds
Matrix Models of 2D String Theory in Non-trivial BackgroundsMatrix Models of 2D String Theory in Non-trivial Backgrounds
Matrix Models of 2D String Theory in Non-trivial Backgrounds
 
Triangle counting handout
Triangle counting handoutTriangle counting handout
Triangle counting handout
 
Cd Simon
Cd SimonCd Simon
Cd Simon
 
Patch Matching with Polynomial Exponential Families and Projective Divergences
Patch Matching with Polynomial Exponential Families and Projective DivergencesPatch Matching with Polynomial Exponential Families and Projective Divergences
Patch Matching with Polynomial Exponential Families and Projective Divergences
 
Mixture Models for Image Analysis
Mixture Models for Image AnalysisMixture Models for Image Analysis
Mixture Models for Image Analysis
 
Slides: On the Chi Square and Higher-Order Chi Distances for Approximating f-...
Slides: On the Chi Square and Higher-Order Chi Distances for Approximating f-...Slides: On the Chi Square and Higher-Order Chi Distances for Approximating f-...
Slides: On the Chi Square and Higher-Order Chi Distances for Approximating f-...
 
Interpolation
InterpolationInterpolation
Interpolation
 
Linear models for classification
Linear models for classificationLinear models for classification
Linear models for classification
 

Mais de zukun

My lyn tutorial 2009
My lyn tutorial 2009My lyn tutorial 2009
My lyn tutorial 2009
zukun
 
ETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCVETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCV
zukun
 
ETHZ CV2012: Information
ETHZ CV2012: InformationETHZ CV2012: Information
ETHZ CV2012: Information
zukun
 
Siwei lyu: natural image statistics
Siwei lyu: natural image statisticsSiwei lyu: natural image statistics
Siwei lyu: natural image statistics
zukun
 
Lecture9 camera calibration
Lecture9 camera calibrationLecture9 camera calibration
Lecture9 camera calibration
zukun
 
Brunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer visionBrunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer vision
zukun
 
Modern features-part-4-evaluation
Modern features-part-4-evaluationModern features-part-4-evaluation
Modern features-part-4-evaluation
zukun
 
Modern features-part-3-software
Modern features-part-3-softwareModern features-part-3-software
Modern features-part-3-software
zukun
 
Modern features-part-2-descriptors
Modern features-part-2-descriptorsModern features-part-2-descriptors
Modern features-part-2-descriptors
zukun
 
Modern features-part-1-detectors
Modern features-part-1-detectorsModern features-part-1-detectors
Modern features-part-1-detectors
zukun
 
Modern features-part-0-intro
Modern features-part-0-introModern features-part-0-intro
Modern features-part-0-intro
zukun
 
Lecture 02 internet video search
Lecture 02 internet video searchLecture 02 internet video search
Lecture 02 internet video search
zukun
 
Lecture 01 internet video search
Lecture 01 internet video searchLecture 01 internet video search
Lecture 01 internet video search
zukun
 
Lecture 03 internet video search
Lecture 03 internet video searchLecture 03 internet video search
Lecture 03 internet video search
zukun
 
Icml2012 tutorial representation_learning
Icml2012 tutorial representation_learningIcml2012 tutorial representation_learning
Icml2012 tutorial representation_learning
zukun
 
Advances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer visionAdvances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer vision
zukun
 
Gephi tutorial: quick start
Gephi tutorial: quick startGephi tutorial: quick start
Gephi tutorial: quick start
zukun
 
EM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysisEM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysis
zukun
 
Object recognition with pictorial structures
Object recognition with pictorial structuresObject recognition with pictorial structures
Object recognition with pictorial structures
zukun
 
Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities
zukun
 

Mais de zukun (20)

My lyn tutorial 2009
My lyn tutorial 2009My lyn tutorial 2009
My lyn tutorial 2009
 
ETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCVETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCV
 
ETHZ CV2012: Information
ETHZ CV2012: InformationETHZ CV2012: Information
ETHZ CV2012: Information
 
Siwei lyu: natural image statistics
Siwei lyu: natural image statisticsSiwei lyu: natural image statistics
Siwei lyu: natural image statistics
 
Lecture9 camera calibration
Lecture9 camera calibrationLecture9 camera calibration
Lecture9 camera calibration
 
Brunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer visionBrunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer vision
 
Modern features-part-4-evaluation
Modern features-part-4-evaluationModern features-part-4-evaluation
Modern features-part-4-evaluation
 
Modern features-part-3-software
Modern features-part-3-softwareModern features-part-3-software
Modern features-part-3-software
 
Modern features-part-2-descriptors
Modern features-part-2-descriptorsModern features-part-2-descriptors
Modern features-part-2-descriptors
 
Modern features-part-1-detectors
Modern features-part-1-detectorsModern features-part-1-detectors
Modern features-part-1-detectors
 
Modern features-part-0-intro
Modern features-part-0-introModern features-part-0-intro
Modern features-part-0-intro
 
Lecture 02 internet video search
Lecture 02 internet video searchLecture 02 internet video search
Lecture 02 internet video search
 
Lecture 01 internet video search
Lecture 01 internet video searchLecture 01 internet video search
Lecture 01 internet video search
 
Lecture 03 internet video search
Lecture 03 internet video searchLecture 03 internet video search
Lecture 03 internet video search
 
Icml2012 tutorial representation_learning
Icml2012 tutorial representation_learningIcml2012 tutorial representation_learning
Icml2012 tutorial representation_learning
 
Advances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer visionAdvances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer vision
 
Gephi tutorial: quick start
Gephi tutorial: quick startGephi tutorial: quick start
Gephi tutorial: quick start
 
EM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysisEM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysis
 
Object recognition with pictorial structures
Object recognition with pictorial structuresObject recognition with pictorial structures
Object recognition with pictorial structures
 
Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities
 

Último

1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
QucHHunhnh
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
heathfieldcps1
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
PECB
 

Último (20)

psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docx
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural ResourcesEnergy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
Unit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptxUnit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptx
 

CVPR2010: Advanced ITinCVPR in a Nutshell: part 5: Shape, Matching and Divergences

  • 1. Advanced Information Theory in CVPR "in a Nutshell". CVPR Tutorial, June 13-18, 2010, San Francisco, CA. Shape Matching with I-divergences. Anand Rangarajan
  • 2. Shape Matching with I-Divergences
    Groupwise Point-set Pattern Registration: given N point-sets, denoted {X^p, p ∈ {1, ..., N}}, the task of multiple point pattern matching or point-set registration is to recover the spatial transformations that yield the best alignment of all shapes. 2/29
  • 5. Group-wise Point-set Registration Principal Technical Challenges Solving for nonrigid deformations between point-sets with unknown correspondence is a difficult problem. How do we align all the point-sets in a symmetric manner so that there is no bias toward any particular point-set? 4/29
  • 6. From point-sets to density functions 5/29
  • 7. From point-sets to density functions 5/29
  • 8. Group-wise Point-set Registration From point-sets to density functions Point sets are represented by probability density functions. Intuitively, if these point sets are aligned properly, the corresponding density functions should be similar. 6/29
  • 9. Group-wise Point-set Registration From point-sets to density functions Point sets are represented by probability density functions. Intuitively, if these point sets are aligned properly, the corresponding density functions should be similar. Question: How do we measure the similarity between multiple density functions? 6/29
  • 10. Divergence Measures
    Kullback-Leibler divergence:
        D_KL(p ‖ q) = ∫ p(x) log [ p(x) / q(x) ] dx
    where p(x), q(x) are the probability density functions. 7/29
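As a sketch of how this definition is used in practice (not part of the slides), the integral can be approximated on a discrete grid; the unit-variance Gaussians, grid limits, and tolerances below are illustrative assumptions:

```python
import numpy as np

def kl_divergence(p, q, dx):
    # Discretized D_KL(p || q) = integral of p(x) * log(p(x)/q(x)) dx
    mask = p > 0                      # treat 0 * log 0 as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx)

# Two unit-variance Gaussians on a 1D grid (illustrative test case)
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
gauss = lambda t, mu: np.exp(-0.5 * (t - mu) ** 2) / np.sqrt(2.0 * np.pi)
p, q = gauss(x, 0.0), gauss(x, 1.0)
d = kl_divergence(p, q, dx)
```

For equal-variance Gaussians the closed form is (mu1 - mu2)^2 / (2 sigma^2) = 0.5 here, which the grid sum reproduces closely; note the asymmetry D_KL(p‖q) ≠ D_KL(q‖p) in general.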
  • 11. Divergence Measures
    Kullback-Leibler divergence:
        D_KL(p ‖ q) = ∫ p(x) log [ p(x) / q(x) ] dx
    where p(x), q(x) are the probability density functions.
    J divergence: given two probability density functions p and q, the symmetric KL divergence is defined as
        J(p, q) = (1/2) [ D_KL(p ‖ q) + D_KL(q ‖ p) ]  7/29
  • 12. Motivating the JS divergence
    Modeling two shapes X and Y as mixture densities:
        p(X | θ^(1)) = ∏_{i=1}^{N1} (1/K1) Σ_{a=1}^{K1} p(X_i | θ_a^(1)),
        p(Y | θ^(2)) = ∏_{j=1}^{N2} (1/K2) Σ_{b=1}^{K2} p(Y_j | θ_b^(2))  8/29
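The slides do not pin down the component placement here; a common concrete choice, assumed in this sketch, is one equal-weight isotropic Gaussian per point, with an arbitrary bandwidth sigma = 0.1:

```python
import numpy as np

def mixture_density(points, queries, sigma=0.1):
    # Equal-weight isotropic Gaussian mixture: p(x) = (1/K) sum_a N(x; x_a, sigma^2 I)
    K, d = points.shape
    diff = queries[:, None, :] - points[None, :, :]   # (M, K, d) pairwise offsets
    sq = np.sum(diff ** 2, axis=-1)                   # squared distances to centers
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    return np.exp(-0.5 * sq / sigma ** 2).sum(axis=1) / (K * norm)

pts = np.array([[0.0, 0.0], [1.0, 0.0]])              # a toy 2-point "shape"
val = mixture_density(pts, np.array([[0.0, 0.0]]))[0]
```

With well-separated points, the density at a point is roughly half the peak of a single component, i.e. about 1/(4*pi*sigma^2) here; the full shape density in the slides is the product of such mixtures over all points.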
  • 13. Motivating the JS divergence
    Modeling the overlay of two shapes with identity of origin:
        p(X ∪ Y | θ^(1), θ^(2)) = p(X | θ^(1)) p(Y | θ^(2))  8/29
  • 14. Motivating the JS divergence
    Modeling the overlay of two shapes without identity of origin:
        p(Z | θ^(1), θ^(2)) = [N1/(N1+N2)] p(Z | θ^(1)) + [N2/(N1+N2)] p(Z | θ^(2))  8/29
  • 15. Likelihood Ratio
    Which generative model do you prefer? The union of disparate shapes where identity of origin is preserved, or one combined shape where the identity of origin is suppressed. Likelihood ratio:
        log Λ = log [ p(Z | θ^(1), θ^(2)) / p(X ∪ Y | θ^(1), θ^(2)) ]
              = log { ( [N1/(N1+N2)] p(Z | θ^(1)) + [N2/(N1+N2)] p(Z | θ^(2)) ) / ( p(X | θ^(1)) p(Y | θ^(2)) ) }
    Z is understood to arise from a convex combination of the two mixture models p(Z | θ^(1)) and p(Z | θ^(2)), where the weights of each mixture are proportional to the numbers of points N1 and N2 in each set. The weak law of large numbers leads to the Jensen-Shannon divergence. 9/29
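A hedged numeric sketch of this argument, with 1D Gaussians standing in for the fitted shape densities p(·|θ) (the means, sample sizes, and seed are illustrative assumptions): the normalized negative log-likelihood ratio converges to the weighted JS divergence.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(z, mu, s):
    return np.exp(-0.5 * ((z - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# Stand-ins for the two fitted shape densities
p1 = lambda z: gauss_pdf(z, 0.0, 1.0)
p2 = lambda z: gauss_pdf(z, 3.0, 1.0)

N1, N2 = 4000, 6000
X = rng.normal(0.0, 1.0, N1)          # samples "from shape 1"
Y = rng.normal(3.0, 1.0, N2)          # samples "from shape 2"
Z = np.concatenate([X, Y])            # pooled set: identity of origin suppressed
w1, w2 = N1 / (N1 + N2), N2 / (N1 + N2)

# log-likelihood of the convex-combination model minus the union model
log_num = np.sum(np.log(w1 * p1(Z) + w2 * p2(Z)))
log_den = np.sum(np.log(p1(X))) + np.sum(np.log(p2(Y)))
log_lr = log_num - log_den

# By the weak law of large numbers, -log_lr / (N1 + N2) approaches
# the weighted Jensen-Shannon divergence JS_w(p1, p2), in nats
js_estimate = -log_lr / (N1 + N2)
```

The estimate is nonnegative and bounded above by the entropy of the weights, -w1 log w1 - w2 log w2 (about 0.67 nats here), matching the JS divergence bounds.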
  • 16. JS Divergence for multiple shapes
    JS-divergence of shape densities:
        JS_π(P1, P2, ..., Pn) = H( Σ_i π_i P_i ) − Σ_i π_i H(P_i)    (1)
    where π = {π1, π2, ..., πn | π_i > 0, Σ_i π_i = 1} are the weights of the probability densities P_i and H(P_i) is the Shannon entropy. 10/29
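Equation (1) can be evaluated directly for densities sampled on a grid. This sketch (with illustrative 1D Gaussian densities, not from the slides) checks that identical densities give zero divergence and that the value respects the log n upper bound:

```python
import numpy as np

def shannon_entropy(p, dx):
    # Discretized differential entropy H(P) = -integral of p log p
    m = p > 0
    return float(-np.sum(p[m] * np.log(p[m])) * dx)

def js_divergence(densities, weights, dx):
    # JS_pi(P1..Pn) = H(sum_i pi_i P_i) - sum_i pi_i H(P_i)
    mix = np.tensordot(weights, densities, axes=1)
    return shannon_entropy(mix, dx) - sum(
        w * shannon_entropy(p, dx) for w, p in zip(weights, densities))

x = np.linspace(-8.0, 12.0, 4001)
dx = x[1] - x[0]
g = lambda mu: np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)
P = np.stack([g(0.0), g(0.0), g(4.0)])
d_same = js_divergence(P[:2], np.array([0.5, 0.5]), dx)  # identical densities
d_diff = js_divergence(P, np.full(3, 1.0 / 3.0), dx)     # one shifted density
```

Unlike KL, this quantity is symmetric in all n densities, which is exactly the unbiasedness property the group-wise formulation needs.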
  • 17. Atlas estimation
    Formulation using the JS-divergence:
        JS_β(P1, P2, ..., PN) + λ Σ_{i=1}^N ‖L f^i‖²
        = H( Σ_i β_i P_i ) − Σ_i β_i H(P_i) + λ Σ_{i=1}^N ‖L f^i‖²
    f^i is the deformation function corresponding to point-set X^i; P_i = p(f^i(X^i)) is the probability density of the deformed point-set. 11/29
  • 18. Multiple shapes: JS divergence
    JS divergence in a hypothesis-testing framework: construct a likelihood ratio between i.i.d. samples drawn from a mixture ( Σ_a π_a P_a ) and i.i.d. samples drawn from a heterogeneous collection of densities (P1, P2, ..., PN). The likelihood ratio is then
        Λ = [ ∏_{k=1}^M Σ_{a=1}^N π_a P_a(x_k) ] / [ ∏_{a=1}^N ∏_{k_a=1}^{N_a} P_a(x_{k_a}^a) ]
    The weak law of large numbers gives us the JS-divergence. 12/29
  • 19. Group-wise Registration Results Experimental results on four 3D hippocampus point sets. 13/29
  • 20. Shape matching via CDF I-divergences
    Model each point-set by a cumulative distribution function (CDF).
    Quantify the distance among CDFs via an information-theoretic measure [typically the cumulative residual entropy (CRE)].
    Minimize the dissimilarity measure over the space of coordinate-transformation parameters. 14/29
  • 21. Havrda-Charvát CRE
    HC-CRE: let X be a random vector in R^d. We define the HC-CRE of X by
        E_H(X) = −(α − 1)^{−1} ∫_{R+^d} ( P^α(|X| > λ) − P(|X| > λ) ) dλ
    where X = {x1, x2, ..., xd}, λ = {λ1, λ2, ..., λd}, |X| > λ means |x_i| > λ_i, and R+^d = {x ∈ R^d : x_i ≥ 0, i ∈ {1, 2, ..., d}}. 15/29
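As a sanity-check sketch (1D case, α = 2; the uniform distribution is an illustrative assumption, not from the slides), the HC-CRE integral can be approximated by discretizing λ. For X uniform on [0, 1], the survival function is P(X > λ) = 1 − λ, and the closed form is E_H = 1/6:

```python
import numpy as np

def hc_cre(survival, lam_grid, alpha=2.0):
    # E_H(X) = -(alpha - 1)^{-1} * integral of (S(lam)^alpha - S(lam)) d lam,
    # where S(lam) = P(|X| > lam) is the survival function
    S = survival(lam_grid)
    dl = lam_grid[1] - lam_grid[0]
    return float(-np.sum(S ** alpha - S) * dl / (alpha - 1.0))

# Uniform on [0, 1]: S(lam) = 1 - lam; for alpha = 2,
# E_H = integral of (S - S^2) = 1/2 - 1/3 = 1/6
lam = np.linspace(0.0, 1.0, 10001)
val = hc_cre(lambda l: 1.0 - l, lam)
```

Like the Shannon entropy it replaces, E_H is concave in the distribution, which is what makes the CDF-HC divergence below nonnegative.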
  • 22. CDF-HC Divergence
    CDF-HC divergence: given N cumulative probability distributions P_k, k ∈ {1, ..., N}, the CDF-HC divergence of the set {P_k} is defined as
        HC(P1, P2, ..., PN) = E_H( Σ_k π_k P_k ) − Σ_k π_k E_H(P_k)
    where 0 ≤ π_k ≤ 1, Σ_k π_k = 1, and E_H is the HC-CRE. 16/29
  • 23. CDF-HC Divergence
    Let P̄ = Σ_k π_k P_k. Then
        HC(P1, P2, ..., PN)
        = −(α − 1)^{−1} ( ∫_{R+^d} P̄^α(X > λ) dλ − Σ_k π_k ∫_{R+^d} P_k^α(X_k > λ) dλ )
        = Σ_k π_k ∫_{R+^d} P_k²(X_k > λ) dλ − ∫_{R+^d} P̄²(X > λ) dλ    (α = 2)  17/29
  • 24. Dirac Mixture Model
        P_k(X_k > λ) = (1/D_k) Σ_{i=1}^{D_k} H(x, x_i)
    where H(x, x_i) is the Heaviside function (equal to 1 if all components of x are greater than x_i). [Figure: a 2D empirical CDF surface rising in steps from 0 to 1.] 18/29
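Combining the Dirac-mixture CDF with the α = 2 form of CDF-HC gives a simple grid-based estimate. The sketch below is 1D with illustrative Gaussian samples (the slides use multivariate CDFs), and `empirical_survival` is a hypothetical helper name:

```python
import numpy as np

def empirical_survival(points, lam_grid):
    # Dirac-mixture model: P(X > lam) = (1/D) sum_i H(x_i - lam)
    return (points[None, :] > lam_grid[:, None]).mean(axis=1)

def cdf_hc(point_sets, lam_grid, weights=None):
    # alpha = 2 case: HC = sum_k pi_k * int P_k^2 d lam - int (sum_k pi_k P_k)^2 d lam
    n = len(point_sets)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights)
    dl = lam_grid[1] - lam_grid[0]
    S = np.stack([empirical_survival(ps, lam_grid) for ps in point_sets])
    mix = w @ S
    return float((w @ np.sum(S ** 2, axis=1) - np.sum(mix ** 2)) * dl)

rng = np.random.default_rng(1)
lam = np.linspace(-5.0, 10.0, 3001)
A = rng.normal(0.0, 1.0, 200)
B = rng.normal(0.0, 1.0, 200)   # roughly aligned with A
C = rng.normal(4.0, 1.0, 200)   # shifted point set
d_close = cdf_hc([A, B], lam)
d_far = cdf_hc([A, C], lam)
```

The integrand Σ_k π_k P_k² − P̄² is a pointwise variance, so the estimate is nonnegative, small for aligned sets, and large for the shifted one.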
  • 25. CDF-JS, PDF-JS & CDF-HC
    [Figure: two rows of point-set panels, each row comparing Before Registration, CDF-JS, PDF-JS, and CDF-HC results.] 19/29
  • 26. 2D Point-set Registration for CC
    [Figure: seven corpus callosum point sets (Point Set 1 through 7), plus Before Registration and After Registration overlays.] 20/29
  • 27. With outliers
    [Figure: Before Registration, After PDF-JS Registration, and After CDF-HC Registration overlays.] 21/29
  • 28. With different α values
    [Figure: registration results for the initial configuration and for α = 1.1, 1.3, 1.5, 1.7, 1.9, 2, 3, 4, 5.] 22/29
  • 29. 3D Point-set Registration for Duck
    [Figure: four duck point sets, plus Before Registration and After Registration overlays.] 23/29
  • 30. 3D Registration of Hippocampi
    [Figure: four hippocampus point sets, shown before and after registration.] 24/29
  • 31. Group-Wise Registration Assessment
    The Kolmogorov-Smirnov (KS) statistic was computed to measure the difference between the CDFs.
    With ground truth:
        (1/N) Σ_{k=1}^N D(F_g, F_k)
    Without ground truth:
        K = (1/N²) Σ_{k,s=1}^N D(F_k, F_s)  25/29
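The without-ground-truth statistic can be sketched in 1D with empirical CDFs (the aligned versus shifted Gaussian samples are illustrative assumptions); `ks_distance` computes D(F_k, F_s) as the sup-norm gap evaluated on the pooled samples:

```python
import numpy as np

def ks_distance(a, b):
    # D(F_a, F_b) = sup_x |F_a(x) - F_b(x)|, evaluated at the pooled sample points
    grid = np.sort(np.concatenate([a, b]))
    Fa = np.searchsorted(np.sort(a), grid, side='right') / len(a)
    Fb = np.searchsorted(np.sort(b), grid, side='right') / len(b)
    return float(np.abs(Fa - Fb).max())

def groupwise_ks(point_sets):
    # Without ground truth: K = (1/N^2) sum_{k,s} D(F_k, F_s)
    N = len(point_sets)
    return sum(ks_distance(point_sets[k], point_sets[s])
               for k in range(N) for s in range(N)) / N ** 2

rng = np.random.default_rng(2)
aligned = [rng.normal(0.0, 1.0, 300) for _ in range(4)]   # well-registered sets
spread = [rng.normal(2.0 * k, 1.0, 300) for k in range(4)]  # unregistered sets
k_aligned, k_spread = groupwise_ks(aligned), groupwise_ks(spread)
```

K is near zero when all CDFs agree (good registration) and grows toward its maximum of 1 as the sets drift apart, which is how the before/after tables on the next slides should be read.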
  • 32. KS statistic for comparison
    Table: KS statistic
                            CDF-JS    PDF-JS    CDF-HC
        Olympic Logo        0.1103    0.1018    0.0324
        Fish with outliers  0.1314    0.1267    0.0722
    Table: Average nearest neighbor distance
                            CDF-JS    PDF-JS    CDF-HC
        Olympic Logo        0.0367    0.0307    0.0019
        Fish with outliers  0.0970    0.0610    0.0446  26/29
  • 33. KS statistic for comparison (contd.)
    Table: Non-rigid group-wise registration assessment without ground truth using KS statistics
                                      Before Registration    After Registration
        Corpus Callosum               0.3226                 0.0635
        Corpus Callosum with outlier  0.3180                 0.0742
        Olympic Logo                  0.1559                 0.0308
        Fish                          0.1102                 0.0544
        Hippocampus                   0.2620                 0.0770
        Duck                          0.2287                 0.0160  27/29
  • 34. KS statistic for comparison (contd.)
    Table: Non-rigid group-wise registration assessment without ground truth using average nearest neighbor distance
                                      Before Registration    After Registration
        Corpus Callosum               0.0291                 0.0029
        Corpus Callosum with outlier  0.0288                 0.0092
        Olympic Logo                  0.0825                 0.0022
        Fish                          0.1461                 0.0601
        Hippocampus                   13.7679                3.1779
        Duck                          15.4725                0.3280  28/29
  • 35. Discussion
    I-divergences for shape matching avoid the correspondence problem.
    Symmetric, unbiased registration and atlas estimation.
    Shape densities modeled as Gaussian mixtures; cumulatives estimated directly.
    JS (pdf- and cdf-based) and HC divergences used.
    Estimated atlas useful in model-based segmentation. 29/29