Semi-automatic ABC: A Discussion




                                        Christian P. Robert

                                 Université Paris-Dauphine, IuF, & CREST
                                    http://xianblog.wordpress.com


                                   November 2, 2011
                      LaTeX code borrowed from arXiv:1004.1112v2
Approximate Bayesian computation

      Approximate Bayesian computation (recap)

      Summary statistic selection
Regular Bayesian computation issues

      When faced with a non-standard posterior distribution

                                             π(θ|y) ∝ π(θ)L(θ|y)

      the standard solution is to use simulation (Monte Carlo) to
      produce a sample
                                   θ1 , . . . , θ T
      from π(θ|y) (or approximately by Markov chain Monte Carlo
      methods)
                                              [Robert & Casella, 2004]
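
      [A side sketch of mine, not in the slides: a minimal random-walk
      Metropolis-Hastings pass in Python for a toy unnormalised posterior,
      just to fix ideas on what "produce a sample" means here.]

          import numpy as np

          def metropolis_hastings(log_post, theta0, T=10_000, scale=0.5, seed=0):
              # Random-walk Metropolis: propose theta' = theta + scale*N(0,1),
              # accept with probability min(1, pi(theta'|y)/pi(theta|y)).
              rng = np.random.default_rng(seed)
              theta, lp = theta0, log_post(theta0)
              draws = np.empty(T)
              for t in range(T):
                  prop = theta + scale * rng.standard_normal()
                  lp_prop = log_post(prop)
                  if np.log(rng.uniform()) < lp_prop - lp:
                      theta, lp = prop, lp_prop
                  draws[t] = theta
              return draws

          # toy target: N(0,1) prior x N(y=1.2 | theta, 1) likelihood
          draws = metropolis_hastings(lambda th: -0.5*th**2 - 0.5*(1.2 - th)**2, 0.0)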
Intractable likelihoods




      Cases when the likelihood function f(y|θ) is unavailable and when
      the completion step

                              f(y|θ) = ∫_Z f(y, z|θ) dz

      is impossible or too costly because of the dimension of z
       ⇒ MCMC cannot be implemented!
The ABC method

      Bayesian setting: target is π(θ)f(x|θ)
      When likelihood f(x|θ) not in closed form, likelihood-free rejection
      technique:
      ABC algorithm
      For an observation y ∼ f (y|θ), under the prior π(θ), keep jointly
      simulating
                           θ′ ∼ π(θ) ,   z ∼ f(z|θ′) ,
      until the auxiliary variable z is equal to the observed value, z = y.

                                             [Rubin, 1984; Tavaré et al., 1997]
A as approximative

      When y is a continuous random variable, equality z = y is replaced
      with a tolerance condition,

                                     ρ(y, z) ≤ ε

      where ρ is a distance
      Output distributed from

                     π(θ) Pθ{ρ(y, z) < ε} ∝ π(θ | ρ(y, z) < ε)
ABC algorithm

      Algorithm 1 Likelihood-free rejection sampler
        for i = 1 to N do
          repeat
             generate θ′ from the prior distribution π(·)
             generate z from the likelihood f(·|θ′)
          until ρ{η(z), η(y)} ≤ ε
          set θi = θ′
        end for

      where η(y) defines a (generally insufficient) statistic
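
      [A sketch of mine, not from the paper: Algorithm 1 written out in
      Python on a toy normal-mean model; all names below are my own.]

          import numpy as np

          def abc_rejection(y_obs, prior_sim, data_sim, eta, rho, eps, N, seed=0):
              # Algorithm 1: repeat {theta' ~ pi, z ~ f(.|theta')} until
              # rho(eta(z), eta(y)) <= eps, then keep theta_i = theta'.
              rng = np.random.default_rng(seed)
              s_obs, kept = eta(y_obs), []
              while len(kept) < N:
                  theta = prior_sim(rng)
                  z = data_sim(theta, rng)
                  if rho(eta(z), s_obs) <= eps:
                      kept.append(theta)
              return np.array(kept)

          # toy example: theta ~ N(0,10), y | theta ~ N(theta,1)^20, eta = mean
          y_obs = np.random.default_rng(1).normal(1.5, 1.0, 20)
          post = abc_rejection(y_obs,
                               prior_sim=lambda rng: rng.normal(0.0, 10**0.5),
                               data_sim=lambda th, rng: rng.normal(th, 1.0, 20),
                               eta=np.mean,
                               rho=lambda a, b: abs(a - b),
                               eps=0.1, N=500)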
Output

      The likelihood-free algorithm samples from the marginal in z of:

        π_ε(θ, z|y) = π(θ) f(z|θ) I_{A_{ε,y}}(z) / ∫_{A_{ε,y}×Θ} π(θ) f(z|θ) dz dθ ,

      where A_{ε,y} = {z ∈ D | ρ(η(z), η(y)) < ε}.
      The idea behind ABC is that the summary statistics coupled with a
      small tolerance should provide a good approximation of the
      posterior distribution:

                 π_ε(θ|y) = ∫ π_ε(θ, z|y) dz ≈ π(θ|η(y)) .

                                                           [Not guaranteed!]
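
      [My one-line gloss of the caveat, in LaTeX: marginalising the above
      in z gives

        \pi_\epsilon(\theta|y) \propto \pi(\theta)\,
            P_\theta\{\rho(\eta(z),\eta(y)) < \epsilon\}
            \;\xrightarrow[\epsilon \to 0]{}\; \pi(\theta\,|\,\eta(y))\,,

      and π(θ|η(y)) coincides with π(θ|y) only when η is sufficient, hence
      no guarantee in general.]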
Summary statistic selection



      Approximate Bayesian computation (recap)

      Summary statistic selection
         F&P’s setting
         Noisy ABC
         Optimal summary statistic
F&P’s ABC

      Use of a summary statistic S(·), an importance proposal g(·), a
      kernel K(·) ≤ 1 and a bandwidth h > 0 such that

                                 (θ, ysim ) ∼ g(θ)f (ysim |θ)

      is accepted with probability (hence the bound)

                                  K[{S(ysim ) − sobs }/h]

                           [or is it K[{S(ysim ) − sobs }]/h, cf (2)? typo]
      the corresponding importance weight defined by

                                     π(θ)/g(θ)
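
      [A sketch of mine, not F&P's code: one pass of this accept/weight
      scheme in Python, reading K ≤ 1 as a valid acceptance probability;
      all function names are my own placeholders.]

          import numpy as np

          def fp_abc_step(s_obs, prior_logpdf, g_sim, g_logpdf,
                          data_sim, S, K, h, N, seed=0):
              # Draw (theta, y_sim) ~ g(theta) f(y_sim|theta); accept with
              # probability K[{S(y_sim) - s_obs}/h] (valid since K <= 1);
              # weight accepted draws by pi(theta)/g(theta).
              rng = np.random.default_rng(seed)
              thetas, weights = [], []
              for _ in range(N):
                  theta = g_sim(rng)
                  y_sim = data_sim(theta, rng)
                  if rng.uniform() < K((S(y_sim) - s_obs) / h):
                      thetas.append(theta)
                      weights.append(np.exp(prior_logpdf(theta) - g_logpdf(theta)))
              return np.array(thetas), np.array(weights)

          # e.g. a uniform kernel, bounded by 1: K(u) = 1{|u| <= 1}
          K_unif = lambda u: float(np.all(np.abs(u) <= 1.0))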
Errors, errors, and errors


      Three levels of approximation
              π(θ|yobs) by π(θ|sobs): loss of information
              π(θ|sobs) by

                πABC(θ|sobs) = ∫ π(s) K[{s − sobs}/h] π(θ|s) ds / ∫ π(s) K[{s − sobs}/h] ds

              : noisy observations
              πABC(θ|sobs) by importance Monte Carlo based on N
              simulations, with error represented by var(a(θ)|sobs)/Nacc
              [Nacc the expected number of acceptances]
Average acceptance asymptotics


      For the average acceptance probability/approximate likelihood

             p(θ|sobs) = ∫ f(ysim|θ) K[{S(ysim) − sobs}/h] dysim ,

      the overall acceptance probability is

             p(sobs) = ∫ p(θ|sobs) π(θ) dθ = π(sobs) h^d + o(h^d)

                                                                 [Lemma 1]
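
      [A sketch of where the h^d comes from, my gloss: assuming K
      integrates to one, π(s) continuous at sobs, and d = dim(s),

        p(s_{obs}) = \int \pi(s)\,K[\{s - s_{obs}\}/h]\,ds
                   \overset{s = s_{obs} + hu}{=}
                   h^d \int \pi(s_{obs} + hu)\,K(u)\,du
                   = \pi(s_{obs})\,h^d + o(h^d). ]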
Optimal importance proposal




      Best choice of importance proposal in terms of effective sample size:

                       g*(θ|sobs) ∝ π(θ) p(θ|sobs)^{1/2}

                                            [Not particularly useful in practice]
Calibration of h

              “This result gives insight into how S(·) and h affect the
              Monte Carlo error. To minimize Monte Carlo error, we
              need hd to be not too small. Thus ideally we want S(·)
              to be a low dimensional summary of the data that is
              sufficiently informative about θ that π(θ|sobs ) is close, in
              some sense, to π(θ|yobs )” (p.5)

              Constraint on h only addresses one term in the approximation
              error and acceptance probability
              h large prevents π(θ|sobs) from being close to πABC(θ|sobs)
              d small prevents π(θ|sobs) from being close to π(θ|yobs)
Calibrated ABC



      Definition
      For 0 < q < 1 and a subset A, fix [one specific?/all?] event E_q(A)
      with Pr_ABC(θ ∈ E_q(A) | sobs) = q. Then ABC is calibrated if

                           Pr(θ ∈ A | E_q(A)) = q



      Why calibrated and not exact?
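
      [To make the definition concrete, a Monte Carlo diagnostic I would
      run, my illustration with A a central q-credible interval: under
      (θ, y) ∼ π(θ)f(y|θ), a calibrated scheme puts θ inside its ABC
      q-credible interval a fraction q of the time.]

          import numpy as np

          def calibration_check(prior_sim, data_sim, abc_draws, q=0.9, M=200, seed=0):
              # Empirical frequency of {theta in central q-interval of the
              # ABC posterior}; should be close to q if ABC is calibrated.
              rng = np.random.default_rng(seed)
              hits = 0
              for _ in range(M):
                  theta = prior_sim(rng)
                  y = data_sim(theta, rng)
                  d = abc_draws(y, rng)          # ABC posterior draws given y
                  lo, hi = np.quantile(d, [(1 - q) / 2, (1 + q) / 2])
                  hits += (lo <= theta <= hi)
              return hits / M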
Calibrated ABC



      Theorem
      Noisy ABC, where

                       sobs = S(yobs) + hε ,   ε ∼ K(·)

      is calibrated
                                                          [Wilkinson, 2008]
                                   no condition on h
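
      [A one-line sketch of the noisy-ABC twist, mine: jitter the observed
      summary once with kernel noise, then run standard ABC against the
      jittered value.]

          import numpy as np

          def noisy_summary(y_obs, S, h, K_sim, seed=0):
              # Wilkinson-style noisy target: s_obs = S(y_obs) + h*eps, eps ~ K(.)
              rng = np.random.default_rng(seed)
              return S(y_obs) + h * K_sim(rng)

          # e.g. Gaussian kernel noise:
          # s_obs = noisy_summary(y, np.mean, 0.1, lambda rng: rng.standard_normal())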
Calibrated ABC


      Theorem
      For noisy ABC, the expected noisy-ABC log-likelihood,

       E{log[p(θ|sobs)]} = ∫∫ log[p(θ|S(yobs) + hε)] π(yobs|θ0) K(ε) dyobs dε ,

      has its maximum at θ = θ0.

                                          [Last line of proof contains a typo]

      True for any choice of summary statistic?
                                       [Imposes at least identifiability...]
      Relevant in asymptotia, not for the data at hand
Calibrated ABC



      Corollary
      For noisy ABC, the ABC posterior converges onto a point mass on
      the true parameter value as m → ∞.

      For standard ABC, not always the case (unless h goes to zero).
      Strength of regularity conditions (c1) and (c2) in Bernardo
      & Smith, 1994?
                                                 [constraints on posterior]
      Some condition upon summary statistic?
Loss motivated statistic

      Under a quadratic loss function,

      Theorem
        (i) The minimal posterior error E[L(θ, θ̂)|yobs] occurs when
            θ̂ = E(θ|yobs) (!)
       (ii) When h → 0, EABC(θ|sobs) converges to E(θ|yobs)
      (iii) If S(yobs) = E[θ|yobs] then for θ̂ = EABC[θ|sobs]

               E[L(θ, θ̂)|yobs] = trace(AΣ) + h² ∫ xᵀAx K(x) dx + o(h²).

      Measure-theoretic difficulties?
      The dependence of sobs on h makes me uncomfortable
      Relevant for choice of K?
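
      [For the record, my reading of the unstated notation: the quadratic
      loss is presumably

        L(\theta, \hat\theta) = (\theta - \hat\theta)^{\mathrm{T}} A\,
                                (\theta - \hat\theta)

      for a positive semi-definite matrix A, with Σ = var(θ|yobs) the
      posterior covariance, so trace(AΣ) is the irreducible posterior error.]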
Optimal summary statistic

              “We take a different approach, and weaken the
              requirement for πABC to be a good approximation to
              π(θ|yobs ). We argue for πABC to be a good
              approximation solely in terms of the accuracy of certain
              estimates of the parameters.” (p.5)

      From this result, F&P derive their choice of summary statistic,

                                 S(y) = E(θ|y)

                   [almost sufficient: EABC[θ|S(yobs)] = E[θ|yobs]]
      and suggest

                h = O(N^{−1/(2+d)})   and   h = O(N^{−1/(4+d)})

      as optimal bandwidths for noisy and standard ABC.
Caveat



      Since E(θ|yobs) is usually unavailable, F&P suggest
        (i) use a pilot run of ABC to determine a region of non-negligible
            posterior mass;
       (ii) simulate sets of parameter values and data;
      (iii) use the simulated sets of parameter values and data to
            estimate the summary statistic; and
      (iv) run ABC with this choice of summary statistic.
Approximating the summary statistic




      Like Beaumont et al. (2002) and Blum and François (2010), F&P
      use a linear regression to approximate E(θ|yobs):

                        θi = β0^{(i)} + β^{(i)} f(yobs) + εi
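
      [A minimal least-squares sketch of steps (ii)-(iii), my code:
      `features` stands for the f(y) vector of data transforms, a name of
      my own making.]

          import numpy as np

          def fit_semiauto_summary(thetas, features):
              # Fit theta^(i) ~ beta0^(i) + beta^(i) f(y); the fitted
              # regression function serves as summary S(y) ~ E(theta|y).
              X = np.column_stack([np.ones(len(features)), features])
              beta, *_ = np.linalg.lstsq(X, thetas, rcond=None)
              return lambda f_y: beta[0] + np.asarray(f_y) @ beta[1:]

          # thetas: (n,) pilot-region draws; features: (n, p) matrix of f(y_sim)
          # S = fit_semiauto_summary(thetas, features); then s_obs = S(f(y_obs))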
Applications



      The paper’s second half covers:
              g-and-k distribution
              stochastic kinetic biochemical networks
              Lotka-Volterra model
              Ricker map ecological model
              M/G/1 queue
              tuberculosis bacteria genotype data
Questions

              dependence on h and S(·) in the early stage
              reduction of Bayesian inference to point estimation
              approximation error in step (iii) not accounted for
              not parameterisation invariant
              practice shows that proper approximation to genuine posterior
              distributions stems from using a (much) larger number of
              summary statistics than the dimension of the parameter
              the validity of the approximation to the optimal summary
              statistic depends on the quality of the pilot run
              important inferential issues like model choice are not covered
              by this approach
