Faster Hamiltonian Monte Carlo by Learning
Leapfrog Scale
Changye Wu, Julien Stoehr and Christian P. Robert.
Université Paris-Dauphine, Université PSL, CNRS, CEREMADE, 75016 PARIS, FRANCE
Abstract
Hamiltonian Monte Carlo samplers have become standard algorithms for MCMC implementations, as opposed to more basic versions, but they still require some amount of tuning and calibration. Exploiting the U-turn criterion of the NUTS algorithm [2], we propose a version of HMC that relies on the distribution of the integration time of the associated leapfrog integrator. Using in addition the primal-dual averaging method for tuning the step size of the integrator, we achieve an essentially calibration-free version of HMC. When compared with the original NUTS on benchmarks, this algorithm exhibits significantly improved efficiency.
Hamiltonian Monte Carlo (HMC, [3])
Consider a density π on Θ ⊂ ℝᵈ with respect to the Lebesgue measure,
$$\pi(\theta) \propto \exp\{-U(\theta)\}, \quad \text{where } U \in \mathcal{C}^1(\Theta).$$
Aim: generate a Markov chain (θ₁, …, θ_N) with invariant distribution π to estimate, for some function h, functionals with respect to π,
$$\frac{1}{N}\sum_{n=1}^{N} h(\theta_n) \xrightarrow[N\to+\infty]{\text{a.s.}} \int_\Theta h(\theta)\,\pi(\mathrm{d}\theta).$$
Principle: sample from an augmented target distribution
$$\pi(\theta, v) = \pi(\theta)\,\mathcal{N}(v \mid 0, M) \propto \exp\{-H(\theta, v)\}.$$
• auxiliary variable v ∈ ℝᵈ referred to as the momentum variable, as opposed to θ, referred to as the position,
• the marginal chain in θ has the distribution of interest.
Hamiltonian dynamics: generating proposals for (θ, v) based on
$$\frac{\mathrm{d}\theta}{\mathrm{d}t} = \frac{\partial H}{\partial v} = M^{-1}v, \qquad \frac{\mathrm{d}v}{\mathrm{d}t} = -\frac{\partial H}{\partial \theta} = -\nabla U(\theta).$$
⊕ leaves π invariant, allows large moves,
⊖ requires the solution flow to the differential equations.
Leapfrog integrator: second-order symplectic integrator which yields an approximate solution flow by iterating the following procedure from (θ₀, v₀) = (θ, v):
$$r = v_n - \frac{\epsilon}{2}\nabla U(\theta_n), \qquad \theta_{n+1} = \theta_n + \epsilon M^{-1} r, \qquad v_{n+1} = r - \frac{\epsilon}{2}\nabla U(\theta_{n+1}).$$
• ε: a discretisation time-step.
• L: a number of leapfrog steps, yielding the approximate solution at time t = Lε.
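The leapfrog recursion above can be sketched in a few lines of Python; the gradient function `grad_U` and the use of an identity-like mass matrix are illustrative assumptions, not part of the poster:

```python
import numpy as np

def leapfrog(theta, v, grad_U, eps, L, M_inv=None):
    """Iterate L leapfrog steps of size eps from (theta, v).

    grad_U: gradient of the potential U (illustrative assumption).
    M_inv:  inverse mass matrix; identity if None.
    """
    if M_inv is None:
        M_inv = np.eye(len(theta))
    theta, v = theta.copy(), v.copy()
    for _ in range(L):
        r = v - 0.5 * eps * grad_U(theta)   # half-step on the momentum
        theta = theta + eps * M_inv @ r     # full step on the position
        v = r - 0.5 * eps * grad_U(theta)   # second half-step on the momentum
    return theta, v

# Example: standard Gaussian target, U(theta) = theta^2 / 2, so grad U = theta.
grad_U = lambda th: th
theta, v = leapfrog(np.array([1.0]), np.array([0.5]), grad_U, eps=0.1, L=10)
```

Because the integrator is symplectic, the Hamiltonian H(θ, v) is approximately conserved along the computed trajectory (up to O(ε²)).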
This scheme no longer leaves the measure π invariant!
Correction: an accept-reject step is introduced. A transition from (θ, v) to the proposal (θ′, −v′) is accepted with probability
$$\rho(\theta, v, \theta', v') = 1 \wedge \exp\{H(\theta, v) - H(\theta', -v')\}.$$
Pros & cons:
⊕ The algorithm theoretically benefits from a fast exploration of the parameter space by accepting large transitions with high probability.
⊖ High sensitivity to hand-tuned parameters, namely the step size ε of the discretisation scheme, the number of steps L of the integrator, and the covariance matrix M.
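A full HMC transition, combining the momentum refreshment, the leapfrog proposal and the accept-reject correction, might look as follows; the Gaussian target, the unit mass matrix and the function names are illustrative assumptions:

```python
import numpy as np

def hmc_step(theta, U, grad_U, eps, L, rng):
    """One HMC transition: refresh momentum, integrate, accept or reject."""
    v = rng.standard_normal(theta.shape)          # v ~ N(0, I), i.e. M = I
    H0 = U(theta) + 0.5 * v @ v
    th, r = theta.copy(), v.copy()
    for _ in range(L):                            # leapfrog integration
        r = r - 0.5 * eps * grad_U(th)
        th = th + eps * r
        r = r - 0.5 * eps * grad_U(th)
    H1 = U(th) + 0.5 * r @ r                      # flipping r leaves H unchanged
    if rng.random() < min(1.0, np.exp(H0 - H1)):  # rho = 1 ^ exp{H0 - H1}
        return th
    return theta

# Toy run on a 2-d standard Gaussian, U(theta) = |theta|^2 / 2.
rng = np.random.default_rng(0)
theta, samples = np.zeros(2), []
for _ in range(2000):
    theta = hmc_step(theta, lambda t: 0.5 * t @ t, lambda t: t, 0.2, 10, rng)
    samples.append(theta)
samples = np.array(samples)
```

With a well-chosen ε the energy error H1 − H0 stays small, so most of these large moves are accepted, which is exactly the benefit listed above.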
The No-U-Turn Sampler (NUTS, [2])
Idea: a version of the HMC sampler that eliminates the need to specify the number L by adaptively choosing the locally largest value at each iteration of the algorithm.
How? By doubling the leapfrog path, either forward or backward with equal probability, until the backward and forward end points of the path, (θ⁻, v⁻) and (θ⁺, v⁺), satisfy
$$(\theta^+ - \theta^-) \cdot M^{-1} v^- < 0 \quad \text{or} \quad (\theta^+ - \theta^-) \cdot M^{-1} v^+ < 0.$$
Proposal: sampling along the generated trajectory,
• slice sampling [2],
• multinomial sampling ([1], version implemented in Stan).
What about ε? Tuned via primal-dual averaging [4], by aiming at a targeted acceptance probability δ₀ ∈ (0, 1).
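The U-turn stopping rule can be expressed as a small predicate on the two endpoints of the doubled path; the identity mass matrix default is an illustrative assumption:

```python
import numpy as np

def u_turn(theta_minus, theta_plus, v_minus, v_plus, M_inv=None):
    """NUTS stopping criterion: stop doubling when either endpoint's momentum
    points back towards the other endpoint of the trajectory."""
    if M_inv is None:
        M_inv = np.eye(len(theta_minus))
    d = theta_plus - theta_minus
    return bool(d @ (M_inv @ v_minus) < 0 or d @ (M_inv @ v_plus) < 0)

# Both momenta still aligned with theta_plus - theta_minus: keep doubling.
expanding = u_turn(np.array([0.0]), np.array([1.0]),
                   np.array([1.0]), np.array([1.0]))
# Forward endpoint has turned back towards the start: criterion triggers.
turned = u_turn(np.array([0.0]), np.array([1.0]),
                np.array([1.0]), np.array([-1.0]))
```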
Numerical experiment: Susceptible-Infected-Recovered Model (SIR)
SIR: model used to represent disease transmission, for epidemics like cholera, within a population,
$$\eta_k \sim \mathrm{Poisson}(y_{t_{k-1},1} - y_{t_k,1}) \quad \text{and} \quad \hat B_k \sim \mathcal{LN}(y_{t_k,4},\, 0.15^2), \qquad k = 1, \dots, N_t = 20, \quad t_k = 7k.$$
The dynamics of y_t ∈ ℝ⁴ are
$$\frac{\mathrm{d}y_{t,1}}{\mathrm{d}t} = -\frac{\beta y_{t,4}}{y_{t,4} + \kappa_0}\, y_{t,1}, \qquad \frac{\mathrm{d}y_{t,2}}{\mathrm{d}t} = \frac{\beta y_{t,4}}{y_{t,4} + \kappa_0}\, y_{t,1} - \gamma y_{t,2},$$
$$\frac{\mathrm{d}y_{t,3}}{\mathrm{d}t} = \gamma y_{t,2}, \qquad \frac{\mathrm{d}y_{t,4}}{\mathrm{d}t} = \xi y_{t,2} - \phi y_{t,4}.$$
Interpretation:
• y_{t,1}, y_{t,2}, y_{t,3}: no. of susceptible, infected, and recovered people within the community.
• y_{t,4}: concentration of the virus in the water reservoir.
• η_k: size of the pop. that becomes infected during [t_{k−1}, t_k].
Assumptions:
• the size of the population y_{t,1} + y_{t,2} + y_{t,3} is constant,
• β ∼ C(0, 2.5), γ ∼ C(0, 1), ξ ∼ C(0, 25), φ ∼ C(0, 1).
Observed dataset: https://github.com/stan-dev/stat_comp_benchmarks/tree/master/benchmarks/sir.
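The SIR dynamics above can be integrated numerically with a standard scheme; the parameter values, initial state and classic RK4 integrator below are illustrative assumptions for a quick sanity check, not the fitted posterior:

```python
import numpy as np

def sir_rhs(y, beta, gamma, xi, phi, kappa0):
    """Right-hand side of the SIR ODE system from the poster."""
    s, i, r, b = y                       # susceptible, infected, recovered, virus conc.
    force = beta * b / (b + kappa0) * s  # infection rate driven by the water reservoir
    return np.array([-force, force - gamma * i, gamma * i, xi * i - phi * b])

def rk4(y0, rhs, dt, n_steps, *params):
    """Classic fourth-order Runge-Kutta integration (illustrative helper)."""
    y = np.array(y0, dtype=float)
    traj = [y]
    for _ in range(n_steps):
        k1 = rhs(y, *params)
        k2 = rhs(y + 0.5 * dt * k1, *params)
        k3 = rhs(y + 0.5 * dt * k2, *params)
        k4 = rhs(y + dt * k3, *params)
        y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(y)
    return np.array(traj)

# Illustrative parameters and initial state; integrate over 140 days (20 weeks).
traj = rk4([10000.0, 10.0, 0.0, 0.1], sir_rhs, 0.1, 1400,
           0.5, 0.25, 10.0, 0.5, 100.0)
```

Since d(y₁ + y₂ + y₃)/dt = 0 in the model, the total population is a linear invariant and is preserved by the integrator, matching the constant-population assumption above.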
[Figure: ESJD per gradient against the targeted acceptance probability (0.6–0.9), for NUTS and eHMC.]
[Figure: ESS per gradient against the targeted acceptance probability δ (0.6–0.9), for NUTS and eHMC, for the parameters β, γ, ξ and φ.]
Empirical HMC (eHMC, [5])
Longest batch associated with (θ, v, ε):
$$L_\epsilon(\theta, v) = \inf\left\{\ell \in \mathbb{N} \,:\, (\theta^{(\ell)} - \theta) \cdot M^{-1} v^{(\ell)} < 0\right\},$$
where (θ^{(ℓ)}, v^{(ℓ)}) is the value of the pair after ℓ iterations of the leapfrog integrator.
Learning the leapfrog scale: tuning phase with the optimised step size ε and an initial number of leapfrog steps L₀. At each iteration, one
1. iterates L₀ leapfrog steps to generate the next state of the Markov chain,
2. computes the longest batch for the current state of the chain.
Output: the empirical distribution of the longest batches
$$\hat P_L = \frac{1}{K} \sum_{k=0}^{K-1} \delta_{L_\epsilon(\theta_k,\, v^{(k+1)})}.$$
eHMC: at each iteration of the HMC algorithm, randomly pick a number of leapfrog steps according to the empirical distribution P̂_L.
⇒ valid: the resulting transition kernel can be seen as a composition of multiple Markov kernels attached to the same stationary distribution π.
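Once P̂_L has been learned, the eHMC kernel is a thin wrapper around a standard HMC transition; the toy `hmc_step` below (standard Gaussian target, unit mass matrix) and the hard-coded batch list are illustrative assumptions standing in for the tuning-phase output:

```python
import numpy as np

def ehmc(theta0, hmc_step, eps, longest_batches, n_iter, rng):
    """eHMC: at each iteration draw the number of leapfrog steps from the
    empirical distribution of longest batches learned during tuning."""
    chain = [np.asarray(theta0, dtype=float)]
    for _ in range(n_iter):
        L = int(rng.choice(longest_batches))    # L ~ \hat P_L
        chain.append(hmc_step(chain[-1], eps, L, rng))
    return np.array(chain)

def hmc_step(theta, eps, L, rng):
    """Toy HMC transition for a standard Gaussian target (illustrative)."""
    v = rng.standard_normal(theta.shape)
    th, r = theta.copy(), v.copy()
    for _ in range(L):                          # leapfrog with grad U = theta
        r = r - 0.5 * eps * th
        th = th + eps * r
        r = r - 0.5 * eps * th
    dH = 0.5 * (th @ th + r @ r - theta @ theta - v @ v)
    return th if rng.random() < np.exp(-dH) else theta

rng = np.random.default_rng(0)
batches = [5, 8, 13, 21]                        # stand-in for the learned \hat P_L
chain = ehmc(np.zeros(2), hmc_step, 0.2, batches, 500, rng)
```

Each fixed value of L defines a π-invariant HMC kernel, so mixing over L according to P̂_L is the composition of π-invariant kernels invoked in the validity argument above.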
References
[1] M. Betancourt. A conceptual introduction to Hamiltonian Monte Carlo. arXiv preprint arXiv:1701.02434, 2017.
[2] M. D. Hoffman and A. Gelman. The No-U-Turn Sampler: adaptively setting path lengths in Hamiltonian Monte Carlo. Journal of Machine Learning Research, 15(1):1593–1623, 2014.
[3] R. M. Neal. MCMC using Hamiltonian dynamics. Handbook of Markov Chain Monte Carlo, Chapter 5, Chapman & Hall / CRC Press, 2011.
[4] Y. Nesterov. Primal-dual subgradient methods for convex problems. Mathematical Programming, 120(1):221–259, 2009.
[5] C. Wu, J. Stoehr and C. P. Robert. Faster Hamiltonian Monte Carlo by Learning Leapfrog Scale. arXiv preprint arXiv:1810.04449v1, 2018.

Mais conteúdo relacionado

Mais procurados

Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010
Christian Robert
 
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithmsRao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Christian Robert
 
Approximate Bayesian Computation with Quasi-Likelihoods
Approximate Bayesian Computation with Quasi-LikelihoodsApproximate Bayesian Computation with Quasi-Likelihoods
Approximate Bayesian Computation with Quasi-Likelihoods
Stefano Cabras
 
Bayesian hybrid variable selection under generalized linear models
Bayesian hybrid variable selection under generalized linear modelsBayesian hybrid variable selection under generalized linear models
Bayesian hybrid variable selection under generalized linear models
Caleb (Shiqiang) Jin
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
Christian Robert
 
Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]
Christian Robert
 
Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...
Valentin De Bortoli
 
ABC convergence under well- and mis-specified models
ABC convergence under well- and mis-specified modelsABC convergence under well- and mis-specified models
ABC convergence under well- and mis-specified models
Christian Robert
 
Delayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithmsDelayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithms
Christian Robert
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
Christian Robert
 
Unbiased Bayes for Big Data
Unbiased Bayes for Big DataUnbiased Bayes for Big Data
Unbiased Bayes for Big Data
Christian Robert
 
Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017
Christian Robert
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergence
Christian Robert
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?
Christian Robert
 
RSS discussion of Girolami and Calderhead, October 13, 2010
RSS discussion of Girolami and Calderhead, October 13, 2010RSS discussion of Girolami and Calderhead, October 13, 2010
RSS discussion of Girolami and Calderhead, October 13, 2010
Christian Robert
 
ABC with Wasserstein distances
ABC with Wasserstein distancesABC with Wasserstein distances
ABC with Wasserstein distances
Christian Robert
 
Nested sampling
Nested samplingNested sampling
Nested sampling
Christian Robert
 
accurate ABC Oliver Ratmann
accurate ABC Oliver Ratmannaccurate ABC Oliver Ratmann
accurate ABC Oliver Ratmann
olli0601
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methods
Christian Robert
 
asymptotics of ABC
asymptotics of ABCasymptotics of ABC
asymptotics of ABC
Christian Robert
 

Mais procurados (20)

Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010
 
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithmsRao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
 
Approximate Bayesian Computation with Quasi-Likelihoods
Approximate Bayesian Computation with Quasi-LikelihoodsApproximate Bayesian Computation with Quasi-Likelihoods
Approximate Bayesian Computation with Quasi-Likelihoods
 
Bayesian hybrid variable selection under generalized linear models
Bayesian hybrid variable selection under generalized linear modelsBayesian hybrid variable selection under generalized linear models
Bayesian hybrid variable selection under generalized linear models
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]
 
Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...
 
ABC convergence under well- and mis-specified models
ABC convergence under well- and mis-specified modelsABC convergence under well- and mis-specified models
ABC convergence under well- and mis-specified models
 
Delayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithmsDelayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithms
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
 
Unbiased Bayes for Big Data
Unbiased Bayes for Big DataUnbiased Bayes for Big Data
Unbiased Bayes for Big Data
 
Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergence
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?
 
RSS discussion of Girolami and Calderhead, October 13, 2010
RSS discussion of Girolami and Calderhead, October 13, 2010RSS discussion of Girolami and Calderhead, October 13, 2010
RSS discussion of Girolami and Calderhead, October 13, 2010
 
ABC with Wasserstein distances
ABC with Wasserstein distancesABC with Wasserstein distances
ABC with Wasserstein distances
 
Nested sampling
Nested samplingNested sampling
Nested sampling
 
accurate ABC Oliver Ratmann
accurate ABC Oliver Ratmannaccurate ABC Oliver Ratmann
accurate ABC Oliver Ratmann
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methods
 
asymptotics of ABC
asymptotics of ABCasymptotics of ABC
asymptotics of ABC
 

Semelhante a Poster for Bayesian Statistics in the Big Data Era conference

Project Paper
Project PaperProject Paper
Project Paper
Brian Whetter
 
fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920
fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920
fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920
Karl Rudeen
 
The tau-leap method for simulating stochastic kinetic models
The tau-leap method for simulating stochastic kinetic modelsThe tau-leap method for simulating stochastic kinetic models
The tau-leap method for simulating stochastic kinetic models
Colin Gillespie
 
Bayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic ModelsBayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic Models
Colin Gillespie
 
DSP_FOEHU - MATLAB 02 - The Discrete-time Fourier Analysis
DSP_FOEHU - MATLAB 02 - The Discrete-time Fourier AnalysisDSP_FOEHU - MATLAB 02 - The Discrete-time Fourier Analysis
DSP_FOEHU - MATLAB 02 - The Discrete-time Fourier Analysis
Amr E. Mohamed
 
intro
introintro
Laplace transform
Laplace transformLaplace transform
Laplace transform
Rodrigo Adasme Aguilera
 
Uncertain_Henry_problem-poster.pdf
Uncertain_Henry_problem-poster.pdfUncertain_Henry_problem-poster.pdf
Uncertain_Henry_problem-poster.pdf
Alexander Litvinenko
 
The moving bottleneck problem: a Hamilton-Jacobi approach
The moving bottleneck problem: a Hamilton-Jacobi approachThe moving bottleneck problem: a Hamilton-Jacobi approach
The moving bottleneck problem: a Hamilton-Jacobi approach
Guillaume Costeseque
 
pRO
pROpRO
14th Athens Colloquium on Algorithms and Complexity (ACAC19)
14th Athens Colloquium on Algorithms and Complexity (ACAC19)14th Athens Colloquium on Algorithms and Complexity (ACAC19)
14th Athens Colloquium on Algorithms and Complexity (ACAC19)
Apostolos Chalkis
 
litvinenko_Henry_Intrusion_Hong-Kong_2024.pdf
litvinenko_Henry_Intrusion_Hong-Kong_2024.pdflitvinenko_Henry_Intrusion_Hong-Kong_2024.pdf
litvinenko_Henry_Intrusion_Hong-Kong_2024.pdf
Alexander Litvinenko
 
litvinenko_Intrusion_Bari_2023.pdf
litvinenko_Intrusion_Bari_2023.pdflitvinenko_Intrusion_Bari_2023.pdf
litvinenko_Intrusion_Bari_2023.pdf
Alexander Litvinenko
 
Bayesian phylogenetic inference_big4_ws_2016-10-10
Bayesian phylogenetic inference_big4_ws_2016-10-10Bayesian phylogenetic inference_big4_ws_2016-10-10
Bayesian phylogenetic inference_big4_ws_2016-10-10
FredrikRonquist
 
HMC and NUTS
HMC and NUTSHMC and NUTS
HMC and NUTS
Marco Banterle
 
poster2
poster2poster2
poster2
Ryan Grove
 
litvinenko_Gamm2023.pdf
litvinenko_Gamm2023.pdflitvinenko_Gamm2023.pdf
litvinenko_Gamm2023.pdf
Alexander Litvinenko
 
DSP_FOEHU - MATLAB 01 - Discrete Time Signals and Systems
DSP_FOEHU - MATLAB 01 - Discrete Time Signals and SystemsDSP_FOEHU - MATLAB 01 - Discrete Time Signals and Systems
DSP_FOEHU - MATLAB 01 - Discrete Time Signals and Systems
Amr E. Mohamed
 
Research internship on optimal stochastic theory with financial application u...
Research internship on optimal stochastic theory with financial application u...Research internship on optimal stochastic theory with financial application u...
Research internship on optimal stochastic theory with financial application u...
Asma Ben Slimene
 
Presentation on stochastic control problem with financial applications (Merto...
Presentation on stochastic control problem with financial applications (Merto...Presentation on stochastic control problem with financial applications (Merto...
Presentation on stochastic control problem with financial applications (Merto...
Asma Ben Slimene
 

Semelhante a Poster for Bayesian Statistics in the Big Data Era conference (20)

Project Paper
Project PaperProject Paper
Project Paper
 
fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920
fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920
fb69b412-97cb-4e8d-8a28-574c09557d35-160618025920
 
The tau-leap method for simulating stochastic kinetic models
The tau-leap method for simulating stochastic kinetic modelsThe tau-leap method for simulating stochastic kinetic models
The tau-leap method for simulating stochastic kinetic models
 
Bayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic ModelsBayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic Models
 
DSP_FOEHU - MATLAB 02 - The Discrete-time Fourier Analysis
DSP_FOEHU - MATLAB 02 - The Discrete-time Fourier AnalysisDSP_FOEHU - MATLAB 02 - The Discrete-time Fourier Analysis
DSP_FOEHU - MATLAB 02 - The Discrete-time Fourier Analysis
 
intro
introintro
intro
 
Laplace transform
Laplace transformLaplace transform
Laplace transform
 
Uncertain_Henry_problem-poster.pdf
Uncertain_Henry_problem-poster.pdfUncertain_Henry_problem-poster.pdf
Uncertain_Henry_problem-poster.pdf
 
The moving bottleneck problem: a Hamilton-Jacobi approach
The moving bottleneck problem: a Hamilton-Jacobi approachThe moving bottleneck problem: a Hamilton-Jacobi approach
The moving bottleneck problem: a Hamilton-Jacobi approach
 
pRO
pROpRO
pRO
 
14th Athens Colloquium on Algorithms and Complexity (ACAC19)
14th Athens Colloquium on Algorithms and Complexity (ACAC19)14th Athens Colloquium on Algorithms and Complexity (ACAC19)
14th Athens Colloquium on Algorithms and Complexity (ACAC19)
 
litvinenko_Henry_Intrusion_Hong-Kong_2024.pdf
litvinenko_Henry_Intrusion_Hong-Kong_2024.pdflitvinenko_Henry_Intrusion_Hong-Kong_2024.pdf
litvinenko_Henry_Intrusion_Hong-Kong_2024.pdf
 
litvinenko_Intrusion_Bari_2023.pdf
litvinenko_Intrusion_Bari_2023.pdflitvinenko_Intrusion_Bari_2023.pdf
litvinenko_Intrusion_Bari_2023.pdf
 
Bayesian phylogenetic inference_big4_ws_2016-10-10
Bayesian phylogenetic inference_big4_ws_2016-10-10Bayesian phylogenetic inference_big4_ws_2016-10-10
Bayesian phylogenetic inference_big4_ws_2016-10-10
 
HMC and NUTS
HMC and NUTSHMC and NUTS
HMC and NUTS
 
poster2
poster2poster2
poster2
 
litvinenko_Gamm2023.pdf
litvinenko_Gamm2023.pdflitvinenko_Gamm2023.pdf
litvinenko_Gamm2023.pdf
 
DSP_FOEHU - MATLAB 01 - Discrete Time Signals and Systems
DSP_FOEHU - MATLAB 01 - Discrete Time Signals and SystemsDSP_FOEHU - MATLAB 01 - Discrete Time Signals and Systems
DSP_FOEHU - MATLAB 01 - Discrete Time Signals and Systems
 
Research internship on optimal stochastic theory with financial application u...
Research internship on optimal stochastic theory with financial application u...Research internship on optimal stochastic theory with financial application u...
Research internship on optimal stochastic theory with financial application u...
 
Presentation on stochastic control problem with financial applications (Merto...
Presentation on stochastic control problem with financial applications (Merto...Presentation on stochastic control problem with financial applications (Merto...
Presentation on stochastic control problem with financial applications (Merto...
 

Mais de Christian Robert

Adaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte CarloAdaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte Carlo
Christian Robert
 
Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de France
Christian Robert
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael Martin
Christian Robert
 
discussion of ICML23.pdf
discussion of ICML23.pdfdiscussion of ICML23.pdf
discussion of ICML23.pdf
Christian Robert
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?
Christian Robert
 
restore.pdf
restore.pdfrestore.pdf
restore.pdf
Christian Robert
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13
Christian Robert
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?
Christian Robert
 
CDT 22 slides.pdf
CDT 22 slides.pdfCDT 22 slides.pdf
CDT 22 slides.pdf
Christian Robert
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking components
Christian Robert
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihood
Christian Robert
 
eugenics and statistics
eugenics and statisticseugenics and statistics
eugenics and statistics
Christian Robert
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1
Christian Robert
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
Christian Robert
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussion
Christian Robert
 
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsa discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
Christian Robert
 
prior selection for mixture estimation
prior selection for mixture estimationprior selection for mixture estimation
prior selection for mixture estimation
Christian Robert
 
better together? statistical learning in models made of modules
better together? statistical learning in models made of modulesbetter together? statistical learning in models made of modules
better together? statistical learning in models made of modules
Christian Robert
 

Mais de Christian Robert (18)

Adaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte CarloAdaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte Carlo
 
Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de France
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael Martin
 
discussion of ICML23.pdf
discussion of ICML23.pdfdiscussion of ICML23.pdf
discussion of ICML23.pdf
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?
 
restore.pdf
restore.pdfrestore.pdf
restore.pdf
 
Poster for Bayesian Statistics in the Big Data Era conference

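The leapfrog integrator and the accept-reject correction described above can be sketched in code — a minimal sketch, assuming a standard-normal target with identity mass matrix M; the helper names (`grad_U`, `hmc_step`) are illustrative, not from the poster:

```python
import numpy as np

def grad_U(theta):
    # Gradient of U for a standard-normal target, U(theta) = |theta|^2 / 2.
    return theta

def leapfrog(theta, v, eps, L):
    # L leapfrog steps with identity mass matrix M.
    theta, v = theta.copy(), v.copy()
    for _ in range(L):
        r = v - eps / 2 * grad_U(theta)      # half step on momentum
        theta = theta + eps * r              # full step on position
        v = r - eps / 2 * grad_U(theta)      # half step on momentum
    return theta, v

def hmc_step(theta, eps, L, rng):
    # One HMC transition: refresh momentum, integrate, accept or reject.
    v = rng.standard_normal(theta.shape)
    theta_p, v_p = leapfrog(theta, v, eps, L)
    H = lambda t, m: t @ t / 2 + m @ m / 2   # H(theta, v) = U(theta) + |v|^2 / 2
    log_rho = H(theta, v) - H(theta_p, -v_p) # log of the acceptance ratio
    if np.log(rng.uniform()) < min(0.0, log_rho):
        return theta_p
    return theta

rng = np.random.default_rng(0)
theta = np.zeros(2)
draws = []
for _ in range(2000):
    theta = hmc_step(theta, eps=0.2, L=10, rng=rng)
    draws.append(theta.copy())
```

With a well-chosen step size the chain explores the standard normal efficiently; the sensitivity to eps and L noted below is exactly what NUTS and eHMC address.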
• L: a number of leapfrog steps, yielding an approximate solution at time t = Lε.

This scheme no longer leaves the measure π invariant!

Correction: an accept-reject step is introduced. A transition from (θ, v) to the proposal (θ′, −v′) is accepted with probability
ρ(θ, v, θ′, v′) = 1 ∧ exp{H(θ, v) − H(θ′, −v′)}.

Pros & cons:
⊕ The algorithm theoretically benefits from a fast exploration of the parameter space by accepting large transitions with high probability.
⊖ High sensitivity to hand-tuned parameters, namely the step size ε of the discretisation scheme, the number of steps L of the integrator, and the covariance matrix M.

The No-U-Turn Sampler (NUTS, [2])

Idea: a version of the HMC sampler that eliminates the need to specify the number L by adaptively choosing the locally largest value at each iteration of the algorithm.

How? By doubling the leapfrog path, either forward or backward with equal probability, until the backward and forward end points of the path, (θ⁻, v⁻) and (θ⁺, v⁺), satisfy
(θ⁺ − θ⁻) · M⁻¹v⁻ < 0 or (θ⁺ − θ⁻) · M⁻¹v⁺ < 0.

Proposal: sampling along the generated trajectory, by
• slice sampling [2], or
• multinomial sampling ([1], version implemented in Stan).

What about ε? Tuned via primal-dual averaging [4], by aiming at a targeted acceptance probability δ₀ ∈ (0, 1).

Numerical experiment: Susceptible-Infected-Recovered model (SIR)

SIR: model used to represent disease transmission within a population, for epidemics such as cholera:
ηₖ ∼ Poisson(y_{tₖ₋₁,1} − y_{tₖ,1}) and B̂ₖ ∼ LN(y_{tₖ,4}, 0.15²),
where k = 1, …, Nₜ = 20 and tₖ = 7k. The dynamic of yₜ ∈ R⁴ is
dy_{t,1}/dt = −β y_{t,4}/(y_{t,4} + κ₀) y_{t,1},
dy_{t,2}/dt = β y_{t,4}/(y_{t,4} + κ₀) y_{t,1} − γ y_{t,2},
dy_{t,3}/dt = γ y_{t,2},
dy_{t,4}/dt = ξ y_{t,2} − φ y_{t,4}.

Interpretation:
• y_{t,1}, y_{t,2}, y_{t,3}: numbers of susceptible, infected, and recovered people within the community.
• y_{t,4}: concentration of the virus in the water reservoir.
• ηₖ: size of the population that becomes infected during [tₖ₋₁, tₖ].
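As an illustration, the SIR-type dynamics above can be integrated with a simple fixed-step Euler scheme — a sketch only; the parameter values, initial condition, and κ₀ below are illustrative placeholders, not those of the benchmark:

```python
import numpy as np

def sir_rhs(y, beta, gamma, xi, phi, kappa0):
    # Right-hand side of the ODE system: y = (S, I, R, virus concentration B).
    S, I, R, B = y
    force = beta * B / (B + kappa0) * S      # infection term beta * B/(B+kappa0) * S
    return np.array([-force,                 # dS/dt
                     force - gamma * I,      # dI/dt
                     gamma * I,              # dR/dt
                     xi * I - phi * B])      # dB/dt

def integrate(y0, T, dt, **params):
    # Forward-Euler integration on [0, T] with step dt.
    y = np.array(y0, dtype=float)
    for _ in range(int(T / dt)):
        y = y + dt * sir_rhs(y, **params)
    return y

# Illustrative run over 20 weeks (t_k = 7k, k = 1..20).
y = integrate([10000.0, 10.0, 0.0, 0.0], T=140.0, dt=0.01,
              beta=1.0, gamma=0.5, xi=0.1, phi=1.0, kappa0=1e5)
```

Note that the first three components of the right-hand side sum to zero, so the scheme preserves the population size S + I + R, matching the assumption stated below.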
Assumptions:
• the size of the population y_{t,1} + y_{t,2} + y_{t,3} is constant,
• β ∼ C(0, 2.5), γ ∼ C(0, 1), ξ ∼ C(0, 25), φ ∼ C(0, 1).

Observed dataset: https://github.com/stan-dev/stat_comp_benchmarks/tree/master/benchmarks/sir

[Figure: ESJD per gradient against the targeted acceptance probability δ, comparing NUTS and eHMC.]
[Figure: ESS per gradient against the targeted acceptance probability δ for the parameters ξ, φ, β and γ, comparing NUTS and eHMC.]

Empirical HMC (eHMC, [5])

Longest batch associated with (θ, v, ε):
L_ε(θ, v) = inf{ℓ ∈ N : (θ^ℓ − θ) · M⁻¹v^ℓ < 0},
where (θ^ℓ, v^ℓ) is the value of the pair after ℓ iterations of the leapfrog integrator.

Learning leapfrog scale: tuning phase run with the optimised step size ε and an initial number of leapfrog steps L₀. At each iteration, one
1. iterates L₀ leapfrog steps to generate the next state of the Markov chain,
2. computes the longest batch for the current state of the chain.
Output: empirical distribution of the longest batches,
P̂_L = (1/K) Σ_{k=0}^{K−1} δ{L_ε(θₖ, v^{(k+1)})}.

eHMC: randomly pick a number of leapfrog steps according to the empirical distribution P̂_L at each iteration of the HMC algorithm.
⇒ valid: the resulting transition kernel can be seen as a composition of multiple Markov kernels attached to the same stationary distribution π.

References

[1] M. Betancourt. A conceptual introduction to Hamiltonian Monte Carlo. arXiv preprint arXiv:1701.02434, 2017.
[2] M. D. Hoffman and A. Gelman. The No-U-Turn Sampler: adaptively setting path lengths in Hamiltonian Monte Carlo. Journal of Machine Learning Research, 15(1):1593–1623, 2014.
[3] R. M. Neal. MCMC using Hamiltonian dynamics. Handbook of Markov Chain Monte Carlo, Chapter 5, Chapman & Hall/CRC Press, 2011.
[4] Y. Nesterov. Primal-dual subgradient methods for convex problems. Mathematical Programming, 120(1):221–259, 2009.
[5] C. Wu, J. Stoehr and C. P. Robert. Faster Hamiltonian Monte Carlo by Learning Leapfrog Scale. arXiv preprint arXiv:1810.04449, 2018.
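The longest-batch quantity L_ε(θ, v) used by eHMC can be sketched by running the leapfrog integrator until the U-turn condition first holds — a sketch assuming an identity mass matrix (so M⁻¹v = v), a standard-normal target, and a safety cap that is not part of the poster's definition:

```python
import numpy as np

def grad_U(theta):
    # Standard-normal target: U(theta) = |theta|^2 / 2.
    return theta

def longest_batch(theta, v, eps, max_steps=10_000):
    # Number of leapfrog steps until the trajectory first makes a U-turn,
    # i.e. (theta_l - theta_0) . v_l < 0 (identity mass matrix assumed).
    theta0 = theta.copy()
    theta, v = theta.copy(), v.copy()
    for l in range(1, max_steps + 1):
        r = v - eps / 2 * grad_U(theta)
        theta = theta + eps * r
        v = r - eps / 2 * grad_U(theta)
        if (theta - theta0) @ v < 0:
            return l
    return max_steps  # safety cap, not part of the definition

# Tuning phase (simplified: independent draws stand in for chain states):
# record longest batches, then draw leapfrog lengths from their
# empirical distribution, as eHMC does at each iteration.
rng = np.random.default_rng(1)
batches = [longest_batch(rng.standard_normal(2), rng.standard_normal(2), eps=0.1)
           for _ in range(200)]
L_draw = rng.choice(batches)
```

Drawing L from the empirical distribution, rather than fixing it, is what makes the resulting kernel a mixture of HMC kernels that all leave π invariant.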