QMC tutorial 1
Quasi-Monte Carlo
A tutorial introduction
Art B. Owen
Stanford University
MCQMC 2018 will be in Rennes, July 1–8 2018
SAMSI Workshop on QMC, August 2017
QMC tutorial 2
MC and QMC and other points
Monte Carlo (MC) and quasi-Monte Carlo (QMC) methods come down to
sampling the input space of a function.
[Figure: eight point sets in the unit square.
Top row, left to right: MC, grid, and two QMC methods.
Bottom row, left to right: two more QMC, sparse grid, blue noise.]
QMC tutorial 3
Outline
1) What QMC is (utmost stratification)
2) Why it works (discrepancy, variation and Koksma-Hlawka)
3) How it works (constructions)
4) Randomized QMC
5) When it works best (effective dimension, tractability, weighted spaces)
(room for Bayesian thinking here)
6) (maybe) QMC ∩ MCMC
7) (maybe) QMC ∩ Particles
QMC tutorial 4
Landmark papers in MC
Some landmark papers where Monte Carlo was applied:
• Physics Metropolis et al. (1953)
• Chemistry (reaction equations) Gillespie (1977)
• Financial valuation Boyle (1977)
• Bootstrap resampling Efron (1979)
• OR (discrete event simulation) Tocher & Owen (1960)
• Bayes (maybe 5 landmarks in early days)
• Nonsmooth optimization Kirkpatrick et al. (1983)
• Computer graphics (path tracing) Kajiya (1988)
QMC tutorial 5
Landmark uses of QMC
• Particle transport methods in physics / medical imaging Jerome Spanier++
• Financial valuation, some early examples Paskov & Traub 1990s
• Graphical rendering Alex Keller++
(They got an Oscar!)
• Solving PDEs Frances Kuo, Christoph Schwab++, 2015
• Particle methods Chopin & Gerber (2015)
The next landmark methods
Some strong candidate areas:
• machine learning
• Bayes
• uncertainty quantification (UQ)
Clementine Prieur’s tutorial is in UQ.
QMC tutorial 6
The next landmark
QMC methods dominate when we have to integrate a high dimensional function
that is nearly a sum of smooth functions of just a few of its input variables.
It is hard to tell a priori that this will happen. The original integrand might fail to be
smooth and yet it could be nearly a sum of smooth functions. Sloan, Kuo, Griebel
The best way to find out is to run QMC on some example problems.
QMC tutorial 7
MC and QMC
We estimate
µ = ∫_{[0,1]^d} f(x) dx  by  µ̂ = (1/n) ∑_{i=1}^n f(x_i),  x_i ∈ [0, 1]^d.
In plain MC, the x_i are IID U[0, 1]^d. In QMC they're 'spread evenly'.
Non-uniform
µ = ∫_Ω f(x) p(x) dx  and  µ̂ = (1/n) ∑_{i=1}^n f(ψ(u_i)),  u_i ∈ [0, 1]^s,
with ψ : [0, 1]^s → R^d. If u ∼ U[0, 1]^s then x = ψ(u) ∼ p.
Many methods fit this framework. Devroye (1986)
Acceptance-rejection remains awkward.
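A minimal sketch of this framework (my own example, not from the slides): ψ below is the inverse CDF of the Exp(1) distribution, and f is an illustrative integrand with E[f(X)] = 1/2.

```python
import numpy as np

rng = np.random.default_rng(0)

def psi(u):
    # Inverse CDF of Exp(1): maps u ~ U[0,1) to x ~ p.
    return -np.log1p(-u)

def f(x):
    return np.cos(x)   # E[cos(X)] = 1/2 for X ~ Exp(1)

n = 10**5
u = rng.random(n)            # plain MC: u_i IID U[0,1)
mu_hat = f(psi(u)).mean()    # mu_hat = (1/n) sum_i f(psi(u_i))
print(mu_hat)                # near 0.5; QMC would replace u by low-discrepancy points
```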
QMC tutorial 8
Illustration
[Figure: three point sets — Monte Carlo, Fibonacci lattice, and Hammersley sequence.
MC and two QMC methods in the unit square.]
MC points always have clusters and gaps. What is random is where they appear.
QMC points avoid clusters and gaps to the extent that mathematics permits.
QMC tutorial 9
Measuring uniformity
We need a way to verify that the points x_i are 'spread out' in [0, 1]^d.
The most fruitful way is to show that
U{x_1, x_2, . . . , x_n} ≐ U[0, 1]^d
Discrepancy
A discrepancy is a distance ‖F − F̂_n‖ between the measures
F = U[0, 1]^d and F̂_n = U{x_1, x_2, . . . , x_n}.
There are many discrepancies.
QMC tutorial 10
Local discrepancy
Did the box [0, a) get its fair share of points?
[Figure: n = 32 points in the unit square, with anchored boxes [0, a) for a = (0.6, 0.70) and [0, b) for b = (0.42, 0.45); the box [0, a) contains 13 of the 32 points.]
Local discrepancy at a, b:
δ(a) = F̂_n([0, a)) − F([0, a)) = 13/32 − 0.6 × 0.7 = −0.01375
Star discrepancy
D*_n = D*_n(x_1, . . . , x_n) = sup_{a∈[0,1)^d} |δ(a)|, i.e., ‖δ‖_∞
For d = 1 this is Kolmogorov-Smirnov.
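A brute-force sketch (mine, not from the slides) of the local discrepancy δ(a) and a grid approximation to D*_n; the true supremum needs anchors at the data coordinates, so the grid scan only approximates it.

```python
import numpy as np

def local_discrepancy(x, a):
    # delta(a) = Fhat_n([0, a)) - vol([0, a))
    n = x.shape[0]
    return np.all(x < a, axis=1).sum() / n - np.prod(a)

def star_discrepancy_grid(x, grid=64):
    # Approximate sup_a |delta(a)| over a regular grid of anchors a.
    d = x.shape[1]
    ticks = [np.linspace(0.0, 1.0, grid + 1)[1:]] * d
    anchors = np.stack(np.meshgrid(*ticks), axis=-1).reshape(-1, d)
    return max(abs(local_discrepancy(x, a)) for a in anchors)

rng = np.random.default_rng(1)
print(star_discrepancy_grid(rng.random((32, 2))))
```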
QMC tutorial 11
More discrepancies
D*_n = sup_{a∈[0,1)^d} |F̂_n([0, a)) − F([0, a))|
D_n = sup_{a,b∈[0,1)^d} |F̂_n([a, b)) − F([a, b))|
D*_n ≤ D_n ≤ 2^d D*_n
L^p discrepancies
D*_n^{(p)} = ( ∫_{[0,1)^d} |δ(a)|^p da )^{1/p},  e.g., Warnock
Also
Wrap-around discrepancies Hickernell
Discrepancies over (triangles, rotated rectangles, balls · · · convex sets · · · ).
Beck, Chen, Schmidt, Brandolini, Travaglini, Colzani, Gigante, Cools, Pillards
Best results are only for axis-aligned hyper-rectangles.
QMC tutorial 12
QMC’s law of large numbers
1) If f is Riemann integrable on [0, 1]^d, and
2) D*_n(x_1, . . . , x_n) → 0,
then
(1/n) ∑_{i=1}^n f(x_i) → ∫_{[0,1]^d} f(x) dx.
How fast?
MC has the CLT.
QMC has the Koksma-Hlawka inequality.
QMC tutorial 13
Koksma’s inequality
For d = 1:
|µ̂ − µ| ≤ D*_n(x_1, . . . , x_n) × ∫_0^1 |f′(x)| dx
NB: D*_n = ‖δ‖_∞ and ∫_0^1 |f′(x)| dx is the total variation V(f).
Setup for the proof
∫_0^1 f(x) dx = f(1) − ∫_0^1 x f′(x) dx    (integration by parts)
(1/n) ∑_{i=1}^n f(x_i) = f(1) − (1/n) ∑_{i=0}^n i (f(x_{i+1}) − f(x_i)),
with f(x_{i+1}) − f(x_i) = ∫_{x_i}^{x_{i+1}} f′(x) dx    (summation by parts)
A few more steps, using f′ continuous:
|µ − µ̂| = · · · = | ∫_0^1 δ(x) f′(x) dx | ≤ ‖δ‖_∞ ‖f′‖_1 = D*_n × V(f)
QMC tutorial 14
Koksma-Hlawka theorem
| (1/n) ∑_{i=1}^n f(x_i) − ∫_{[0,1)^d} f(x) dx | ≤ D*_n × V_HK(f)
VHK is the total variation in the sense of Hardy (1905) and Krause (1903)
Multidimensional variation has a few surprises for us.
Puzzler
Is this a 100% confidence interval?
QMC tutorial 15
Rates of convergence
It is possible to get D*_n = O(log(n)^{d−1}/n).
Then
|µ̂ − µ| = o(n^{−1+ε}) vs O_p(n^{−1/2}) for MC.
What about those logs?
Maybe log(n)^{d−1}/n ≫ 1/√n.
Low effective dimension (later) counters them
As do some randomizations (later)
Roth (1954)
D*_n = o(log(n)^{(d−1)/2}/n) is unattainable.
The gap between log(n)^{(d−1)/2} and log(n)^{d−1} is subject to continued work.
E.g., Lacey, Bilyk
QMC tutorial 16
Tight and loose bounds
They are not mutually exclusive.
Koksma-Hlawka is tight
|µ̂ − µ| ≤ (1 − ε) D*_n(x_1, . . . , x_n) × V_HK(f) fails for some f.
KH holds as an equality for a worst case function, e.g., f′ ≐ ±δ.
It even holds if an adversary sees your x_i before picking f.
Koksma-Hlawka is also very loose
It can greatly over-estimate the actual error. Usually δ and f′ are dissimilar:
µ̂ − µ = −⟨δ, f′⟩
Just like Chebychev’s inequality
It is also tight and very loose. E.g., Pr(|N(0, 1)| ≥ 10) ≤ 0.01 is loose.
Yes: 1.5 × 10^{−23} ≤ 10^{−2}.
QMC tutorial 17
Variation
Multidimensional Hardy-Krause variation has surprises for us. O (2005)
f(x_1, x_2) = 1 if x_1 + x_2 ≤ 1/2, and 0 else  ⟹  V_HK(f) = ∞ on [0, 1]²
Yet V_HK(f̃) < ∞ for some f̃ with ‖f − f̃‖_1 < ε.
Cusps
For general a ∈ R^d:
f = max(aᵀx, 0)  ⟹  V_HK(f) = ∞ for d ≥ 3
f = max(aᵀx, 0)²  ⟹  V_HK(f) = ∞ for d ≥ 4
QMC-friendly discontinuities
Axis-parallel discontinuities may have V_HK < ∞.
Used by e.g., X. Wang, I. Sloan, Z. He
QMC tutorial 18
Extensibility
For d = 1, the equispaced points x_i = (i − 1/2)/n have D*_n = 1/(2n). Best possible.
[Figure: n equispaced points on [0, 1].]
But where do we put the n+1’st point?
We cannot get D*_n = O(1/n) along a sequence x_1, x_2, . . . .
Extensible sequences
Take first n points of x1, x2, x3, . . . , xn, xn+1, xn+2, . . . .
Then we can get D*_n = O((log n)^d/n).
No known extensible constructions get O((log n)^{d−1}/n).
QMC tutorial 19
van der Corput
[Figure: the first n van der Corput points for n = 1, . . . , 17.]
QMC tutorial 20
van der Corput
i    i base 2    reflected    φ₂(i)    decimal
1    1           0.1          1/2      0.5
2    10          0.01         1/4      0.25
3    11          0.11         3/4      0.75
4    100         0.001        1/8      0.125
5    101         0.101        5/8      0.625
6    110         0.011        3/8      0.375
7    111         0.111        7/8      0.875
8    1000        0.0001       1/16     0.0625
9    1001        0.1001       9/16     0.5625
Take x_i = φ₂(i). Extensible, with D*_n = O(log(n)/n).
Commonly x_i = φ₂(i − 1), which starts at x_1 = 0.
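A sketch of the radical inverse φ_b that generates the table above (base b = 2 gives van der Corput):

```python
def radical_inverse(i, b=2):
    # Reflect the base-b digits of i about the radix point.
    x, w = 0.0, 1.0 / b
    while i > 0:
        i, digit = divmod(i, b)
        x += digit * w
        w /= b
    return x

for i in range(1, 10):
    print(i, radical_inverse(i))   # 0.5, 0.25, 0.75, 0.125, 0.625, ...
```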
QMC tutorial 21
Halton sequences
The van der Corput trick works for any base. Use bases 2, 3, 5, 7, . . .
[Figure: 72 Halton points, 864 Halton points, and 864 random points.
Halton sequence in the unit square.]
Via base b digital expansions:
i = ∑_{k=0}^K a_{ik} b^k  →  φ_b(i) ≡ ∑_{k=0}^K a_{ik} b^{−1−k}
x_i = (φ_2(i), φ_3(i), . . . , φ_p(i))
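A sketch (my code, for illustration) of Halton points built from the radical inverse, one prime base per coordinate:

```python
def radical_inverse(i, b):
    x, w = 0.0, 1.0 / b
    while i > 0:
        i, digit = divmod(i, b)
        x += digit * w
        w /= b
    return x

def first_primes(d):
    primes, c = [], 2
    while len(primes) < d:
        if all(c % p for p in primes):
            primes.append(c)
        c += 1
    return primes

def halton(n, d):
    # x_i = (phi_2(i), phi_3(i), ..., phi_p(i)) using the first d primes
    bases = first_primes(d)
    return [[radical_inverse(i, b) for b in bases] for i in range(1, n + 1)]

pts = halton(72, 2)   # the 72 Halton points in the unit square plotted above
```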
QMC tutorial 22
Digital nets
Halton sequences are balanced if n is a multiple of 2^a and 3^b and 5^c . . .
Digital nets use just one base b =⇒ balance all margins equally.
Elementary intervals
[Figure: some elementary intervals in base 5.]
QMC tutorial 23
Digital nets
E = ∏_{j=1}^s [ a_j/b^{k_j}, (a_j + 1)/b^{k_j} ),  0 ≤ a_j < b^{k_j}
(0, m, s)-net
n = b^m points in [0, 1)^s. If vol(E) = 1/n then E has exactly one of the n points.
e.g. Faure (1982) points, prime base b ≥ s
(t, m, s)-net
If E deserves b^t points it gets b^t points. Integer t ≥ 0.
e.g. Sobol' (1967) points, base 2
Smaller t is better (but a construction might not exist).
minT project
Schürer & Schmid give bounds on t given b, m and s
Monographs
Niederreiter (1992) Dick & Pillichshammer (2010)
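A brute-force sketch (mine) that checks the (0, m, s)-net property by counting points in every elementary interval of volume b^{−m}; only feasible for small b, m, s:

```python
import itertools
import numpy as np

def is_0_m_s_net(x, b, m):
    n, s = x.shape                    # n should equal b**m
    for ks in itertools.product(range(m + 1), repeat=s):
        if sum(ks) != m:              # splits of m digits across the s axes
            continue
        scale = [b**k for k in ks]    # b**k_j cells along axis j
        cells = np.floor(x * scale).astype(int)
        _, counts = np.unique(cells, axis=0, return_counts=True)
        if len(counts) != n:          # some volume-1/n box has 0 or >1 points
            return False
    return True
```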
QMC tutorial 24
Example nets
[Figure: a (0,3,2) net and a (0,4,2) net.
Two digital nets in base 5.]
The (0, 4, 2)-net is a bivariate margin of a (0, 4, 5)-net.
The parent net has 5^4 = 625 points in [0, 1)^5.
It balances 43,750 elementary intervals.
Think of 43,750 control variates for 625 obs.
We should remove that diagonal striping artifact (later).
QMC tutorial 25
Digital net constructions
Write i = ∑_{k=0}^K a_{ik} b^k.
For x_{i1} ∈ [0, 1] take x_{i1} = ∑_{k=0}^K x_{i1k} b^{−1−k}, where
(x_{i10}, x_{i11}, . . . , x_{i1K})ᵀ = C^{(1)} (a_{i0}, a_{i1}, . . . , a_{iK})ᵀ mod b
for a generator matrix C^{(1)} = (C^{(1)}_{kl}) over Z_b.
Generally x_{ij} = C^{(j)} a_i mod b for i = 0, . . . , b^m − 1 and j = 1, . . . , s.
Good C^{(j)} give small t.
See Dick & Pillichshammer (2010), Niederreiter (1991)
Computational cost
About the same as a Tausworthe random number generator.
Base b = 2 offers some advantages.
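A sketch of this construction in base b = 2 (my example matrices: the identity gives van der Corput in the first coordinate and the digit-reversal matrix gives i/n, so the pair reproduces the two-dimensional Hammersley net):

```python
import numpy as np

def digital_net_b2(C_list, m):
    # C_list: one m x m generator matrix over GF(2) per coordinate
    n, s = 2**m, len(C_list)
    x = np.zeros((n, s))
    w = 0.5 ** np.arange(1, m + 1)                       # weights 2^-1, ..., 2^-m
    for i in range(n):
        a = np.array([(i >> k) & 1 for k in range(m)])   # base-2 digits of i
        for j, C in enumerate(C_list):
            x[i, j] = ((C @ a) % 2) @ w                  # x_ij = C^(j) a_i mod 2
    return x

m = 4
I = np.eye(m, dtype=int)
J = np.fliplr(np.eye(m, dtype=int))    # digit reversal
pts = digital_net_b2([I, J], m)        # 16 Hammersley points, a (0, 4, 2)-net
```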
QMC tutorial 26
Extensible nets
Nets can be extended to larger sample sizes.
(t, s)-sequence in base b
Infinite sequence of (t, m, s)-nets.
x_1, . . . , x_{b^m} | x_{b^m+1}, . . . , x_{2b^m} | · · · | x_{kb^m+1}, . . . , x_{(k+1)b^m} | · · ·
The 1st, 2nd, . . . , b'th blocks are each a (t, m, s)-net; together the first b of them form a (t, m+1, s)-net.
And recursively for all m ≥ t.
Examples
Sobol’ b = 2 Faure t = 0 Niederreiter & Xing b = 2 (mostly)
QMC tutorial 27
Lattices
The other main family of QMC points. An extensive literature, e.g., Sloan & Joe,
Kuo, Nuyens, Dick, Cools, Hickernell, Lemieux, L’Ecuyer· · · See L’Ecuyer’s
tutorial.
[Figure: lattice point sets with z = (1, 41), z = (1, 233), and z = (1, 253).
Some lattice rules for n = 377.]
Computation is like a congruential generator:
x_i = ( i/n, Z_2 i/n, Z_3 i/n, . . . , Z_d i/n ) (mod 1),  Z_j ∈ ℕ.
Choose Z = (1, Z_2, Z_3, . . . , Z_d) wisely.
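A sketch of a rank-1 lattice rule; n = 377 with z = (1, 233) is one of the Fibonacci choices plotted above:

```python
import numpy as np

def rank1_lattice(n, z):
    i = np.arange(n).reshape(-1, 1)
    return (i * np.asarray(z) / n) % 1.0   # x_i = (i * z / n) mod 1

pts = rank1_lattice(377, (1, 233))
```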
QMC tutorial 28
QMC error estimation
|µ̂ − µ| ≤ D*_n × V_HK(f)
Not a 100% confidence interval:
• D*_n is hard to compute
• V_HK is harder to get than µ
• V_HK = ∞ is common, e.g., f(x_1, x_2) = 1{x_1 + x_2 ≤ 1}
Koksma-Hlawka then gives |µ̂ − µ| ≤ ∞
(maybe we already knew)
Also
Koksma-Hlawka is worst case. It can be very conservative.
Recent work
GAIL project of Hickernell++ allows a user-specified error tolerance ε.
QMC tutorial 29
Randomized QMC
1) Make x_i ∼ U[0, 1)^d individually,
2) keeping D*_n(x_1, . . . , x_n) = O(n^{−1+ε}) collectively.
R independent replicates
µ̂ = (1/R) ∑_{r=1}^R µ̂_r,   V̂ar(µ̂) = (1/(R(R−1))) ∑_{r=1}^R (µ̂_r − µ̂)²
If V_HK(f) < ∞ then E((µ̂ − µ)²) = O(n^{−2+ε}).
Random shift Cranley & Patterson (1976)
Scrambled nets O (1995,1997,1998)
Survey in L’Ecuyer & Lemieux (2005)
QMC tutorial 30
Rotation modulo 1
[Figure: "Before", "Shift", and "After" panels.
Cranley-Patterson rotation.]
Shift the points by u ∼ U[0, 1)^s with wraparound:
x_i → x_i + u (mod 1).
Commonly used on lattice rules.
Can also be used with nets.
At least it removes x_1 = 0.
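A sketch combining slides 29 and 30 (my code): R independent Cranley-Patterson shifts of one lattice rule, with the replicate-based variance estimate.

```python
import numpy as np

def rqmc_cranley_patterson(f, n, z, R, rng):
    i = np.arange(n).reshape(-1, 1)
    base = (i * np.asarray(z) / n) % 1.0         # fixed lattice points
    mu_r = np.empty(R)
    for r in range(R):
        u = rng.random(len(z))                   # one uniform shift per replicate
        mu_r[r] = f((base + u) % 1.0).mean()     # x_i + u (mod 1)
    mu_hat = mu_r.mean()
    var_hat = ((mu_r - mu_hat) ** 2).sum() / (R * (R - 1))
    return mu_hat, var_hat

rng = np.random.default_rng(2)
f = lambda x: np.prod(12 * (x - 0.5) ** 2, axis=1)   # test integrand with mu = 1
print(rqmc_cranley_patterson(f, 377, (1, 233), 10, rng))
```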
QMC tutorial 31
Digit scrambling
1) Chop space into b slabs. Shuffle.
2) Repeat within each of b slabs.
3) Then within b² sub-slabs.
4) Ad infinitum: b³, b⁴, . . .
5) And the same for all s coordinates.
Each x_i ∼ U[0, 1)^s and x_1, . . . , x_n is still a net (a.s.). O (1995)
QMC tutorial 32
Digit scrambling
Mathematically it is a permutation operation on the base b digits of x_{ij}:
x_{ij} = ∑_{k=1}^K a_{ijk} b^{−k}  →  x̃_{ij} = ∑_{k=1}^K ã_{ijk} b^{−k}
The mapping a_{ijk} → ã_{ijk} = π_{jk}(a_{ijk}) preserves the net property.
In the previous 'nested' scramble, π_{jk} also depends on a_{ij1}, . . . , a_{ij,k−1}.
Simpler scrambles
Random linear permutations:
a → g + h × a mod p (prime p = b),
g ∼ U{0, 1, . . . , b − 1}, h ∼ U{1, 2, . . . , b − 1}
Matousek (1998) has nested linear permutations.
Digital shift
a_{ijk} → a_{ijk} + g_{ij} (mod p)
For b = 2: x_i → x̃_i = x_i ⊕ u (bitwise XOR)
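A sketch of the b = 2 digital shift via XOR on the first K bits (my code; in practice x would be digital net points):

```python
import numpy as np

def digital_shift_b2(x, rng, K=32):
    scale = float(2**K)
    xi = (x * scale).astype(np.uint64)                           # first K binary digits
    u = rng.integers(0, 2**K, size=x.shape[1], dtype=np.uint64)  # one shift per coordinate
    return (xi ^ u) / scale                                      # x_i XOR u, back to [0, 1)

x = np.array([[0.0, 0.0], [0.5, 0.5], [0.25, 0.75], [0.75, 0.25]])  # a tiny (0,2,2)-net
print(digital_shift_b2(x, np.random.default_rng(3)))
```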
QMC tutorial 33
Example scrambles
Two components of the first 530 points of a Faure (0, 53)-net in base 53.
[Figure: three scrambles of the same points — "Digital shift", "Random linear", and "Nested uniform".
Randomized Faure points.]
The digital shift is much like a Cranley-Patterson rotation.
It uses just one random u for all points: x̃_i = x_i ⊕ u.
Random linear (Matousek 1998) and nested uniform (O 1995) scrambles have the same Var(µ̂).
QMC tutorial 34
Unscrambled Faure
First n = 11² = 121 points of the Faure (0, 11)-net in [0, 1]^11.
[Figure: two projections of 121 Faure points, plotting
x_3 − x_6 − x_8 + x_11 vs 2x_2 + x_3 − x_6 + x_8 − 2x_10 − x_11, and
x_1 − x_2 − x_8 + x_9 vs x_4 + x_6 − x_10 − x_11.]
Unscrambled points are very structured.
QMC tutorial 35
Sobol’ projections
Using code from Bratley & Fox (1988).
Implementations differ based on choices of ‘direction numbers’.
[Figure: x_2 versus x_1, x_38 versus x_37, and x_26 versus x_31.
Three projections of 1024 Sobol' points.]
At some higher sample size the empty blocks in x26 vs x31 fill in.
Expect empty blocks in higher order margins to last longer than those in low
dimensional margins.
QMC tutorial 36
Scrambled net properties
Using σ² = ∫ (f(x) − µ)² dx:

If                       Then                              N.B.
f ∈ L²                   Var(µ̂) = o(1/n)                   even if V_HK(f) = ∞
f ∈ L²                   Var(µ̂) ≤ Γ_{t,b,s} σ²/n           for t = 0, Γ ≤ exp(1) ≐ 2.718
∂^{1,2,...,s} f ∈ L²     Var(µ̂) = O(log(n)^{s−1}/n³)       O (1997, 2008)

Γ < ∞ rules out a (log n)^{s−1} catastrophe at finite n.
Loh (2003) has a CLT for t = 0 (and fully scrambled points).
Geometrically
Scrambling Faure breaks up the diagonal striping of the nets.
Scrambling Sobol’ points moves the full / empty blocks around.
Improved rate
RMSE is O(n^{−1/2}) better than the QMC rate (cancellation).
Holds for nested uniform and nested linear scrambles.
QMC tutorial 37
Scrambling vs shifting
Consider n = 2^m points in [0, 1).
QMC
van der Corput points (i − 1)/n for i = 1, . . . , n:
• · · · · · · · · · |• · · · · · · · · ·|• · · · · · · · · ·|• · · · · · · · · ·|
Shift
Shift all points by U ∼ U(0, 1) with wraparound. Get one point in each [(i − 1)/n, i/n):
· · · • · · · · · · |· · · • · · · · · ·|· · · • · · · · · ·|· · · • · · · · · ·|
Scramble
Get a stratified sample, independent x_i ∼ U[(i − 1)/n, i/n):
· · · • · · · · · · |· · · · · · · · • ·|· · · · · • · · · ·|· · • · · · · · · ·|
Random errors cancel, yielding an O(n^{−1/2}) improvement.
QMC tutorial 38
Higher order nets
Results from Dick, Baldeaux
Start with a net z_i ∈ [0, 1)^{2s}, i.e., 2s dimensions.
'Interleave' the digits of two variables to make a new one:
z_{i,2j} = 0.g_1 g_2 g_3 · · · and z_{i,2j+1} = 0.h_1 h_2 h_3 · · ·  →  x_{i,j} = 0.g_1 h_1 g_2 h_2 · · ·
Error is O(n^{−2+ε}) under increased smoothness: mixed partials up to ∂^{2s} f/(∂x_1² · · · ∂x_s²).
Scrambling gets RMSE O(n^{−2−1/2+ε}).
Even higher
Start with ks dimensions and interleave down to s.
Get O(n^{−k+ε}) and RMSE O(n^{−k−1/2+ε}) (under still higher smoothness).
Upshot
Very promising. Cost: many inputs and much smoothness.
Starting to be used in PDEs. Tutorial of Frances Kuo.
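A sketch of the digit interleaving step (my code), merging the first K base-b digits of two coordinates:

```python
def interleave(z0, z1, b=2, K=26):
    # x = 0.g1 h1 g2 h2 ... from z0 = 0.g1 g2 ... and z1 = 0.h1 h2 ...
    x, w = 0.0, 1.0 / b
    for _ in range(K):
        g = int(z0 * b); z0 = z0 * b - g     # next digit of z0
        h = int(z1 * b); z1 = z1 * b - h     # next digit of z1
        x += g * w; w /= b
        x += h * w; w /= b
    return x

print(interleave(0.5, 0.25))   # 0.1 and 0.01 in base 2 interleave to 0.1001 = 0.5625
```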
QMC tutorial 39
The curse of dimension
Curse of dimension: larger d makes integration harder.
C^r_M = { f : [0, 1]^d → ℝ : ‖ ∏_j ∂^{α_j}/∂x_j^{α_j} f ‖ ≤ M, ∑_j α_j = r, α_j ≥ 0 }
Bakhvalov I:
For any x_1, . . . , x_n ∈ [0, 1]^d there is f ∈ C^r_M with |µ̂_n − µ| ≥ k n^{−r/d}.
Ordinary QMC: like r = d.
Bakhvalov II:
Random points can't beat RMSE O(n^{−r/d−1/2}).
Ordinary MC: like r = 0.
QMC tutorial 40
What if we beat those rates?
Sometimes we get high accuracy for large d.
It does not mean we beat the curse of dimensionality.
Bakhvalov never promised universal failure.
Only the existence of hard cases.
We may have just had an easy, non-worst case function.
Two kinds of easy
• Truncation: only the first s ≪ d components of x matter
• Superposition: the components only matter “s at a time”
Either way
f might not be “fully d-dimensional”.
QMC tutorial 41
Dimensional decomposition
For u = {j_1, j_2, . . . , j_r} ⊂ 1:d ≡ {1, 2, . . . , d}, let
x_u = (x_{j_1}, . . . , x_{j_r}) and x_{i,u} = (x_{i,j_1}, . . . , x_{i,j_r}).
Via ANOVA or another method, write
f(x) = ∑_{u⊆1:d} f_u(x_u).
Then
µ̂ − µ = ∑_{u⊆1:d} [ (1/n) ∑_{i=1}^n f_u(x_{i,u}) − ∫ f_u(x_u) dx_u ]
|µ̂ − µ| ≤ ∑_{u⊆1:d} D*_n(x_{1,u}, . . . , x_{n,u}) × ‖f_u‖
Often D*_n(x_{i,u}) ≪ D*_n(x_i) for small |u|. If also ‖f_u‖ is small for large u,
then all the terms are small.
QMC tutorial 42
Studying the good cases
Two main tools to describe it
• Weighted spaces and tractability
• ANOVA and effective dimension
Implications
Neither causes the curse to be lifted. They describe the happy circumstance
where the curse did not apply.
Both leave important gaps, described below. I'll raise them as an open problem later.
QMC tutorial 43
Weighted spaces
Hickernell (1996), Sloan & Wozniakowski (1998),
Dick, Kuo, Novak, Wasilkowski, many more
See tutorial by Hickernell
∂^u ≡ ∏_{j∈u} ∂/∂x_j; assume ∂^{1:d} f exists.
Inner product, weights γ_u > 0:
‖f‖²_γ = ∑_{u⊆1:d} (1/γ_u) ∫_{[0,1]^u} ( ∫_{[0,1]^{−u}} ∂^u f(x) dx_{−u} )² dx_u
Function ball: B_{γ,C} = {f | ‖f‖_γ ≤ C}.
Small γ_u ⟹ only small ∂^u f in the ball.
Product weights
γ_u = ∏_{j∈u} γ_j where the γ_j decrease rapidly with j.
Now f ∈ B_{γ,C} implies ∂^u f is small when |u| is large.
Many more choices: Dick, Kuo, Sloan (2013)
QMC tutorial 44
ANOVA and effective dimension
Caflisch, Morokoff & O (1997)
ANOVA: f(x) = ∑_{u⊆1:d} f_u(x)
Often f is dominated by its low order interactions.
Then RQMC may make a huge improvement.
Let σ²_u = Var(f_u), the variance component for u.
Truncation dim.: the smallest s ≤ d with ∑_{u⊆1:s} σ²_u ≥ 0.99 ∑_{u⊆1:d} σ²_u
Superposition dim.: the smallest s ≤ d with ∑_{|u|≤s} σ²_u ≥ 0.99 ∑_{u⊆1:d} σ²_u
Mean dimension: ∑_u |u| σ²_u / ∑_u σ²_u
Liu & O (2006). Easier to estimate via Sobol' indices.
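A sketch (mine, not from the slides) estimating the mean dimension by summing Jansen-type Sobol' total indices, using τ̄²_j = ∑_{u∋j} σ²_u = (1/2) E[(f(x) − f(x with x_j resampled))²] and ∑_j τ̄²_j = ∑_u |u| σ²_u:

```python
import numpy as np

def mean_dimension(f, d, n, rng):
    x = rng.random((n, d))
    z = rng.random((n, d))
    fx = f(x)
    total = 0.0
    for j in range(d):
        y = x.copy()
        y[:, j] = z[:, j]                        # resample coordinate j only
        total += 0.5 * np.mean((fx - f(y))**2)   # estimates sum over u containing j of sigma^2_u
    return total / fx.var()

rng = np.random.default_rng(4)
f = lambda x: x.sum(axis=1)                      # additive, so mean dimension = 1
print(mean_dimension(f, 6, 10**5, rng))          # close to 1.0
```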
QMC tutorial 45
Open problem
• ANOVA captures magnitude of the low dimensional parts but not their
smoothness. Even when it verifies that f is dominated by low dimensional
parts, it does not assure small |µ̂ − µ|.
• Weighted space models assure accurate estimation at least asymptotically.
However, it is not easy to decide which weighted space to use.
• Given a weighted space, there are algorithms to tune QMC points for it.
• ANOVA approaches may support an approach to choosing good weights for a
given problem, building on Sobol’ indices Sobol’++, Saltelli++, Prieur++,
Kucherenko++ or active subspaces Constantine++
The problems are about how to combine the approaches and when / whether a
resulting adaptive algorithm will be effective.
QMC tutorial 46
Description vs engineering
Effective dimension and weighted spaces both describe settings where f can be
integrated well.
Some authors try to engineer changes in f to make it more suited to QMC.
E.g., cram importance into first few components of x.
MC vs QMC
MC places lots of effort on variance reduction.
For QMC we gain by reducing effective dimension.
Or Hardy-Krause variation (but there the asymptotics are slow).
E.g., when turning u ∼ U[0, 1]^d into z ∼ N(0, Σ),
the choice of Σ^{1/2} affects QMC performance.
Caflisch, Morokoff & O (1997)
Acworth, Broadie & Glasserman (1998)
Imai & Tan (2014)
The best Σ^{1/2} can depend strongly on f. Papageorgiou (2002)
QMC tutorial 47
Sampling Brownian motion
Feynman-Kac/Brownian bridge
First few variables define a ‘skeleton’. The rest fill in.
[Figure: a Brownian motion path B(t) on 0 ≤ t ≤ 1 built by successive refinement.
Brownian bridge construction of Brownian motion.]
See also Mike Giles++ on multi-level MC.
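A sketch of the Brownian bridge construction (my code): the first Gaussian sets the endpoint and later ones fill in midpoints, so the earliest, best-equidistributed QMC coordinates carry most of the variance.

```python
import numpy as np
from scipy.stats import norm

def brownian_bridge_path(u, T=1.0):
    # u: one point in [0,1)^m with m a power of 2; returns B(t) on a grid.
    m = len(u)
    z = norm.ppf(u)                  # Gaussians from uniforms
    t = np.linspace(0.0, T, m + 1)
    B = np.zeros(m + 1)
    B[m] = np.sqrt(T) * z[0]         # endpoint ('skeleton') first
    k, used = m, 1
    while k > 1:
        h = k // 2
        for left in range(0, m, k):  # fill midpoints between known values
            mid, right = left + h, left + k
            sd = np.sqrt((t[mid] - t[left]) * (t[right] - t[mid]) / (t[right] - t[left]))
            B[mid] = 0.5 * (B[left] + B[right]) + sd * z[used]
            used += 1
        k = h
    return t, B

t, B = brownian_bridge_path(np.random.default_rng(5).random(16))
```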
QMC tutorial 48
QMC ∩ MCMC
• Replace IID ui inputs to MCMC by something better.
• Completely uniformly distributed (CUD).
Like entire period of a (small) RNG.
• Very early refs:
Chentsov (1967)
Finite state space. Uses ’inversion.’
Beautiful coupling argument. (Resembles coupling from the past.)
Sobol’ (1974)
Has an n × ∞ array of points x_{ij} ∈ [0, 1].
Samples from a row until a return to the start state, then goes to the next row.
Gets rate O(1/n) · · · if transition probabilities are a/2^b for integers a, b.
QMC tutorial 49
QMC ∩ MCMC
For MCMC we follow one particle.
1) Step x_i ← ψ(x_{i−1}, v_i) for v_i ∈ (0, 1)^d.
2) For v_1 = (u_1, . . . , u_d), v_2 = (u_{d+1}, . . . , u_{2d}), etc.,
n steps require u_1, . . . , u_{nd} ∈ (0, 1).
3) MC-MC uses ui ∼ U(0, 1)
4) MC-QMC uses ui CUD
Reasons for caution
1) We’re using 1 point in [0, 1]nd
with n → ∞
2) Our xi won’t be Markovian
QMC tutorial 50
MCMC ≈ QMCᵀ

Method   Rows           Columns
QMC      n points       d variables    1 ≤ d ≪ n → ∞
MCMC     r replicates   n steps        1 ≤ r ≪ n → ∞

QMC is based on equidistribution; MCMC is based on ergodicity.
QMC tutorial 51
Some results
1) Metropolis on CUD points consistent in finite state space O & Tribble (2005)
2) Weakly CUD random points ok Tribble & O (2008)
3) Consistency in continuous problems (Gibbs and Metropolis)
Chen, Dick & O (2010)
4) Chen (2011): rate O(n^{−1+ε}) for some smooth ARMAs
5) Mackey & Gorham find the best results are for µ = E(x_j).
QMC tutorial 52
Sequential QMC
JRSS-B discussion paper by Chopin & Gerber (2015)
N particles in R^d for T time steps.
Advance x_{nt} to x_{n,t+1} = φ(x_{nt}, u_{n,t+1}), each u ∈ [0, 1)^S.
Do them all with U_{t+1} ∈ [0, 1)^{N×S}. Usually IID.
For QMC
We want the rows of U_{t+1} to be 'balanced' wrt the positions of the x_t.
Easy if d = S = 1. Lécot & Tuffin (2004):
Take U_{t+1} ∈ [0, 1)^{N×2}.
Sort the first column along x_{n,t}. Update the points with the remaining column.
What to do if d > 1?
Alignment is tricky because R^d is not ordered.
Related work by L'Ecuyer, Lécot, L'Archevêque-Gaudet on array-RQMC:
d-dimensional alignments. Huge variance reductions. Limited theory.
QMC tutorial 53
Using Hilbert curves
These are space-filling curves as m → ∞:
[Figure: Hilbert curves for m = 1, 2, 3, 4.]
Chopin & Gerber thread a curve through the x_{n,t} ∈ R^d.
Align particles via the first column of RQMC points U_{t+1} ∈ [0, 1)^{N×(1+S)}.
Use the remaining S columns to advance them.
Results
Squared error is o(N^{−1}).
Good empirical results for modest d
NB:
I left out lots of details about particle weighting.
QMC tutorial 54
Thanks
• NSF support over the years, most recently:
DMS-1407397 and DMS-1521145.
• Co-leaders: Hickernell, Kuo, L’Ecuyer
• SAMSI, especially Ilse Ipsen
• Sue McDonald
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
 
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
 
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
 
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
 
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
 
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
 
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
 
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
 
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
 
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
 
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
 
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
 
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
 
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
 
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
 

Recently uploaded

This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.christianmathematics
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfagholdier
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxRamakrishna Reddy Bijjam
 
Food safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfFood safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfSherif Taha
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptxMaritesTamaniVerdade
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docxPoojaSen20
 
Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Association for Project Management
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfciinovamais
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseAnaAcapella
 
Dyslexia AI Workshop for Slideshare.pptx
Dyslexia AI Workshop for Slideshare.pptxDyslexia AI Workshop for Slideshare.pptx
Dyslexia AI Workshop for Slideshare.pptxcallscotland1987
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701bronxfugly43
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and ModificationsMJDuyan
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...Nguyen Thanh Tu Collection
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxheathfieldcps1
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxnegromaestrong
 
Third Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptxThird Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptxAmita Gupta
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Jisc
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibitjbellavia9
 

Recently uploaded (20)

This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
Food safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfFood safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdf
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docx
 
Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
 
Dyslexia AI Workshop for Slideshare.pptx
Dyslexia AI Workshop for Slideshare.pptxDyslexia AI Workshop for Slideshare.pptx
Dyslexia AI Workshop for Slideshare.pptx
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and Modifications
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
 
Third Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptxThird Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptx
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 

Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applied Mathematics Opening Workshop, Tutorial: Quasi-Monte Carlo Introduction - Art Owen, Aug 28, 2017

• 6. The next landmark (continued)
It is hard to tell a priori that this will happen. The original integrand might fail to be smooth and yet be nearly a sum of smooth functions. Sloan, Kuo, Griebel
The best way to find out is to run QMC on some example problems.
• 7. MC and QMC
We estimate $\mu = \int_{[0,1]^d} f(x)\,\mathrm{d}x$ by $\hat\mu = \frac1n \sum_{i=1}^n f(x_i)$, $x_i \in [0,1]^d$.
In plain MC the $x_i$ are IID $U[0,1]^d$. In QMC they are 'spread evenly'.
Non-uniform: $\mu = \int_\Omega f(x)\,p(x)\,\mathrm{d}x$ and $\hat\mu = \frac1n \sum_{i=1}^n f(\psi(u_i))$, $u_i \in [0,1]^s$, with $\psi : [0,1]^s \to \mathbb{R}^d$. If $u \sim U[0,1]^s$ then $x = \psi(u) \sim p$.
Many methods fit this framework; see Devroye (1986). Acceptance-rejection remains awkward. (A sketch follows.)
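The framework above is easy to state in code. A minimal sketch (not from the slides; the integrands and the dimension are illustrative choices), using the inverse Gaussian CDF as one concrete $\psi$:

```python
# Minimal sketch of the MC / transformed-MC framework above (illustrative
# integrands f and g; any integrable f on [0,1]^d fits the same pattern).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
d, n = 5, 10_000

# Plain MC on the unit cube: mu = int_{[0,1]^d} f(x) dx
f = lambda x: np.prod(x, axis=1)          # example integrand, true mu = 2^-d
x = rng.random((n, d))
mu_hat = f(x).mean()

# Non-uniform case: x = psi(u) ~ p for u ~ U[0,1]^s.
# Here psi = inverse Gaussian CDF maps U[0,1]^d points to N(0, I_d) points.
g = lambda z: (z ** 2).sum(axis=1)        # example, E g(z) = d for z ~ N(0, I)
z = norm.ppf(rng.random((n, d)))
mu_hat_gauss = g(z).mean()

print(mu_hat, 2.0 ** -d)
print(mu_hat_gauss, d)
```

Replacing `rng.random` by a QMC generator is the only change needed to turn either estimate into a QMC estimate.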
• 8. Illustration
[Figure: MC and two QMC methods in the unit square; panels: Monte Carlo, Fibonacci lattice, Hammersley sequence.]
MC points always have clusters and gaps; what is random is where they appear. QMC points avoid clusters and gaps to the extent that mathematics permits.
• 9. Measuring uniformity
We need a way to verify that the points $x_i$ are 'spread out' in $[0,1]^d$. The most fruitful way is to show that $U\{x_1, x_2, \dots, x_n\} \doteq U[0,1]^d$.
Discrepancy: a discrepancy is a distance $\|F - \hat F_n\|$ between the measures $F = U[0,1]^d$ and $\hat F_n = U\{x_1, x_2, \dots, x_n\}$. There are many discrepancies.
• 10. Local discrepancy
Did the box $[0, a)$ get its fair share of points?
[Figure: $n = 32$ points with anchor boxes at $a = (0.6, 0.70)$ and $b = (0.42, 0.45)$.]
Local discrepancy at $a$: $\delta(a) = \hat F_n([0,a)) - F([0,a)) = \frac{13}{32} - 0.6 \times 0.7 = -0.01375$.
Star discrepancy: $D_n^* = D_n^*(x_1, \dots, x_n) = \sup_{a \in [0,1)^d} |\delta(a)|$, i.e., $\|\delta\|_\infty$. For $d = 1$ this is Kolmogorov-Smirnov.
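These definitions translate directly into code. A small sketch (my own illustration, not from the slides); note the exact $D_n^*$ needs anchors at the data coordinates, so the grid search below only approximates it:

```python
# Sketch of local discrepancy delta(a) and a crude grid approximation of D*_n.
import numpy as np

def local_discrepancy(points, a):
    """delta(a) = Fn_hat([0,a)) - vol([0,a)) for points in [0,1)^d."""
    inside = np.all(points < a, axis=1)
    return inside.mean() - np.prod(a)

def star_discrepancy_grid(points, grid=50):
    """Approximate D*_n by maximizing |delta| over a grid of anchors a."""
    d = points.shape[1]
    axes = [np.linspace(0, 1, grid + 1)[1:]] * d
    anchors = np.stack(np.meshgrid(*axes), axis=-1).reshape(-1, d)
    return max(abs(local_discrepancy(points, a)) for a in anchors)

rng = np.random.default_rng(0)
pts = rng.random((32, 2))
print(local_discrepancy(pts, np.array([0.6, 0.7])))
print(star_discrepancy_grid(pts))
```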
• 11. More discrepancies
$D_n^* = \sup_{a \in [0,1)^d} \big|\hat F_n([0,a)) - F([0,a))\big|$ and $D_n = \sup_{a,b \in [0,1)^d} \big|\hat F_n([a,b)) - F([a,b))\big|$, with $D_n^* \le D_n \le 2^d D_n^*$.
$L^p$ discrepancies: $D_n^{*p} = \big(\int_{[0,1)^d} |\delta(a)|^p\,\mathrm{d}a\big)^{1/p}$, e.g., Warnock.
Also wrap-around discrepancies (Hickernell) and discrepancies over triangles, rotated rectangles, balls, ..., convex sets (Beck, Chen, Schmidt, Brandolini, Travaglini, Colzani, Gigante, Cools, Pillards).
The best results are only for axis-aligned hyper-rectangles.
• 12. QMC's law of large numbers
If 1) $f$ is Riemann integrable on $[0,1]^d$, and 2) $D_n^*(x_1, \dots, x_n) \to 0$, then $\frac1n \sum_{i=1}^n f(x_i) \to \int_{[0,1]^d} f(x)\,\mathrm{d}x$.
How fast? MC has the CLT. QMC has the Koksma-Hlawka inequality.
• 13. Koksma's inequality
For $d = 1$: $|\hat\mu - \mu| \le D_n^*(x_1, \dots, x_n) \times \int_0^1 |f'(x)|\,\mathrm{d}x$.
NB: $D_n^* = \|\delta\|_\infty$, and $\int_0^1 |f'(x)|\,\mathrm{d}x$ is the total variation $V(f)$.
Setup for the proof: $\int_0^1 f(x)\,\mathrm{d}x = f(1) - \int_0^1 x f'(x)\,\mathrm{d}x$ (integration by parts), and $\frac1n \sum_{i=1}^n f(x_i) = f(1) - \frac1n \sum_{i=0}^{n} i\,\big(f(x_{i+1}) - f(x_i)\big)$ with $f(x_{i+1}) - f(x_i) = \int_{x_i}^{x_{i+1}} f'(x)\,\mathrm{d}x$ (summation by parts).
A few more steps (using $f'$ continuous) give $|\mu - \hat\mu| = \cdots = \big|\int_0^1 \delta(x) f'(x)\,\mathrm{d}x\big| \le \|\delta\|_\infty \|f'\|_1 = D_n^* \times V(f)$.
• 14. Koksma-Hlawka theorem
$\Big|\frac1n \sum_{i=1}^n f(x_i) - \int_{[0,1)^d} f(x)\,\mathrm{d}x\Big| \le D_n^* \times V_{HK}(f)$
$V_{HK}$ is the total variation in the sense of Hardy (1905) and Krause (1903). Multidimensional variation has a few surprises for us.
Puzzler: is this a 100% confidence interval?
• 15. Rates of convergence
It is possible to get $D_n^* = O\big(\log(n)^{d-1}/n\big)$. Then $|\hat\mu - \mu| = o(n^{-1+\epsilon})$, vs $O_p(n^{-1/2})$ for MC.
What about those logs? Maybe $\log(n)^{d-1}/n \gg 1/\sqrt{n}$. Low effective dimension (later) counters them, as do some randomizations (later).
Roth (1954): $D_n^* = o\big(\log(n)^{(d-1)/2}/n\big)$ is unattainable. The gap between $\log(n)^{(d-1)/2}$ and $\log(n)^{d-1}$ is subject to continued work, e.g., Lacey, Bilyk.
• 16. Tight and loose bounds
They are not mutually exclusive.
Koksma-Hlawka is tight: $|\hat\mu - \mu| \le (1-\epsilon)\, D_n^*(x_1, \dots, x_n) \times V_{HK}(f)$ fails for some $f$. KH holds as an equality for a worst-case function, e.g., $f \doteq \pm\delta$. It even holds if an adversary sees your $x_i$ before picking $f$.
Koksma-Hlawka is also very loose: it can greatly over-estimate the actual error. Usually $\delta$ and $f$ are dissimilar; recall $\hat\mu - \mu = -\langle \delta, f' \rangle$.
Just like Chebychev's inequality, it is both tight and very loose. E.g., $\Pr(|N(0,1)| \ge 10) \le 0.01$ is loose. Yes: $1.5 \times 10^{-23} \le 10^{-2}$.
• 17. Variation
Multidimensional Hardy-Krause variation has surprises for us. O (2005):
$f(x_1, x_2) = 1$ if $x_1 + x_2 \le 1/2$, else $0$, has $V_{HK}(f) = \infty$ on $[0,1]^2$, yet $V_{HK}(\tilde f) < \infty$ for some $\tilde f$ with $\|f - \tilde f\|_1 < \epsilon$.
Cusps: for general $a \in \mathbb{R}^d$, $f = \max(a^{\mathsf T} x, 0)$ gives $V_{HK}(f) = \infty$ for $d \ge 3$, and $f = \max(a^{\mathsf T} x, 0)^2$ gives $V_{HK}(f) = \infty$ for $d \ge 4$.
QMC-friendly discontinuities: axis-parallel discontinuities may have $V_{HK} < \infty$. Used by, e.g., X. Wang, I. Sloan, Z. He.
• 18. Extensibility
For $d = 1$, the equispaced points $x_i = (i - 1/2)/n$ have $D_n^* = \frac{1}{2n}$. Best possible.
But where do we put the $(n+1)$st point? We cannot get $D_n^* = O(1/n)$ along a single sequence $x_1, x_2, \dots$.
Extensible sequences: take the first $n$ points of $x_1, x_2, x_3, \dots$. Then we can get $D_n^* = O((\log n)^d/n)$. No known extensible constructions get $O((\log n)^{d-1}/n)$.
• 19. van der Corput
[Figure: the first $n = 1, \dots, 17$ van der Corput points, progressively filling $[0,1)$.]
• 20. van der Corput
  i   base 2   reflected   phi_2(i)
  1        1   0.1         1/2  = 0.5
  2       10   0.01        1/4  = 0.25
  3       11   0.11        3/4  = 0.75
  4      100   0.001       1/8  = 0.125
  5      101   0.101       5/8  = 0.625
  6      110   0.011       3/8  = 0.375
  7      111   0.111       7/8  = 0.875
  8     1000   0.0001      1/16 = 0.0625
  9     1001   0.1001      9/16 = 0.5625
Take $x_i = \phi_2(i)$. Extensible, with $D_n^* = O(\log(n)/n)$. Commonly $x_i = \phi_2(i-1)$, which starts at $x_1 = 0$.
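The table is exactly the radical inverse function. A minimal sketch of $\phi_b$ (my own code, not Owen's):

```python
# Radical inverse phi_b: reflect the base-b digits of i about the radix point.
def radical_inverse(i, b=2):
    x, denom = 0.0, 1.0
    while i > 0:
        i, digit = divmod(i, b)   # peel off the least significant base-b digit
        denom *= b
        x += digit / denom        # place it after the radix point
    return x

# Reproduce the table: phi_2(1..9) = 0.5, 0.25, 0.75, 0.125, 0.625, ...
print([radical_inverse(i, 2) for i in range(1, 10)])
```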
• 21. Halton sequences
The van der Corput trick works for any base. Use bases 2, 3, 5, 7, ....
[Figure: 72 Halton points, 864 Halton points, and 864 random points in the unit square.]
Via base-$b$ digital expansions: $i = \sum_{k=0}^K a_{ik} b^k \;\longrightarrow\; \phi_b(i) \equiv \sum_{k=0}^K a_{ik} b^{-1-k}$, and $x_i = (\phi_2(i), \phi_3(i), \dots, \phi_p(i))$.
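Stacking the radical inverse over the first few primes gives Halton points. A small sketch reusing `radical_inverse` from the block above; library versions (e.g., `scipy.stats.qmc.Halton`) add scrambling to fight the correlations that appear between large prime bases:

```python
# Halton points: one prime base per coordinate axis.
import numpy as np

PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def halton(n, d):
    assert d <= len(PRIMES)
    return np.array([[radical_inverse(i, PRIMES[j]) for j in range(d)]
                     for i in range(1, n + 1)])

pts = halton(72, 2)   # compare with the 72-point panel above
```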
• 22. Digital nets
Halton sequences are balanced if $n$ is a multiple of $2^a$ and $3^b$ and $5^c \dots$. Digital nets use just one base $b$, balancing all margins equally.
Elementary intervals: [Figure: some elementary intervals in base 5.]
• 23. Digital nets
$E = \prod_{j=1}^s \Big[\frac{a_j}{b^{k_j}}, \frac{a_j + 1}{b^{k_j}}\Big), \quad 0 \le a_j < b^{k_j}$
(0, m, s)-net: $n = b^m$ points in $[0,1)^s$. If $\mathrm{vol}(E) = 1/n$ then $E$ has one of the $n$ points. E.g., Faure (1982) points, prime base $b \ge s$.
(t, m, s)-net: if $E$ deserves $b^t$ points it gets $b^t$ points, for an integer $t \ge 0$. E.g., Sobol' (1967) points, base 2. Smaller $t$ is better (but a construction might not exist).
The minT project of Schürer & Schmid gives bounds on $t$ given $b$, $m$ and $s$.
Monographs: Niederreiter (1992), Dick & Pillichshammer (2010).
• 24. Example nets
[Figure: a (0,3,2)-net and a (0,4,2)-net, both in base 5.]
The (0,4,2)-net is a bivariate margin of a (0,4,5)-net. The parent net has $5^4 = 625$ points in $[0,1)^5$ and balances 43,750 elementary intervals. Think of 43,750 control variates for 625 observations.
We should remove that diagonal striping artifact (later).
• 25. Digital net constructions
Write $i = \sum_{k=0}^K a_{ik} b^k$. For $x_{i1} \in [0,1]$ take $x_{i1} = \sum_{k=0}^K x_{i1k}\, b^{-1-k}$, where
$(x_{i10}, x_{i11}, \dots, x_{i1K})^{\mathsf T} = C^{(1)} (a_{i0}, a_{i1}, \dots, a_{iK})^{\mathsf T} \bmod b.$
Generally $x_{ij} = C^{(j)} a_i \bmod b$ for $i = 0, \dots, b^m - 1$ and $j = 1, \dots, s$. Good $C^{(j)}$ give small $t$; see Dick & Pillichshammer (2010), Niederreiter (1992).
Computational cost: about the same as a Tausworthe random number generator. Base $b = 2$ offers some advantages. (A base-2 sketch follows.)
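A minimal base-2 sketch of this construction (my own illustration, not production Sobol' code): with $C^{(1)} = I$ (which reproduces van der Corput) and $C^{(2)}$ the Pascal matrix mod 2 (the base-2 Faure choice), the two coordinates form a (0, m, 2)-net.

```python
# Digital net in base 2: the digit vector of x_ij is C^(j) a_i mod 2.
import numpy as np
from math import comb

def digital_net_2d(m):
    n = 2 ** m
    C1 = np.eye(m, dtype=int)                                 # van der Corput
    C2 = np.array([[comb(c, r) % 2 for c in range(m)]          # Pascal mod 2
                   for r in range(m)])
    pts = np.empty((n, 2))
    pow2 = 2.0 ** -(np.arange(m) + 1)                          # digit weights
    for i in range(n):
        a = np.array([(i >> k) & 1 for k in range(m)])         # base-2 digits
        pts[i, 0] = ((C1 @ a) % 2) @ pow2
        pts[i, 1] = ((C2 @ a) % 2) @ pow2
    return pts

pts = digital_net_2d(4)   # 16 points; every 1/16-area elementary box gets one
```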
• 26. Extensible nets
Nets can be extended to larger sample sizes.
A (t, s)-sequence in base $b$ is an infinite sequence of (t, m, s)-nets: $x_1, \dots, x_{b^m}$ is the first net, $x_{b^m+1}, \dots, x_{2b^m}$ the second, ..., $x_{kb^m+1}, \dots, x_{(k+1)b^m}$ and so on; $b$ consecutive (t, m, s)-nets together form a (t, m+1, s)-net, recursively for all $m \ge t$.
Examples: Sobol' ($b = 2$), Faure ($t = 0$), Niederreiter & Xing ($b = 2$, mostly).
• 27. Lattices
The other main family of QMC points. An extensive literature, e.g., Sloan & Joe, Kuo, Nuyens, Dick, Cools, Hickernell, Lemieux, L'Ecuyer. See L'Ecuyer's tutorial.
[Figure: three lattice rules for $n = 377$, with $z = (1, 41)$, $z = (1, 233)$ and $z = (1, 253)$.]
Computation is like congruential generators: $x_i = \big(\frac{i}{n}, \frac{Z_2 i}{n}, \frac{Z_3 i}{n}, \dots, \frac{Z_d i}{n}\big) \pmod 1$ with $Z_j \in \mathbb{N}$; choose $Z = (1, Z_2, Z_3, \dots, Z_d)$ wisely.
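A rank-1 lattice is essentially one line of code. A sketch (mine, not from the slides); $Z = (1, 233)$ with $n = 377$ is the Fibonacci lattice of the earlier illustration, and good generating vectors in higher dimensions are usually found by component-by-component search:

```python
# Rank-1 lattice rule: x_i = (i * Z mod n) / n.
import numpy as np

def rank1_lattice(n, Z):
    i = np.arange(n).reshape(-1, 1)
    return (i * np.asarray(Z)) % n / n

pts = rank1_lattice(377, (1, 233))   # Fibonacci lattice, n = F_14, Z_2 = F_13
```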
• 28. QMC error estimation
$|\hat\mu - \mu| \le D_n^* \times V_{HK}(f)$ is not a 100% confidence interval:
• $D_n^*$ is hard to compute,
• $V_{HK}$ is harder to get than $\mu$,
• $V_{HK} = \infty$ is common, e.g., $f(x_1, x_2) = 1_{x_1 + x_2 \le 1}$, where Koksma-Hlawka gives $|\hat\mu - \mu| \le \infty$ (maybe we already knew).
Also, Koksma-Hlawka is worst case. It can be very conservative.
Recent work: the GAIL project of Hickernell++ allows a user-specified error tolerance $\epsilon$.
• 29. Randomized QMC
1) Make $x_i \sim U[0,1)^d$ individually, 2) keeping $D_n^*(x_1, \dots, x_n) = O(n^{-1+\epsilon})$ collectively.
With $R$ independent replicates: $\hat\mu = \frac1R \sum_{r=1}^R \hat\mu_r$ and $\widehat{\mathrm{Var}}(\hat\mu) = \frac{1}{R(R-1)} \sum_{r=1}^R (\hat\mu_r - \hat\mu)^2$.
If $V_{HK}(f) < \infty$ then $E\big((\hat\mu - \mu)^2\big) = O(n^{-2+\epsilon})$.
Random shift: Cranley & Patterson (1976). Scrambled nets: O (1995, 1997, 1998). Survey in L'Ecuyer & Lemieux (2005). (A sketch follows.)
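A sketch of replicated RQMC (my own code), applying Cranley-Patterson shifts to the lattice from the earlier block; the replicate variance formula is the one displayed above:

```python
# Replicated RQMC: R independent random shifts of one fixed QMC point set.
import numpy as np

def rqmc_estimate(f, n, Z, R=10, seed=0):
    rng = np.random.default_rng(seed)
    base = rank1_lattice(n, Z)                 # deterministic lattice points
    mus = np.empty(R)
    for r in range(R):
        u = rng.random(base.shape[1])          # one uniform shift per replicate
        mus[r] = f((base + u) % 1.0).mean()    # shift modulo 1, then average f
    mu_hat = mus.mean()
    var_hat = mus.var(ddof=1) / R              # unbiased variance of mu_hat
    return mu_hat, var_hat

f = lambda x: np.cos(2 * np.pi * x).sum(axis=1) ** 2   # example, true mu = 1
print(rqmc_estimate(f, 377, (1, 233)))
```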
• 30. Rotation modulo 1
[Figure: a Cranley-Patterson rotation; panels: Before, Shift, After.]
Shift the points by $u \sim U[0,1)^s$ with wraparound: $x_i \to x_i + u \pmod 1$.
Commonly used on lattice rules. It can also be used with nets; at least it removes $x_1 = 0$.
• 31. Digit scrambling
1) Chop the space into $b$ slabs. Shuffle them.
2) Repeat within each of the $b$ slabs.
3) Then within the $b^2$ sub-slabs.
4) Ad infinitum: $b^3, b^4, \dots$
5) And the same for all $s$ coordinates.
Each $x_i \sim U[0,1)^s$, and $x_1, \dots, x_n$ is still a net (a.s.). O (1995)
• 32. Digit scrambling
Mathematically it is a permutation operation on the base-$b$ digits: $x_{ij} = \sum_{k=1}^K a_{ijk} b^{-k} \to \tilde x_{ij} = \sum_{k=1}^K \tilde a_{ijk} b^{-k}$. The mapping $a_{ijk} \to \tilde a_{ijk} = \pi_{jk}(a_{ijk})$ preserves the net property. In the previous 'nested' scramble, $\pi_{jk}$ also depends on $a_{ij1}, \dots, a_{ij,k-1}$.
Simpler scrambles: random linear permutations $a \to g + h \times a \bmod p$ (prime $p = b$), with $g \sim U\{0, 1, \dots, b-1\}$ and $h \sim U\{1, 2, \dots, b-1\}$. Matousek (1998) has nested linear permutations.
Digital shift: $a_{ijk} \to a_{ijk} + g_{jk} \pmod p$. For $b = 2$ this is $x_i \to \tilde x_i = x_i \oplus u$, a bitwise XOR.
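The base-2 digital shift is especially cheap. A minimal sketch (mine) that XORs one random $K$-bit word into every point:

```python
# Base-2 digital shift via integer XOR on K-bit digit vectors.
import numpy as np

def digital_shift_b2(x, rng, K=32):
    """XOR a single random K-bit shift into all points x in [0,1)."""
    scale = float(2 ** K)
    xi = (x * scale).astype(np.uint64)          # base-2 digits as an integer
    u = rng.integers(0, 2 ** K, dtype=np.uint64)
    return (xi ^ u) / scale                     # x_i XOR u, back to [0,1)

rng = np.random.default_rng(3)
x = np.array([0.0, 0.5, 0.25, 0.75])            # a tiny van der Corput set
print(digital_shift_b2(x, rng))
```

Each shifted point is individually $U[0,1)$ (up to the $K$-bit truncation), while the net structure is preserved.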
• 33. Example scrambles
Two components of the first 530 points of a Faure (0, 53)-net in base 53.
[Figure: randomized Faure points; panels: Digital shift, Random linear, Nested uniform.]
The digital shift is much like a Cranley-Patterson rotation. It uses just one random $u$ for all points: $\tilde x_i = x_i \oplus u$.
Random linear (Matousek 1998) and nested uniform (O 1995) scrambles have the same $\mathrm{Var}(\hat\mu)$.
• 34. Unscrambled Faure
First $n = 11^2 = 121$ points of the Faure (0, 11)-net in $[0,1]^{11}$.
[Figure: two projections of the 121 Faure points onto linear combinations of coordinates, $2x_2 + x_3 - x_6 + x_8 - 2x_{10} - x_{11}$ vs $x_3 - x_6 - x_8 + x_{11}$, and $x_4 + x_6 - x_{10} - x_{11}$ vs $x_1 - x_2 - x_8 + x_9$.]
Unscrambled points are very structured.
• 35. Sobol' projections
Using code from Bratley & Fox (1988). Implementations differ based on choices of 'direction numbers'.
[Figure: three projections of 1024 Sobol' points: $x_2$ vs $x_1$, $x_{38}$ vs $x_{37}$, $x_{26}$ vs $x_{31}$.]
At some larger sample size the empty blocks in $x_{26}$ vs $x_{31}$ fill in. Expect empty blocks in higher-order margins to last longer than those in low-dimensional margins.
• 36. Scrambled net properties
Using $\sigma^2 = \int (f(x) - \mu)^2\,\mathrm{d}x$:
• If $f \in L^2$, then $\mathrm{Var}(\hat\mu) = o(1/n)$, even if $V_{HK}(f) = \infty$.
• If $f \in L^2$, then $\mathrm{Var}(\hat\mu) \le \Gamma_{t,b,s}\,\sigma^2/n$; for $t = 0$, $\Gamma \le \exp(1) \doteq 2.718$.
• If $\partial^{1,2,\dots,s} f \in L^2$, then $\mathrm{Var}(\hat\mu) = O(\log(n)^{s-1}/n^3)$. O (1997, 2008)
$\Gamma < \infty$ rules out a $(\log n)^{s-1}$ catastrophe at finite $n$. Loh (2003) has a CLT for $t = 0$ (and fully scrambled points).
Geometrically: scrambling Faure breaks up the diagonal striping of the nets; scrambling Sobol' points moves the full / empty blocks around.
Improved rate: the RMSE is $O(n^{-1/2})$ better than the QMC rate (cancellation). This holds for nested uniform and nested linear scrambles.
• 37. Scrambling vs shifting
Consider $n = 2^m$ points in $[0,1)$.
QMC: van der Corput points $(i-1)/n$ for $i = 1, \dots, n$:
• · · · · · · · · · |• · · · · · · · · ·|• · · · · · · · · ·|• · · · · · · · · ·|
Shift: shift all points by $U \sim U(0,1)$ with wraparound; get one point in each $[(i-1)/n, i/n)$:
· · · • · · · · · · |· · · • · · · · · ·|· · · • · · · · · ·|· · · • · · · · · ·|
Scramble: get a stratified sample, independent $x_i \sim U[(i-1)/n, i/n)$:
· · · • · · · · · · |· · · · · · · · • ·|· · · · · • · · · ·|· · • · · · · · · ·|
Random errors cancel, yielding an $O(n^{-1/2})$ improvement. (A numerical check follows.)
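A quick numerical check of this picture in $d = 1$ (my own experiment; the integrand $e^x$ is an arbitrary smooth choice). Both randomizations are unbiased, but stratification lets errors cancel, so the scrambled RMSE should shrink roughly like $n^{-3/2}$ versus $n^{-1}$ for shifts:

```python
# Shift vs scramble in d = 1: RMSE over repeated randomizations.
import numpy as np

f = lambda x: np.exp(x)                        # smooth test integrand
mu = np.e - 1.0                                # exact integral on [0,1]
rng = np.random.default_rng(7)

def shift_estimate(n):
    x = (np.arange(n) / n + rng.random()) % 1.0        # rotated equispaced grid
    return f(x).mean()

def scramble_estimate(n):
    x = (np.arange(n) + rng.random(n)) / n             # one U point per stratum
    return f(x).mean()

for n in (16, 64, 256):
    se = [shift_estimate(n) - mu for _ in range(2000)]
    sc = [scramble_estimate(n) - mu for _ in range(2000)]
    print(n, np.sqrt(np.mean(np.square(se))), np.sqrt(np.mean(np.square(sc))))
```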
• 38. Higher order nets
Results from Dick, Baldeaux. Start with a net $z_i \in [0,1)^{2s}$. 'Interleave' the digits of two variables to make a new one: $z_{i,2j} = 0.g_1 g_2 g_3 \cdots$ and $z_{i,2j+1} = 0.h_1 h_2 h_3 \cdots$ give $x_{i,j} = 0.g_1 h_1 g_2 h_2 \cdots$.
The error is $O(n^{-2+\epsilon})$ under increased smoothness: $\frac{\partial^{2s}}{\partial x_1^2 \cdots \partial x_s^2} f$. Scrambling gets RMSE $O(n^{-2-1/2+\epsilon})$.
Even higher: start with $ks$ dimensions and interleave down to $s$. Get $O(n^{-k+\epsilon})$ and $O(n^{-k-1/2+\epsilon})$ (under still higher smoothness).
Upshot: very promising. Cost: many inputs and much smoothness. Starting to be used in PDEs. Tutorial of Frances Kuo.
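The interleave step itself is simple. A base-2 sketch for one pair of coordinates (my own code; $K$ bounds the working precision):

```python
# Digit interleave in base 2: x = 0.g1 h1 g2 h2 ... from two inputs.
def interleave_b2(z_even, z_odd, K=26):
    """Merge z_even = 0.g1 g2 ... and z_odd = 0.h1 h2 ... digit by digit."""
    x = 0.0
    for k in range(1, K + 1):
        g = int(z_even * 2 ** k) & 1            # k-th base-2 digit of z_even
        h = int(z_odd * 2 ** k) & 1             # k-th base-2 digit of z_odd
        x += g * 2.0 ** (1 - 2 * k) + h * 2.0 ** (-2 * k)
    return x

print(interleave_b2(0.5, 0.25))   # digits 1,0,0,1,... -> 0.1001_2 = 0.5625
```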
• 39. The curse of dimension
Curse of dimension: larger $d$ makes integration harder. Consider the smoothness class
$C^r_M = \Big\{ f : [0,1]^d \to \mathbb{R} \;\Big|\; \Big\|\prod_j \frac{\partial^{\alpha_j}}{\partial x_j^{\alpha_j}} f\Big\| \le M, \; \sum_j \alpha_j = r, \; \alpha_j \ge 0 \Big\}$
Bakhvalov I: for any $x_1, \dots, x_n \in [0,1]^d$ there is an $f \in C^r_M$ with $|\hat\mu_n - \mu| \ge k\,n^{-r/d}$. Ordinary QMC is like $r = d$.
Bakhvalov II: random points can't beat RMSE $O(n^{-r/d - 1/2})$. Ordinary MC is like $r = 0$.
• 40. What if we beat those rates?
Sometimes we get high accuracy for large $d$. It does not mean we beat the curse of dimensionality. Bakhvalov never promised universal failure, only the existence of hard cases. We may have just had an easy, non-worst-case function.
Two kinds of easy:
• Truncation: only the first $s \ll d$ components of $x$ matter.
• Superposition: the components only matter "$s$ at a time".
Either way $f$ might not be "fully $d$-dimensional".
• 41. Dimensional decomposition
For $u = \{j_1, j_2, \dots, j_r\} \subseteq 1{:}d \equiv \{1, 2, \dots, d\}$ let $x_u = (x_{j_1}, \dots, x_{j_r})$ and $x_{i,u} = (x_{ij_1}, \dots, x_{ij_r})$.
Via ANOVA or another method, write $f(x) = \sum_{u \subseteq 1:d} f_u(x_u)$. Then
$\hat\mu - \mu = \sum_{u \subseteq 1:d} \Big( \frac1n \sum_{i=1}^n f_u(x_{i,u}) - \int f_u(x_u)\,\mathrm{d}x_u \Big)$, so $|\hat\mu - \mu| \le \sum_{u \subseteq 1:d} D_n^*(x_{1,u}, \dots, x_{n,u}) \times V_{HK}(f_u)$.
Often $D_n^*(x_{i,u}) \ll D_n^*(x_i)$ for small $|u|$. If also $f_u$ is small for large $u$, then all the terms are small.
• 42. Studying the good cases
Two main tools describe the good cases:
• Weighted spaces and tractability
• ANOVA and effective dimension
Implications: neither causes the curse to be lifted. They describe the happy circumstance where the curse did not apply. Both leave important gaps, described below; I'll raise them as an open problem later.
• 43. Weighted spaces
Hickernell (1996), Sloan & Wozniakowski (1998), Dick, Kuo, Novak, Wasilkowski, and many more. See the tutorial by Hickernell.
Let $\partial^u \equiv \prod_{j \in u} \frac{\partial}{\partial x_j}$ and assume $\partial^{1:d} f$ exists. The norm, with weights $\gamma_u > 0$, is
$\|f\|_\gamma^2 = \sum_{u \subseteq 1:d} \frac{1}{\gamma_u} \int_{[0,1]^u} \Big( \int_{[0,1]^{-u}} \partial^u f(x)\,\mathrm{d}x_{-u} \Big)^2 \mathrm{d}x_u$
Function ball: $B_{\gamma,C} = \{ f \mid \|f\|_\gamma \le C \}$. Small $\gamma_u$ implies only small $\partial^u f$ in the ball.
Product weights: $\gamma_u = \prod_{j \in u} \gamma_j$ where the $\gamma_j$ decrease rapidly with $j$. Now $f \in B_{\gamma,C}$ implies $\partial^u f$ is small when $|u|$ is large. Many more choices: Dick, Kuo, Sloan (2013).
• 44. ANOVA and effective dimension
Caflisch, Morokoff & O (1997). ANOVA: $f(x) = \sum_{u \subseteq 1:d} f_u(x)$. Often $f$ is dominated by its low-order interactions; then RQMC may make a huge improvement. Let $\sigma^2_u = \mathrm{Var}(f_u)$ (variance components).
Truncation dimension: the smallest $s \le d$ with $\sum_{u \subseteq 1:s} \sigma^2_u \ge 0.99 \sum_{u \subseteq 1:d} \sigma^2_u$.
Superposition dimension: the smallest $s \le d$ with $\sum_{|u| \le s} \sigma^2_u \ge 0.99 \sum_{u \subseteq 1:d} \sigma^2_u$.
Mean dimension: $\sum_u |u| \sigma^2_u \big/ \sum_u \sigma^2_u$. Liu & O (2006). Easier to estimate, via Sobol' indices. (A sketch follows.)
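The mean dimension is estimable without computing any ANOVA term explicitly. A Monte Carlo sketch (my own code), using the identity $\sum_u |u| \sigma^2_u = \sum_j \bar\tau_j$, where $\bar\tau_j = \sum_{u \ni j} \sigma^2_u$ is the unnormalized total Sobol' index, estimated here by Jansen's pick-freeze formula:

```python
# Mean dimension = sum_j tau_j / sigma^2, tau_j via pick-freeze estimation.
import numpy as np

def mean_dimension(f, d, n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.random((n, d))
    z = rng.random((n, d))
    fx = f(x)
    tau = np.empty(d)
    for j in range(d):
        xj = x.copy()
        xj[:, j] = z[:, j]                          # redraw coordinate j only
        tau[j] = 0.5 * np.mean((fx - f(xj)) ** 2)   # unnormalized total index
    return tau.sum() / fx.var()

# Additive functions have mean dimension 1; products sit higher.
print(mean_dimension(lambda x: x.sum(axis=1), d=6))
print(mean_dimension(lambda x: np.prod(1 + 0.5 * (x - 0.5), axis=1), d=6))
```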
• 45. Open problem
• ANOVA captures the magnitude of the low-dimensional parts but not their smoothness. Even when it verifies that $f$ is dominated by low-dimensional parts, it does not assure a small $|\hat\mu - \mu|$.
• Weighted space models assure accurate estimation, at least asymptotically. However, it is not easy to decide which weighted space to use.
• Given a weighted space, there are algorithms to tune QMC points for it.
• ANOVA approaches may support an approach to choosing good weights for a given problem, building on Sobol' indices (Sobol'++, Saltelli++, Prieur++, Kucherenko++) or active subspaces (Constantine++).
The problems are about how to combine the approaches, and when / whether a resulting adaptive algorithm will be effective.
QMC tutorial 46
Description vs engineering
Effective dimension and weighted spaces both describe settings where $f$ can be integrated well. Some authors try to engineer changes in $f$ to make it more suited to QMC, e.g., cramming importance into the first few components of $x$.
MC vs QMC: MC places lots of effort on variance reduction. For QMC we gain by reducing effective dimension, or Hardy-Krause variation (but there the asymptotics are slow).
E.g., when turning $u \sim U[0,1]^d$ into $z \sim N(0,\Sigma)$, the choice of square root $\Sigma^{1/2}$ affects QMC performance. Caflisch, Morokoff & O (1997), Acworth, Broadie & Glasserman (1998), Imai & Tan (2014). The best $\Sigma^{1/2}$ can depend strongly on $f$. Papageorgiou (2002). A sketch of two common square roots follows.
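A minimal sketch of two standard choices of $\Sigma^{1/2}$ for mapping uniform QMC points to $N(0,\Sigma)$: the Cholesky root and the PCA (eigendecomposition) root. The covariance (Brownian-motion type) and point set are illustrative; both roots give the right distribution, but PCA tends to concentrate variance in the leading, best-equidistributed QMC coordinates.

# Sketch: two square roots of Sigma for z = Sigma^{1/2} Phi^{-1}(u).
# Both give z ~ N(0, Sigma); they differ in how variance is assigned
# to the leading QMC coordinates. Sigma below is an illustrative choice.
import numpy as np
from scipy.stats import norm, qmc

d = 16
t = np.arange(1, d + 1) / d
Sigma = np.minimum.outer(t, t)            # Cov(B(s), B(t)) = min(s, t)

L = np.linalg.cholesky(Sigma)             # Cholesky root
w, V = np.linalg.eigh(Sigma)
P = V[:, ::-1] * np.sqrt(w[::-1])         # PCA root, largest eigenvalue first

u = qmc.Sobol(d=d, scramble=True, seed=0).random_base2(m=10)
g = norm.ppf(u)                           # standard normals from QMC points
z_chol = g @ L.T                          # each row ~ N(0, Sigma)
z_pca = g @ P.T                           # same law, different coordinate use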
QMC tutorial 47
Sampling Brownian motion
Feynman-Kac / Brownian bridge: the first few variables define a 'skeleton'; the rest fill it in.
[Figure: Brownian bridge construction of Brownian motion; B(t) plotted against time t in [0, 1].]
See also Mike Giles++ on multi-level MC.
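A minimal sketch of the Brownian bridge construction: the first Gaussian sets the endpoint and subsequent ones fill in midpoints recursively, with conditional variances shrinking by half at each level, so the leading QMC coordinates carry most of the path's variance. The grid size is an illustrative assumption.

# Sketch: Brownian bridge construction of B(t) on t = k/n, n = 2^m.
# g[0] sets B(1); later g's fill in midpoints with ever smaller variance.
import numpy as np

def brownian_bridge(g, m):
    """Build B on 2^m + 1 grid points from 2^m standard normals g."""
    n = 2**m
    B = np.zeros(n + 1)
    B[n] = g[0]                            # B(1) ~ N(0, 1)
    k, h = 1, n                            # k: next normal; h: interval length
    while h > 1:
        half = h // 2
        for left in range(0, n, h):
            right, mid = left + h, left + half
            mean = 0.5 * (B[left] + B[right])
            sd = np.sqrt(half / (2.0 * n)) # midpoint variance given endpoints
            B[mid] = mean + sd * g[k]
            k += 1
        h = half
    return B

g = np.random.default_rng(0).standard_normal(2**6)
B = brownian_bridge(g, m=6)                # one path on 65 grid points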
QMC tutorial 48
QMC ∩ MCMC
• Replace IID $u_i$ inputs to MCMC by something better.
• Completely uniformly distributed (CUD): like the entire period of a (small) RNG.
• Very early refs:
Chentsov (1967): finite state space; uses 'inversion'; beautiful coupling argument (resembles coupling from the past).
Sobol' (1974): has an $n \times \infty$ array of points $x_{ij} \in [0,1]$; samples from a row until a return to the start state, then goes to the next row. Gets rate $O(1/n)$ ... if the transition probabilities are $a/2^b$ for integers $a, b$.
QMC tutorial 49
QMC ∩ MCMC
For MCMC we follow one particle.
1) Step $x_i \leftarrow \psi(x_{i-1}, v_i)$ for $v_i \in (0,1)^d$.
2) For $v_1 = (u_1,\dots,u_d)$, $v_2 = (u_{d+1},\dots,u_{2d})$, etc., $n$ steps require $u_1,\dots,u_{nd} \in (0,1)$.
3) MC-MC uses $u_i \sim U(0,1)$.
4) MC-QMC uses CUD $u_i$ (a sketch follows).
Reasons for caution:
1) We're using 1 point in $[0,1]^{nd}$ with $n \to \infty$.
2) Our $x_i$ won't be Markovian.
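A minimal sketch of MC-QMC: a random-walk Metropolis chain driven by the entire period of a small full-period LCG as a stand-in CUD stream. The target, proposal scale, and LCG parameters are illustrative assumptions (a = 69069, c = 1, m = 2^16 satisfy the Hull-Dobell conditions, so the period is the full 2^16).

# Sketch: random-walk Metropolis for a N(0,1) target, driven by the full
# period of a small LCG used as a CUD stream (illustrative parameters).
import numpy as np
from scipy.stats import norm

def lcg_full_period(a=69069, c=1, m=2**16, seed=1):
    s = seed
    for _ in range(m):
        s = (a * s + c) % m
        yield (s + 0.5) / m        # centered into (0, 1)

u = lcg_full_period()
x, chain = 0.0, []
for _ in range(2**15):             # each step consumes 2 uniforms
    prop = x + 0.5 * norm.ppf(next(u))                  # proposal
    accept = np.log(next(u)) < 0.5 * (x**2 - prop**2)   # N(0,1) log ratio
    if accept:
        x = prop
    chain.append(x)

print(np.mean(chain))              # estimate of E(x) = 0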
QMC tutorial 50
MCMC ≈ QMC^T

Method   Rows                     Columns
QMC      n points (n → ∞)         d variables (1 ≤ d fixed)
MCMC     r replicates (1 ≤ r)     n steps (n → ∞)

QMC is based on equidistribution; MCMC is based on ergodicity. Growth is down the rows for QMC and across the columns for MCMC, hence the transpose.
QMC tutorial 51
Some results
1) Metropolis on CUD points is consistent in finite state spaces. O & Tribble (2005)
2) Weakly CUD random points are ok. Tribble & O (2008)
3) Consistency in continuous problems (Gibbs and Metropolis). Chen, Dick & O (2010)
4) Chen (2011): rate $O(n^{-1+\epsilon})$ for some smooth ARMAs.
5) Mackey & Gorham find the best results are for $\mu = E(x_j)$.
QMC tutorial 52
Sequential QMC
JRSS-B discussion paper by Chopin & Gerber (2015)
$N$ particles in $\mathbb{R}^d$ for $T$ time steps. Advance $x_{n,t}$ to $x_{n,t+1} = \phi(x_{n,t}, u_{n,t+1})$, each $u \in [0,1)^S$. Do them all with $U_{t+1} \in [0,1)^{N\times S}$, usually IID.
For QMC we want the rows of $U_{t+1}$ to be 'balanced' with respect to the positions $x_t$.
Easy if $d = S = 1$. Lécot & Tuffin (2004): take $U_{t+1} \in [0,1)^{N\times 2}$, sort the first column along $x_{n,t}$, and update the points with the remaining column (a sketch follows).
What to do if $d > 1$? Alignment is tricky because $\mathbb{R}^d$ is not ordered.
Related work by L'Ecuyer, Lécot, L'Archevêque-Gaudet on array-RQMC: $d$-dimensional alignments; huge variance reductions; limited theory.
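A minimal sketch of the $d = S = 1$ alignment step, assuming scrambled Sobol' points from scipy.stats.qmc as the RQMC matrix; the transition (an AR(1)-type move via the normal inverse CDF) is an illustrative assumption.

# Sketch: one SQMC-style step with d = S = 1. Sort particles, match them
# with the sorted first RQMC column, and advance with the second column.
import numpy as np
from scipy.stats import norm, qmc

def sqmc_step(x, seed):
    N = len(x)
    U = qmc.Sobol(d=2, scramble=True, seed=seed).random(N)  # N x 2 RQMC array
    rows = np.empty(N, dtype=int)
    rows[np.argsort(x)] = np.argsort(U[:, 0])
    # the particle with the k-th smallest x gets the row whose first-column
    # entry has rank k, so rows are balanced with respect to position
    u2 = U[rows, 1]                      # remaining column drives the move
    return 0.9 * x + 0.3 * norm.ppf(u2)  # illustrative AR(1) transition

x = np.zeros(64)                         # N = 64 particles, all at 0
for t in range(10):
    x = sqmc_step(x, seed=t)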
QMC tutorial 53
Using Hilbert curves
These are space-filling curves as $m \to \infty$.
[Figure: the first four Hilbert curve refinements, m = 1, 2, 3, 4.]
Chopin & Gerber thread a curve through the $x_{n,t} \in \mathbb{R}^d$, align particles via the first column of RQMC points $U_{t+1} \in [0,1)^{N\times(1+S)}$, and use the remaining $S$ columns to advance them.
Results: the squared error is $o(N^{-1})$. Good empirical results for modest $d$. (A Hilbert-ordering sketch follows.)
NB: I left out lots of details about particle weighting.
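A minimal sketch of Hilbert ordering for $d = 2$, using the classic xy-to-distance mapping (as on the Wikipedia 'Hilbert curve' page); the grid resolution and particle cloud are illustrative assumptions.

# Sketch: order 2-d particles along a Hilbert curve so the d = 1 sorting
# trick generalizes. Grid resolution is an illustrative choice.
import numpy as np

def hilbert_index(n, x, y):
    """Distance along the Hilbert curve on an n x n grid (n a power of 2)."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate/reflect the quadrant
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Map particles in [0,1)^2 to a 2^10 grid and sort by Hilbert distance.
rng = np.random.default_rng(0)
pts = rng.random((64, 2))
n = 2**10
idx = [hilbert_index(n, int(p[0] * n), int(p[1] * n)) for p in pts]
order = np.argsort(idx)                  # particles threaded along the curve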
QMC tutorial 54
Thanks
• NSF support over the years, most recently: DMS-1407397 and DMS-1521145.
• Co-leaders: Hickernell, Kuo, L'Ecuyer
• SAMSI, especially Ilse Ipsen
• Sue McDonald