1. The document surveys methods for obtaining unbiased estimates of the marginal log-likelihood of latent variable models.
2. It discusses Monte Carlo methods such as MCMC and importance sampling for estimating the intractable integral inside the marginal log-likelihood. Multilevel Monte Carlo removes the bias of standard Monte Carlo estimators while requiring fewer samples.
3. The Stochastically Unbiased Marginalization Objective (SUMO) gives an unbiased estimate of the marginal log-likelihood from a single draw: a telescoping series of importance weighted bounds is randomly truncated at a level sampled from a geometric distribution, with each term reweighted accordingly.
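The randomized-truncation idea behind SUMO can be sketched on a toy problem. Everything below is an illustrative assumption, not the paper's setup: a one-dimensional model with prior p(z) = N(0, 1), likelihood p(x|z) = N(z, 1), and proposal q(z) = N(0, 1), so the exact marginal is p(x) = N(x; 0, 2). A geometric truncation distribution is used for simplicity; the SUMO paper uses a heavier-tailed distribution to control variance.

```python
import math
import random

def log_normal(x, mu, var):
    """Log density of N(mu, var) at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def sumo_estimate(x, rng, stop_prob=0.1):
    """One draw of a SUMO-style unbiased estimator of log p(x) for the
    toy model p(z) = N(0,1), p(x|z) = N(z,1), q(z) = N(0,1)."""
    # Russian-roulette truncation level: P(K >= k) = (1 - stop_prob)^k
    K = 0
    while rng.random() > stop_prob:
        K += 1
    # log importance weights log p(x, z) - log q(z) for K + 1 draws z ~ q
    log_w = []
    for _ in range(K + 1):
        z = rng.gauss(0.0, 1.0)
        log_w.append(log_normal(z, 0.0, 1.0) + log_normal(x, z, 1.0)
                     - log_normal(z, 0.0, 1.0))

    def iwae(k):
        # \hat L_k = log (1/k) sum_{i<=k} w_i, via logsumexp for stability
        m = max(log_w[:k])
        return m + math.log(sum(math.exp(lw - m) for lw in log_w[:k]) / k)

    # telescoping series of IWAE bounds, debiased by survival probabilities
    est = iwae(1)
    for k in range(1, K + 1):
        est += (iwae(k + 1) - iwae(k)) / (1.0 - stop_prob) ** k
    return est

rng = random.Random(0)
x = 1.0
draws = [sumo_estimate(x, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
true_logp = log_normal(x, 0.0, 2.0)   # exact log p(x) for the toy model
print(mean, true_logp)
```

Each draw is unbiased for log p(x), so averaging many draws recovers the true value even though any single IWAE bound is biased downward.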
6. Outline
Unbiased estimation of the marginal likelihood
1. Methods that estimate the marginal likelihood itself
• On Multilevel Monte Carlo Unbiased Gradient Estimation for Deep Latent Variable Models (AISTATS 2021)
• Efficient Debiased Evidence Estimation by Multilevel Monte Carlo Sampling (UAI 2021)
2. Methods that estimate the gradient of the marginal likelihood
• Unbiased Contrastive Divergence Algorithm for Training Energy-Based Latent Variable Models (ICLR 2020)
8. IWAE
Importance Weighted Autoencoder
With k = 1 this coincides with the VAE objective, and as k → ∞ the bound becomes tight (equality holds).

log p_θ(x) = log 𝔼_{z^{(1)}, …, z^{(k)} ∼ q(z)} [ (1/k) ∑_{i=1}^{k} p_θ(x, z^{(i)}) / q(z^{(i)}) ]
           ≥ 𝔼_{z^{(1)}, …, z^{(k)} ∼ q(z)} [ log (1/k) ∑_{i=1}^{k} p_θ(x, z^{(i)}) / q(z^{(i)}) ]
           = 𝔼_{z^{(1)}, …, z^{(k)} ∼ q(z)} [ ℒ_k(θ, q) ]
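The tightening of the IWAE bound with k can be checked numerically on a toy model. The setup below is an illustrative assumption, not from the slides: prior p(z) = N(0, 1), likelihood p(x|z) = N(z, 1), proposal q(z) = N(0, 1), so the exact marginal is p(x) = N(x; 0, 2) and ℒ_k can be estimated by plain Monte Carlo over batches of k proposal samples.

```python
import math
import random

def log_normal(x, mu, var):
    """Log density of N(mu, var) at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def iwae_bound(x, k, n_mc=20000, seed=0):
    """Monte Carlo estimate of the IWAE bound L_k for the toy model
    p(z) = N(0,1), p(x|z) = N(z,1), q(z) = N(0,1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_mc):
        log_w = []
        for _ in range(k):
            z = rng.gauss(0.0, 1.0)   # z ~ q(z)
            # log importance weight: log p(x, z) - log q(z)
            log_w.append(log_normal(z, 0.0, 1.0) + log_normal(x, z, 1.0)
                         - log_normal(z, 0.0, 1.0))
        # log (1/k) sum_i exp(log_w_i), computed stably via logsumexp
        m = max(log_w)
        total += m + math.log(sum(math.exp(lw - m) for lw in log_w) / k)
    return total / n_mc

x = 1.0
true_logp = log_normal(x, 0.0, 2.0)   # exact log p(x) for the toy model
b1, b10 = iwae_bound(x, 1), iwae_bound(x, 10)
print(b1, b10, true_logp)
```

With k = 1 the bound is the ordinary ELBO; raising k to 10 visibly closes most of the gap to log p(x), matching the k → ∞ statement on the slide.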
19. Outline
Unbiased estimation of the marginal likelihood
1. Methods that estimate the marginal likelihood itself
• On Multilevel Monte Carlo Unbiased Gradient Estimation for Deep Latent Variable Models (AISTATS 2021)
• Efficient Debiased Evidence Estimation by Multilevel Monte Carlo Sampling (UAI 2021)
2. Methods that estimate the gradient of the marginal likelihood
• Unbiased Contrastive Divergence Algorithm for Training Energy-Based Latent Variable Models (ICLR 2020)