16.11.2011, Patrik Huber
One of our goals: Evaluation of the posterior p(Z|X)
Exact inference
In practice it is often infeasible to evaluate the posterior distribution, or to compute expectations with respect to it:
▪ Dimensionality of the latent space is too high
▪ The posterior distribution has a highly complex form; expectations are not analytically tractable
▪ Integrations may not have analytical solutions
Approximate inference
▪ Deterministic approximation: variational algorithms (last week)
▪ Stochastic approximation: Monte Carlo methods (today)
Pick a number uniformly at random. What’s the probability of hitting the red area?
What’s the probability of a dart thrown uniformly at random hitting the red area?
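The dart analogy is exactly hit-or-miss Monte Carlo: throw uniform samples and count the fraction that land in the region of interest. A minimal sketch, assuming (as an illustrative choice, not from the slides) that the "red area" is the unit disc inside the square [-1, 1]²:

```python
import random

def estimate_hit_probability(n, seed=0):
    """Throw n uniform 'darts' at the square [-1, 1]^2 and count
    how many land inside the unit disc (the 'red area')."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:  # inside the disc?
            hits += 1
    return hits / n

p = estimate_hit_probability(1_000_000)
print(p)  # converges to the area ratio pi/4 as n grows
```

The estimate is just the hit fraction, so its accuracy improves at the usual 1/sqrt(n) Monte Carlo rate regardless of how complicated the region is.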
Monte Carlo methods really took off in the 1940s. The motivation was nuclear research: simulating samples (neutrons) to explore the behavior of neutron chain reactions in nuclear devices.
Stan Ulam and John von Neumann were inspired by the idea of doing this sampling with the newly developed electronic computing techniques (ENIAC).
In the 1950s, the Metropolis sampling algorithm followed.
Conditional probability tables (Bayesian network example):

P(b) = 0.001        P(e) = 0.002

 b  e | P(a)         a | P(j)        a | P(m)
 t  t | .95          t | .90         t | .70
 t  f | .94          f | .05         f | .001
 f  t | .29
 f  f | .001
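These tables support ancestral (forward) sampling: draw each variable in topological order given its parents, then answer queries by counting. A minimal Python sketch using the values above (variable names b, e, a, j, m as in the tables; the estimated query P(j) is my illustrative choice):

```python
import random

# CPT values copied from the tables above
P_B = 0.001
P_E = 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(a | b, e)
P_J = {True: 0.90, False: 0.05}    # P(j | a)
P_M = {True: 0.70, False: 0.001}   # P(m | a)

def ancestral_sample(rng):
    """Sample (b, e, a, j, m) in topological order, each given its parents."""
    b = rng.random() < P_B
    e = rng.random() < P_E
    a = rng.random() < P_A[(b, e)]
    j = rng.random() < P_J[a]
    m = rng.random() < P_M[a]
    return b, e, a, j, m

rng = random.Random(0)
samples = [ancestral_sample(rng) for _ in range(200_000)]
# Monte Carlo estimate of the marginal P(j) from the samples
p_j = sum(s[3] for s in samples) / len(samples)
print(round(p_j, 3))
```

Each sample is an exact draw from the joint distribution, so marginal and conditional queries reduce to counting over the sample set.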
When the (inverse) CDF is known: draw u ~ Uniform(0, 1), then F⁻¹(u) is a sample from the target distribution (inverse transform sampling).
Matlab example
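A minimal sketch of inverse transform sampling (here in Python rather than Matlab; the exponential target distribution is an illustrative assumption, not from the slides):

```python
import math
import random

def sample_exponential(lam, n, seed=0):
    """Inverse transform sampling for the exponential distribution:
    the CDF is F(x) = 1 - exp(-lam*x), so F^{-1}(u) = -ln(1 - u) / lam."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = sample_exponential(lam=2.0, n=100_000)
mean = sum(samples) / len(samples)
print(round(mean, 3))  # sample mean should approach 1/lam = 0.5
```

The method is exact whenever F⁻¹ is available in closed form; when it is not, one falls back on the more general schemes (rejection sampling, MCMC) discussed later.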
Sampling algorithms
▪ Can generate exact results given infinite computational resources (in contrast to variational inference)
▪ Can be computationally demanding
▪ Difficult to know whether a sampling scheme is generating independent samples from the required distribution