Paper Introduction: Adaptive Metropolis Algorithm Using Variational Bayesian Adaptive Kalman Filter


Mbalawata, Isambi S., et al. "Adaptive Metropolis Algorithm Using Variational Bayesian Adaptive Kalman Filter." Computational Statistics & Data Analysis 83 (2015): 101-115.

Presenter: Shuuji Mihara

(Speaker note: compare with conventional methods and state the motivation for choosing this paper.)

Abstract

This paper proposes a new adaptive MCMC algorithm called Variational Bayesian Adaptive Metropolis (VBAM).

The VBAM algorithm updates the proposal covariance matrix using the Variational Bayesian adaptive Kalman filter (VB-AKF).

In the simulated experiments, VBAM performed better than the AM algorithm of Haario et al.

In the real data examples, VBAM produced results consistent with those reported in the literature.

Index

1. Introduction – What Is the Statistical Problem?

2. MCMC

3. Adaptive MCMC Methods

4. VB-AKF

5. VBAM

6. Numerical Experiments

7. Conclusion


What Is the Statistical Problem? (1) – Modeling

Linear Regression:
$\boldsymbol{y} = \boldsymbol{w}^T \boldsymbol{x} + \boldsymbol{\epsilon}$

State Space Model:
$\boldsymbol{x}_t = A \boldsymbol{x}_{t-1} + \boldsymbol{\eta}$
$\boldsymbol{y}_t = H \boldsymbol{x}_t + \boldsymbol{\epsilon}$

What Is the Statistical Problem? (1) – Modeling (cont.)

The unknown parameters of these models (e.g. the transition matrix $A$, the observation matrix $H$, the weights $\boldsymbol{w}$, and the noise covariances) are what we want to estimate.

What Is the Statistical Problem? (2) – Parameter Estimation

Point Estimation
• Maximum Likelihood → EM algorithm
• Maximum a Posteriori (MAP)

Interval Estimation
• Bayesian methods
  – Variational Bayes
  – Markov Chain Monte Carlo

Goal: the posterior distribution of the parameter $\theta$.


MCMC (Markov Chain Monte Carlo Methods)

For many models of practical interest, it is infeasible to evaluate the posterior distribution exactly or to compute expectations under it.

We need an approximation scheme: MCMC.

Metropolis Algorithm

Computing Expectations by MCMC

Computing the expectation
$E[f] = \int f(\theta)\, p(\theta)\, d\theta$
is difficult, so we approximate it with samples $\theta^{(s)}$ drawn by MCMC:
$E[f] \approx \frac{1}{n} \sum_{s=1}^{n} f(\theta^{(s)})$
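As a quick sanity check of the idea above (my own toy example, not from the paper): with $p(\theta) = N(0,1)$ and $f(\theta) = \theta^2$, the exact expectation is 1, and the sample average recovers it.

```python
import random

def mc_expectation(f, sampler, n):
    """Monte Carlo estimate: E[f] ~= (1/n) * sum_s f(theta^(s))."""
    return sum(f(sampler()) for _ in range(n)) / n

random.seed(0)
# Toy example: p(theta) = N(0, 1), f(theta) = theta^2, so E[f] = Var = 1.
est = mc_expectation(lambda t: t * t, lambda: random.gauss(0.0, 1.0), 50_000)
```

Here the samples are drawn directly; MCMC replaces the direct sampler when $p(\theta)$ can only be evaluated up to a constant.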

Metropolis Algorithm

http://visualize-mcmc.appspot.com/2_metropolis.html

1. Initialize $\theta^{(0)}$.
2. For $s = 1, \dots, n$:
   I. Propose $\theta^* \sim N(\theta^{(s-1)}, \Sigma)$ (Gaussian proposal distribution).
   II. Accept with probability $\min\!\left(1, \frac{p(\theta^*)}{p(\theta^{(s-1)})}\right)$: set $\theta^{(s)} = \theta^*$ if accepted, otherwise set $\theta^{(s)} = \theta^{(s-1)}$.
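A minimal random-walk Metropolis sketch (my own illustration, not the paper's code): the target is an unnormalized standard normal and the proposal is a fixed Gaussian, following steps I–II above.

```python
import math
import random

def metropolis(log_p, theta0, prop_std, n):
    """Random-walk Metropolis with a Gaussian proposal.

    Accepts a move with probability min(1, p(prop)/p(theta)),
    done in log space for numerical stability.
    """
    theta = theta0
    samples = []
    for _ in range(n):
        prop = random.gauss(theta, prop_std)            # step I: propose
        if math.log(random.random()) < log_p(prop) - log_p(theta):
            theta = prop                                 # step II: accept
        samples.append(theta)                            # else keep old value
    return samples

random.seed(1)
# Target: unnormalized N(0, 1), i.e. log p(theta) = -theta^2 / 2.
chain = metropolis(lambda t: -0.5 * t * t, 0.0, 1.0, 50_000)
mean = sum(chain) / len(chain)
var = sum((t - mean) ** 2 for t in chain) / len(chain)
```

The chain's mean and variance approach 0 and 1, the moments of the target.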


Adaptive Metropolis (AM) Algorithm [Haario et al.]

The AM algorithm:

1. Initialize $\theta^{(0)}$ and the proposal covariance $C_0$.
2. For $s = 1, \dots, n$:
   I. Propose $\theta^* \sim N(\theta^{(s-1)}, C_{s-1})$ and accept/reject as in the Metropolis algorithm.
   II. Update the proposal covariance using
   $C_s = s_d\, \mathrm{Cov}(\theta^{(0)}, \dots, \theta^{(s)}) + s_d\, \varepsilon I_d$, where $s_d = 2.4^2/d$ and $d$ is the dimension of $\theta$.

The empirical covariance is computed with a recursion formula.
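A 1-dimensional sketch of the AM idea (my own illustration; in 1-D the covariance $C_s$ is a scalar variance and $s_d = 2.4^2$): the chain's empirical variance is maintained by a Welford-style recursion and rescaled into the proposal variance.

```python
import math
import random

random.seed(2)
log_p = lambda t: -0.5 * t * t      # target: unnormalized N(0, 1)
s_d, eps = 2.4 ** 2 / 1.0, 1e-6     # AM scaling for d = 1, regularizer
theta, mean, m2 = 0.0, 0.0, 0.0     # m2 accumulates squared deviations
C = 1.0                             # initial proposal variance C_0
n = 20_000
for k in range(1, n + 1):
    # Step I: Metropolis proposal with the current adapted variance.
    prop = random.gauss(theta, math.sqrt(C))
    if math.log(random.random()) < log_p(prop) - log_p(theta):
        theta = prop
    # Step II: recursive (Welford) update of the chain's mean/variance.
    delta = theta - mean
    mean += delta / k
    m2 += delta * (theta - mean)
    if k > 100:                     # start adapting after a short burn-in
        C = s_d * (m2 / (k - 1)) + s_d * eps
```

For this N(0, 1) target the adapted variance settles near $s_d \cdot 1 \approx 5.76$, the classic optimal random-walk scaling.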

Other adaptive algorithms

• Regeneration-based adaptive algorithm [Gilks et al. 1998]

• Adaptive independent Metropolis–Hastings algorithm [Holden et al. 2009]

• Robust Adaptive Metropolis (RAM) [Vihola 2012]

• MCMC integrated with differential evolution [Vrugt et al. 2009]

etc.


Variational Bayesian Adaptive Metropolis (VBAM) Algorithm

The VBAM algorithm:

1. Initialize $\theta^{(0)}$ and the VB-AKF statistics.
2. For $s = 1, \dots, n$:
   I. Propose and accept/reject as in the Metropolis algorithm.
   II. Update the proposal covariance using the VB-AKF update step.

Kalman Filter (1)

State Space Model:
$\boldsymbol{x}_k \sim N(A_{k-1}\boldsymbol{x}_{k-1},\, Q_{k-1})$
$\boldsymbol{y}_k \sim N(H_k \boldsymbol{x}_k,\, \Sigma_k)$

Kalman Filter (2)

[Figure: true data with Gaussian observation noise, and the state estimate produced by the Kalman filter]

Kalman Filter (3)

Known: $A_k$, $Q_k$, $H_k$, $\Sigma_k$. Objective: $p(\boldsymbol{x}_k \mid \boldsymbol{y}_{1:k})$.

Algorithm: initialize $\boldsymbol{m}_0$, $P_0$; for $k = 1, 2, \dots$ iterate:

Prediction Step (8):
$\boldsymbol{m}_k^- = A_{k-1}\boldsymbol{m}_{k-1}$
$P_k^- = A_{k-1} P_{k-1} A_{k-1}^T + Q_{k-1}$

Update Step (9):
$S_k = H_k P_k^- H_k^T + \Sigma_k$
$K_k = P_k^- H_k^T S_k^{-1}$
$\boldsymbol{m}_k = \boldsymbol{m}_k^- + K_k (\boldsymbol{y}_k - H_k \boldsymbol{m}_k^-)$
$P_k = P_k^- - K_k S_k K_k^T$
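A scalar (1-D) sketch of the prediction/update recursion, with hypothetical values $a = 1$, $q = 0.1$, $h = 1$, $r = 1$ (my own illustration, not the paper's code): note that the error variance $P_k$ converges to a steady state independent of the data.

```python
import random

def kf_step(m, P, y, a=1.0, q=0.1, h=1.0, r=1.0):
    """One Kalman filter step in 1-D (scalars instead of matrices)."""
    # Prediction step (8): m_k^- = a m_{k-1},  P_k^- = a^2 P_{k-1} + q
    m_pred, P_pred = a * m, a * a * P + q
    # Update step (9): S = h^2 P^- + r,  K = P^- h / S
    S = h * h * P_pred + r
    K = P_pred * h / S
    m_new = m_pred + K * (y - h * m_pred)
    P_new = P_pred - K * S * K
    return m_new, P_new

random.seed(3)
m, P, x = 0.0, 1.0, 0.0
for _ in range(200):
    x += random.gauss(0.0, 0.1 ** 0.5)   # simulate x_k = x_{k-1} + eta
    y = x + random.gauss(0.0, 1.0)       # observe  y_k = x_k + eps
    m, P = kf_step(m, P, y)
```

For these values the steady-state $P$ solves $P = (P + q)r/(P + q + r)$, giving roughly 0.27.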

Recursive Least Squares (RLS)

If the state is static ($A_k = I$, $Q_k = 0$), the Kalman filter reduces to Recursive Least Squares.

VB-AKF (Variational Bayes Adaptive Kalman Filter)

Objective: $p(\boldsymbol{x}_k, \Sigma_k \mid \boldsymbol{y}_{1:k})$ — the state dynamics are known, but the measurement noise covariance $\Sigma_k$ is unknown.

Algorithm: initialize; for $k = 1, 2, \dots$:

• Prediction Step
• Update Step: iterate until convergence

Variational Bayes

Computing $p(\boldsymbol{x}_k, \Sigma_k \mid \boldsymbol{y}_{1:k})$ exactly is intractable, so a free-form variational Bayesian approximation is used:

$p(\boldsymbol{x}_k, \Sigma_k \mid \boldsymbol{y}_{1:k}) \approx Q_x(\boldsymbol{x}_k)\, Q_\Sigma(\Sigma_k) = N(\boldsymbol{x}_k \mid \boldsymbol{m}_k, P_k)\, \mathrm{IW}(\Sigma_k \mid v_k, V_k)$

Heuristic dynamics for the covariances

An explicit dynamical model for the noise covariance is hard to construct, so Särkkä and Nummenmaa proposed heuristic dynamics for the inverse-Wishart statistics:

$v_k^- = \rho (v_{k-1} - d - 1) + d + 1$
$V_k^- = B V_{k-1} B^T$ (21)

VB-AKF (Variational Bayes Adaptive Kalman Filter) – recap

Objective: $p(\boldsymbol{x}_k, \Sigma_k \mid \boldsymbol{y}_{1:k})$, with $\Sigma_k$ unknown.

Algorithm: initialize; for $k = 1, 2, \dots$:

• Prediction Step
• Update Step: iterate until convergence
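A 1-D sketch of the VB-AKF recursion (my own illustration, not the paper's code; parameter values are hypothetical): with $d = 1$ the inverse-Wishart statistics are scalars, the heuristic dynamics use $B = \sqrt{\rho}$, and the update step runs a short fixed-point iteration between the Kalman update and the noise statistics.

```python
import random

def vb_akf_step(m, P, v, V, y, a=1.0, q=0.1, h=1.0, rho=0.95, iters=5):
    """One VB-AKF step in 1-D: jointly track the state and noise variance."""
    d = 1
    # Kalman prediction for the state.
    m_pred, P_pred = a * m, a * a * P + q
    # Heuristic dynamics (21): v_k^- = rho(v-d-1)+d+1, V_k^- = B V B (B=sqrt(rho)).
    v_pred = rho * (v - d - 1) + d + 1
    V_pred = rho * V
    # Fixed-point iteration of the VB update step.
    v_new = v_pred + 1
    V_new = V_pred
    for _ in range(iters):
        sigma = V_new / (v_new - d - 1)   # current noise-variance estimate (IW mean)
        S = h * h * P_pred + sigma
        K = P_pred * h / S
        m_new = m_pred + K * (y - h * m_pred)
        P_new = P_pred - K * S * K
        resid = y - h * m_new
        V_new = V_pred + resid * resid + h * h * P_new
    return m_new, P_new, v_new, V_new

random.seed(4)
m, P, v, V = 0.0, 1.0, 5.0, 2.0
x, r_true = 0.0, 0.5
for _ in range(300):
    x += random.gauss(0.0, 0.1 ** 0.5)        # latent random walk
    y = x + random.gauss(0.0, r_true ** 0.5)  # noisy observation
    m, P, v, V = vb_akf_step(m, P, v, V, y)
sigma_est = V / (v - 2)                       # posterior mean of the noise variance
```

The final `sigma_est` tracks the simulated `r_true = 0.5` up to estimation noise; VBAM reuses exactly this kind of update to adapt its proposal covariance.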


Variational Bayesian Adaptive Metropolis (VBAM) Algorithm

The VBAM algorithm:

1. Initialize $\theta^{(0)}$ and the VB-AKF statistics.
2. For $s = 1, \dots, n$:
   I. Propose and accept/reject as in the Metropolis algorithm.
   II. Update the proposal covariance using the VB-AKF update step.

Numerical Experiments (5.1–5.5)

[Result figures for the paper's experiments 5.1–5.5]


Conclusion

This paper proposes a new adaptive MCMC algorithm called Variational Bayesian Adaptive Metropolis (VBAM).

The VBAM algorithm updates the proposal covariance matrix using the Variational Bayesian adaptive Kalman filter (VB-AKF).

In the simulated experiments, VBAM performed better than the AM algorithm of Haario et al.

In the real data examples, VBAM produced results consistent with those reported in the literature.

Discussion

An advantage of the proposed method is that it has more parameters to tune, which gives more freedom.

The computational requirements of the VBAM method are higher than those of the usual AM, but these operations are still quite cheap compared with the rest of the MCMC sampling.

Future Works

• Replace the Kalman filter with non-linear Kalman filters, e.g. the extended Kalman filter or the unscented Kalman filter.

• A (Rao-Blackwellized) particle filter could be used to estimate the noise covariance.

• Compare proposal adaptation methods using different kinds of filters.

State Space Model (1)

[Figure: graphical model of the state space model — a chain of latent variables $x_1, x_2, \dots, x_T$ linked by the system equation, each emitting an observed variable $y_1, y_2, \dots, y_T$ through the observation equation]