Introduction To Particle Filtering: Integrating Bayesian Models and State Space Representations

Sanjay Patil and Ryan Irwin
Intelligent Electronics Systems
Human and Systems Engineering
Center for Advanced Vehicular Systems
URL: www.cavs.msstate.edu/hse/ies/publications/seminars/msstate/2005/particle_filtering/


Page 2 of 22 - Introduction to Particle Filtering

Abstract

• Conventional approaches to speech recognition use:
  • Gaussian mixture models to model spectral variation;
  • hidden Markov models to model temporal variation.

• Particle filtering:
  • is based on a nonlinear state-space representation;
  • does not require Gaussian noise models;
  • can be used for prediction or filtering of a signal;
  • approximates the target probability distribution (e.g., the amplitude of a speech signal);
  • is also known as survival of the fittest, the condensation algorithm, and sequential Monte Carlo filtering.


State-Space Equation and State-Variable Equation

State-Space Equation:

X_{k+1} = F X_k + G u_k
Y_k = H X_k + J u_k

• Parameters required are: F, H, G, X_0, u_0 (input).
• The calculation from X_1 to X_2 to X_n goes through only one stage; the idea is to compute the observations Y_k.
• Most of the time, the states are not hidden from the user / programmer.
• Output term: Y_k (output / not hidden)

State-Variable Equation:

X_{k+1} = F X_k + G V_k
Y_k = H X_k + E_k

• Parameters required are: F, H, G, X_0, p(X_0), noise statistics, and covariance terms; V_k and E_k are noise terms.
• The calculation from X_1 to X_2 to X_n goes through a prediction stage and an update stage, with the observations used to correct the predicted states.
• Usually the states are unknown and hidden, so an indirect method is required to estimate them from the observations.
• Output term: X_k (hidden / unknown)

Both involve matrix algebra and carry the same names with similar meanings.
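As a concrete scalar illustration of the state-variable form, the sketch below simulates X_{k+1} = F·X_k + G·V_k with noisy observations Y_k = H·X_k + E_k. The parameter values and noise levels are illustrative choices, not values from the slides.

```python
import random

random.seed(0)
# Illustrative scalar parameters; F, G, H and the noise levels are
# made-up choices, not values from the slides.
F, G, H = 0.95, 1.0, 1.0

def simulate(n_steps, x0=0.0):
    """Simulate X_{k+1} = F*X_k + G*V_k with Y_k = H*X_k + E_k."""
    states, observations = [], []
    x = x0
    for _ in range(n_steps):
        e = random.gauss(0.0, 0.2)       # observation noise E_k
        observations.append(H * x + e)   # Y_k: what we can measure
        states.append(x)                 # X_k: the hidden state
        v = random.gauss(0.0, 0.1)       # process noise V_k
        x = F * x + G * v                # state transition
    return states, observations

states, observations = simulate(100)
```

A filter sees only `observations` and must infer `states`, which is exactly the indirect estimation problem described above.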


Hidden Markov Model and Nonlinear State-Space Model

Nonlinear State-Space Model:

X_k ~ p(X_k | X_{k-1})
Y_k ~ p(Y_k | X_k)

• Generalization of the HMM.
• The calculation from X_1 to X_2 to X_n goes through a prediction stage and an update stage, with the observations used to correct the predicted states.
• Particles are used to approximate the target distribution (if particle filtering is implemented).
• Output depends on the probabilistic formulation.
• Can involve variable-length observation sequences.

Hidden Markov Model:

X_k ~ p(X_k | X_{k-1})
Y_k ~ p(Y_k | X_k)

• Models the observations with Gaussian mixtures.
• The calculation is based on the forward-backward algorithm for evaluation (scoring).
• A finite number of means and covariances is used to model the target distribution.
• Output depends on the probabilistic formulation.
• Most of the time, HMMs work on uniform-length frames of data.

Both involve Bayes' rule for state computation.
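The forward recursion used for HMM evaluation can be sketched for a toy two-state model with discrete observations; the transition, emission, and initial probabilities below are made-up illustrative numbers.

```python
# Toy 2-state HMM; all probabilities are illustrative made-up numbers.
A = [[0.9, 0.1],           # transition probs p(X_k = s | X_{k-1} = r)
     [0.2, 0.8]]
B = [[0.7, 0.3],           # emission probs p(Y_k = y | X_k = s), 2 symbols
     [0.4, 0.6]]
pi = [0.5, 0.5]            # initial state distribution

def forward(obs):
    """Evaluate p(Y_1, ..., Y_T) with the forward recursion."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(2)]
    for y in obs[1:]:
        alpha = [B[s][y] * sum(alpha[r] * A[r][s] for r in range(2))
                 for s in range(2)]
    return sum(alpha)

likelihood = forward([0, 1, 0, 0])   # probability of the symbol sequence
```

The same recursion is what scoring runs during evaluation; particle filtering replaces the exact sum over states with a Monte Carlo approximation.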


Phase Lock Loop

"A device which continuously tries to track the phase of the incoming signal…"

• Nonlinear feedback system: a phase detector, a low-pass filter h(t), and a voltage-controlled oscillator (VCO) connected in a loop.

[Block diagram: s_i(t) → phase detector → low-pass filter h(t) → s_e(t) → VCO → s_o(t), fed back to the phase detector.]

Input signal: s_i(t) = A_1 sin[ωt + θ_i(t)]; VCO output: s_o(t) = A_2 cos[ωt + θ_o(t)].

The phase detector output is

(k_d A_1 A_2 / 2) sin[θ_i(t) - θ_o(t)] + (k_d A_1 A_2 / 2) sin[2ωt + θ_i(t) + θ_o(t)]

and the low-pass filter removes the double-frequency term, leaving

s_e(t) = (k_d A_1 A_2 / 2) sin[θ_i(t) - θ_o(t)].

In general the VCO phase obeys

dθ_o(t)/dt = AK ∫_0^t h(t - x) sin θ_e(x) dx.

• Consider a first-order PLL, h(t) = δ(t):

dθ_o(t)/dt = AK sin θ_e(t)
dθ_e(t)/dt = dθ_i(t)/dt - AK sin θ_e(t)


Applications

Most of the applications involve tracking.

• Ice hockey game – tracking the players (demo*)
  *Ref.: Kenji Okuma, Ali Taleghani, Nando de Freitas, Jim Little, and David Lowe, "A Boosted Particle Filter: Multitarget Detection and Tracking," 8th European Conference on Computer Vision (ECCV 2004), Prague, Czech Republic. http://www.cs.ubc.ca/~nando/publications.html

At IES, in an NSF-funded project, particle filtering has been used for:

• Time series estimation for the speech signal^
  ^Ref.: M. Gabrea, "Robust adaptive Kalman filtering-based speech enhancement algorithm," ICASSP 2004, vol. 1, pp. I-301 to I-304, May 2004.
  K. Paliwal, "Estimation of noise variance from the noisy AR signal and its application in speech enhancement," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 36, no. 2, pp. 292-294, Feb. 1988.

• Speaker verification – a speech verification algorithm based on the HMM and the particle filtering algorithm.


Time Series Prediction

Implementation:

Problem statement: in the presence of noise, estimate the clean speech signal.

• The order defines the number of previous samples used for prediction.
• The noise calculation is based on the modified Yule-Walker equations.
• y_t – speech amplitude in the presence of noise; x_t – cleaned speech signal.

x_{t+1} = F x_t + G v_t
y_t = H x_t + e_t

[Block diagram: Feature Extraction → Model Estimation → State Prediction → State Update, with the order of prediction feeding model estimation and the number of particles controlling the predict/update stages.]

Part of the figure (ref): www.bioid.com/sdk/docs/About_Preprocessing.htm
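As a sketch of the Yule-Walker idea (the plain equations, not the modified variant the slide refers to), the snippet below fits an AR(2) model to a synthetic signal and makes a one-step prediction. The coefficients and the signal are illustrative, not speech data.

```python
import random

random.seed(1)
a1, a2 = 0.75, -0.5        # true AR(2) coefficients (illustrative)

# Synthetic AR(2) signal: x_t = a1*x_{t-1} + a2*x_{t-2} + v_t
x = [0.0, 0.0]
for _ in range(21_000):
    x.append(a1 * x[-1] + a2 * x[-2] + random.gauss(0.0, 1.0))
x = x[1000:]               # discard the start-up transient

def autocorr(sig, lag):
    n = len(sig) - lag
    return sum(sig[i] * sig[i + lag] for i in range(n)) / n

r0, r1, r2 = autocorr(x, 0), autocorr(x, 1), autocorr(x, 2)

# Yule-Walker equations for AR(2):
#   r1 = a1*r0 + a2*r1
#   r2 = a1*r1 + a2*r0
det = r0 * r0 - r1 * r1
a1_hat = (r1 * r0 - r2 * r1) / det
a2_hat = (r2 * r0 - r1 * r1) / det

# One-step prediction from the last two samples
x_pred = a1_hat * x[-1] + a2_hat * x[-2]
```

The order of prediction in the block diagram corresponds to the AR order chosen here (2), and the estimated coefficients play the role of the model-estimation stage.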


Speaker Verification

Hypothesis:

• Particle filters approximate the probability distribution of a signal.
• If a large number of particles is used, the pdf is approximated better.
• An attempt will be made to use more Gaussian mixtures than the existing system.
• There is a trade-off between the number of passes and the number of particles.

[Block diagram: Feature Extraction → Classifier Decision → Accept / Reject, with the claimed ID, a speaker model, and an imposter model feeding the classifier. Changes will be made in the speaker / imposter model stage.]


Pattern Recognition Applet

• Java applet that gives a visual demonstration of algorithms implemented at IES

• Classification of Signals:
  • PCA – Principal Component Analysis
  • LDA – Linear Discriminant Analysis
  • SVM – Support Vector Machines
  • RVM – Relevance Vector Machines

• Tracking of Signals:
  • LP – Linear Prediction
  • KF – Kalman Filtering
  • PF – Particle Filtering

URL: http://www.cavs.msstate.edu/hse/ies/projects/speech/software/demonstrations/applets/util/pattern_recognition/current/index.html


Classification Algorithms – Best Case

• Data sets need to be differentiated

• Classification distinguishes between the sets of data rather than individual samples

• Algorithms separate data sets with a line of discrimination

• To have zero error the line of discrimination should completely separate the classes

• These patterns are easy to classify


Classification Algorithms – Worst Case

• Toroidal (ring-shaped) data sets are not easily classified with a straight line

• The error is around 50% because only half of each class ends up on the correct side of the line

• A proper line of discrimination for a toroidal set would be a circle enclosing only the inner set

• The toroidal is not common in speech patterns


Classification Algorithms – Realistic Case

• A more realistic case of two mixed distributions using RVM

• This algorithm gives a more complex line of discrimination

• More involved computation for RVM yields better results than LDA and PCA

• Again, LDA, PCA, SVM, and RVM are pattern classification algorithms

• More information given online in tutorials about algorithms


Signal Tracking Algorithms – Kalman Filter

• Predicts the next state of the signal given prior information

• Signals must be time based or drawn from left to right

• X-axis represents time axis

• Algorithms interpolate data ensuring periodic sampling

• Kalman filter is shown here


Signal Tracking Algorithms – Particle Filter

• The model has realistic noise

• Gaussian noise is actually generated at each step

• Noise variances and number of particles can be customized

• Algorithm runs as previously described

1. State prediction stage

2. State update stage

• Each step gives a collection of possible next states of signal

• The collection is represented in the black particles

• Mean value of particles becomes the predicted state
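The two stages above can be sketched as a minimal bootstrap particle filter for a scalar random-walk model. Every parameter value here is an illustrative choice, not part of the demo applet.

```python
import math
import random

random.seed(42)
N = 500                    # number of particles (illustrative)
F = 1.0                    # random-walk transition (illustrative)
Q, R = 0.05, 0.5           # process / observation noise variances (illustrative)

# Synthetic hidden truth and noisy observations
truth, obs = [0.0], []
for _ in range(100):
    truth.append(F * truth[-1] + random.gauss(0.0, Q ** 0.5))
    obs.append(truth[-1] + random.gauss(0.0, R ** 0.5))

particles = [random.gauss(0.0, 1.0) for _ in range(N)]
estimates = []
for y in obs:
    # 1. State prediction stage: push each particle through the model
    particles = [F * p + random.gauss(0.0, Q ** 0.5) for p in particles]
    # 2. State update stage: weight particles by the observation likelihood
    weights = [math.exp(-(y - p) ** 2 / (2 * R)) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample, then take the mean of the particles as the estimate
    particles = random.choices(particles, weights=weights, k=N)
    estimates.append(sum(particles) / N)
```

The resampled cloud corresponds to the black particles in the applet, and the per-step mean is the predicted state drawn on screen.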


Summary

• Particle filtering promises to be one of the leading techniques for nonlinear, non-Gaussian filtering.

• More points to follow


References

• S. Haykin and E. Moulines, "From Kalman to Particle Filters," IEEE International Conference on Acoustics, Speech, and Signal Processing, Philadelphia, Pennsylvania, USA, March 2005.

• M.W. Andrews, "Learning And Inference In Nonlinear State-Space Models," Gatsby Unit for Computational Neuroscience, University College, London, U.K., December 2004.

• P.M. Djuric, J.H. Kotecha, J. Zhang, Y. Huang, T. Ghirmai, M. Bugallo, and J. Miguez, "Particle Filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, pp. 19-38, September 2003.

• M.S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, February 2002.

• R. van der Merwe, N. de Freitas, A. Doucet, and E. Wan, "The Unscented Particle Filter," Technical Report CUED/F-INFENG/TR 380, Cambridge University Engineering Department, Cambridge, U.K., August 2000.

• S. Gannot and M. Moonen, "On the Application of the Unscented Kalman Filter to Speech Processing," International Workshop on Acoustic Echo and Noise Control, Kyoto, Japan, pp. 27-30, September 2003.

• J.P. Norton, and G.V. Veres, "Improvement Of The Particle Filter By Better Choice Of The Predicted Sample Set," 15th IFAC Triennial World Congress, Barcelona, Spain, July 2002.

• J. Vermaak, C. Andrieu, A. Doucet, and S.J. Godsill, "Particle Methods For Bayesian Modeling And Enhancement Of Speech Signals," IEEE Transaction on Speech and Audio Processing, vol 10, no. 3, pp 173-185, March 2002.

• M. Gabrea, “Robust Adaptive Kalman Filtering-based Speech Enhancement Algorithm,” ICASSP 2004, vol 1, pp. I-301-I-304, May 2004.

• K. Paliwal, "Estimation of noise variance from the noisy AR signal and its application in speech enhancement," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 36, no. 2, pp. 292-294, Feb. 1988.


References (for PLL):

• B.P. Lathi, Modern Digital and Analog Communication Systems, Oxford University Press, Second Edition.
• Andrew J. Viterbi, "Phase-Locked Loop Dynamics in the Presence of Noise by Fokker-Planck Techniques," Proceedings of the IEEE, 1963.


References (HMM and particle):

• M. Andrews, "Learning and Inference in Nonlinear State-Space Models," (in preparation).
• V. Digalakis, J. Rohlicek, and M. Ostendorf, "," IEEE Transactions on Speech and Audio Processing, vol. 1, no. 4, pp. 431-434, October 1993.