
Lecture 24: Hidden Markov Models

Department of Electrical Engineering, Princeton University

January 8, 2014

ELE 525: Random Processes in Information Systems

Hisashi Kobayashi


In some applications of a Markov chain model, we may not be able to directly observe the sequence of states, or may not even know the structure of the Markov model or its parameters.

The observable variable may be a probabilistic function of the underlying Markov state. Such a model is called a hidden Markov model (HMM).

In this chapter we study how to estimate states (or a sequence of states) and HMM parameters, such as the state transition probabilities, from observable data.
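
As a concrete illustration, here is a minimal sketch in Python (with NumPy) that generates data from a two-state HMM; the parameter values P, B, and pi below are hypothetical, chosen only for the example. An observer of this system sees only y; the state sequence s stays hidden.

import numpy as np

rng = np.random.default_rng(0)
P  = np.array([[0.9, 0.1],          # hypothetical state transition probabilities P_ij
               [0.2, 0.8]])
B  = np.array([[0.7, 0.3],          # hypothetical output probabilities b_i(y) = B[i, y]
               [0.1, 0.9]])
pi = np.array([0.5, 0.5])           # initial state distribution

T = 10
s = np.empty(T, dtype=int)
y = np.empty(T, dtype=int)
s[0] = rng.choice(2, p=pi)
y[0] = rng.choice(2, p=B[s[0]])
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])   # hidden state evolves as a Markov chain
    y[t] = rng.choice(2, p=B[s[t]])       # output is a probabilistic function of the state
print(y)                                  # only y is observable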


In the figure on the next slide, we show the state transition diagram of a time-homogeneous Markov chain and its trellis diagram.


We denote the state sequence in the period 𝒯 = (1, 2, ..., T) defined in (20.2) by S = (S_1, S_2, ..., S_T).



We assume that the channel output Y_t has the same alphabet as the channel input (the encoder output) O_t, i.e., Y_t and O_t take values in the same set of symbols.


A discrete memoryless channel is called a binary symmetric channel (BSC) if

p(1|0) = p(0|1) = ε,

i.e., the channel flips each input bit with the same crossover probability ε.
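
A minimal simulation sketch of a BSC (the variable names and the value of ε are illustrative only):

import numpy as np

rng = np.random.default_rng(1)
eps = 0.05                              # crossover probability ε
x = rng.integers(0, 2, size=20)         # channel input bits O_t
flips = rng.random(20) < eps            # each bit is flipped independently with probability ε
y = x ^ flips                           # channel output Y_t: p(1|0) = p(0|1) = ε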


The following four basic problems arise when we apply an HMM to characterize a system of interest.


We can write the probability of an observed sequence y = (y_1, y_2, ..., y_T) as a sum over all possible state sequences S:

P(y) = Σ_S P(y | S) P(S),    (20.50)

where the sum contains M^T terms, one for each state sequence.
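
Evaluated literally, (20.50) enumerates every possible state sequence. The following sketch (reusing the illustrative pi, P, B notation from the sampler above, with B[i, y] = b_i(y)) makes the exponential cost explicit:

from itertools import product

def prob_y_direct(y, pi, P, B):
    # Direct enumeration: sum P(y | S) P(S) over all M**T state sequences
    M, T = len(pi), len(y)
    total = 0.0
    for s in product(range(M), repeat=T):             # M**T terms
        p = pi[s[0]] * B[s[0], y[0]]
        for t in range(1, T):
            p *= P[s[t - 1], s[t]] * B[s[t], y[t]]
        total += p
    return total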


By factoring P(y | S) P(S) into per-step transition and output terms and interchanging the order of summation and multiplication, (20.50) can be rearranged into the nested form (20.52), which suggests a computationally efficient procedure, called the forward recursion algorithm or forward algorithm, for calculating (20.50).


An alternative derivation of (20.52) can be made by defining the forward variable

α_t(i) = P(y_1, y_2, ..., y_t, S_t = i).

Then

α_{t+1}(j) = P(y_1, ..., y_t, y_{t+1}, S_{t+1} = j)
           = Σ_i α_t(i) P(y_{t+1}, S_{t+1} = j | S_t = i)
           = Σ_i α_t(i) P_{ij} b_j(y_{t+1}).    (20.55)

The 2nd line follows since {S_t} is a Markov chain, so (S_{t+1}, Y_{t+1}) depends on the past only through S_t. The 3rd line follows because the HMM output Y_{t+1} depends only on the current state S_{t+1}, so P(y_{t+1}, S_{t+1} = j | S_t = i) = P_{ij} b_j(y_{t+1}).


Define the forward vector variable as a row vector:

α_t = (α_t(1), α_t(2), ..., α_t(M)).

Then (20.55) can be written as

α_{t+1} = α_t P B(y_{t+1}),

where P = [P_{ij}] is the state transition probability matrix and B(y) = diag(b_1(y), ..., b_M(y)). The computational complexity is merely O(M^2 T), in contrast to the O(T M^T) of the direct enumeration method based on (20.50).
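
In code, the matrix form of the recursion is one vector-matrix product per step; a sketch under the same illustrative notation:

import numpy as np

def prob_y_forward(y, pi, P, B):
    # Forward algorithm: alpha_{t+1} = alpha_t P diag(b(y_{t+1}))
    alpha = pi * B[:, y[0]]               # alpha_1(i) = pi_i b_i(y_1)
    for obs in y[1:]:
        alpha = (alpha @ P) * B[:, obs]   # O(M^2) work per step
    return alpha.sum()                    # P(y) = sum_i alpha_T(i)

For long sequences the α_t underflow numerically, so practical implementations rescale α_t at each step (or work with logarithms) without changing the recursion itself.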


Consider the time-reversed version of the above procedure by rearranging (20.52) so that the summations are evaluated backward in time, from t = T down to t = 1.

We define the backward variable by

β_t(i) = P(y_{t+1}, y_{t+2}, ..., y_T | S_t = i),  with β_T(i) = 1 for all i.

Define the backward vector variable as a column vector:

β_t = (β_t(1), β_t(2), ..., β_t(M))⊤.

Then we obtain the backward recursion formula:

β_t = P B(y_{t+1}) β_{t+1},  t = T-1, T-2, ..., 1.
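
A matching sketch of the backward pass, storing every β_t for later use (same illustrative notation; betas[t] holds the β at time t+1 in the slide's 1-indexed time):

import numpy as np

def backward_pass(y, P, B):
    # Backward recursion: beta_t = P diag(b(y_{t+1})) beta_{t+1}, with beta_T = 1
    M = P.shape[0]
    beta = np.ones(M)                     # beta_T(i) = 1 for all i
    betas = [beta]
    for obs in reversed(y[1:]):           # y_{t+1} for t = T-1, ..., 1
        beta = P @ (B[:, obs] * beta)
        betas.append(beta)
    return betas[::-1]                    # earliest beta first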


Problem 20.11 solution:


Combining the forward and backward variables gives, for every t,

P(y, S_t = i) = α_t(i) β_t(i),

so the a posteriori state probabilities P(S_t = i | y) are obtained from one forward pass and one backward pass. The computational algorithm based on this recursion is called the forward-backward algorithm (FBA). (Problem 20.13)
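
Putting the two passes together yields the smoothed state probabilities; a sketch that reuses backward_pass from above (all names illustrative):

import numpy as np

def state_posteriors(y, pi, P, B):
    # gamma[t, i] = P(S_t = i | y) = alpha_t(i) beta_t(i) / P(y)
    alphas = [pi * B[:, y[0]]]
    for obs in y[1:]:
        alphas.append((alphas[-1] @ P) * B[:, obs])
    betas = backward_pass(y, P, B)
    gamma = np.array([a * b for a, b in zip(alphas, betas)])
    return gamma / gamma.sum(axis=1, keepdims=True)   # row t is P(S_t = . | y); each row sums to 1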


Problem 20.13 Solution:
