HIDDEN MARKOV CHAINS

Prof. Alagar Rangan
Dept of Industrial Engineering
Eastern Mediterranean University
North Cyprus

Source: Probability Models, Sheldon M. Ross

Page 2:

MARKOV CHAINS

Toss a coin repeatedly.

Denote Head = 1, Tail = 0.

Let $Y_n$ = outcome of the nth toss,

$P(Y_n = 1) = p, \qquad P(Y_n = 0) = q.$

$Y_1, Y_2, \ldots$ are iid random variables.

Page 3:

$S_n = Y_1 + Y_2 + \cdots + Y_n$

$S_n$ is the accumulated number of Heads in the first n trials.

$\{S_n\}$ ~ Markov chain;

Time $n = 0, 1, 2, \ldots$
States $j = 0, 1, 2, \ldots$

$S_{n+1} = S_n + Y_{n+1}$

$P(S_{n+1} = j + 1 \mid S_n = j) = p$
$P(S_{n+1} = j \mid S_n = j) = q$
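As a quick illustration (a sketch, not part of the slides), the chain $\{S_n\}$ can be simulated directly from its transition rule; the value p = 0.6 below is only a hypothetical example.

import random

def simulate_heads_count(p, n_tosses, seed=1):
    """Simulate S_0, S_1, ..., S_n, the running number of Heads."""
    rng = random.Random(seed)
    path = [0]                                   # S_0 = 0
    for _ in range(n_tosses):
        y = 1 if rng.random() < p else 0         # Y ~ Bernoulli(p)
        path.append(path[-1] + y)                # S_{n+1} = S_n + Y_{n+1}
    return path

print(simulate_heads_count(p=0.6, n_tosses=10))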

Page 4:

$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_1 = i_1) = P(X_{n+1} = j \mid X_n = i) = p_{ij}$ (say)

$X_n$ ~ Markov Chain

One step Transition Probability Matrix (rows and columns indexed by the states 0, 1, 2, ...):

$$P = \begin{pmatrix} p_{00} & p_{01} & p_{02} & \cdots \\ p_{10} & p_{11} & p_{12} & \cdots \\ p_{20} & p_{21} & p_{22} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$

Page 5:

n-step Transition Probabilities

$p_{ij}^{(n)} = P(X_{n+m} = j \mid X_m = i)$

The corresponding matrix

$$P^{(n)} = \begin{pmatrix} p_{00}^{(n)} & p_{01}^{(n)} & \cdots \\ p_{10}^{(n)} & p_{11}^{(n)} & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}$$

Simple Results:
(a) $P^{(n)} = P^n$
(b) Expected sojourn time in a state
(c) Steady state probabilities $\pi_j,\ j = 0, 1, \ldots$, satisfying $\pi = \pi P$ and $\sum_j \pi_j = 1$

Page 7:

Examples:

Weather Forecasting
States: Dry day, Wet day; $X_n$ = state of the nth day.

$$P = \begin{pmatrix} .8 & .2 \\ .4 & .6 \end{pmatrix} \quad \text{(rows and columns: Dry, Wet)}$$

Communication System
States: signals 0, 1; $X_n$ = signal leaving the nth stage of the system.

$$P = \begin{pmatrix} p & q \\ q & p \end{pmatrix} \quad \text{(rows and columns: 0, 1)}$$

Moods of a Professor
States: cheerful (C), ok (O), unhappy (U); $X_n$ = mood of the Professor on the nth day.

$$P = \begin{pmatrix} .5 & .4 & .1 \\ .3 & .4 & .3 \\ .2 & .3 & .5 \end{pmatrix} \quad \text{(rows and columns: C, O, U)}$$
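To connect the weather chain above with the earlier results $P^{(n)} = P^n$ and $\pi = \pi P$, here is a minimal numerical sketch (not from the slides); it assumes nothing beyond the 2x2 matrix just given.

import numpy as np

# One-step transition matrix of the weather chain (rows/columns: Dry, Wet).
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# n-step transition probabilities: P^(n) = P^n.
print(np.linalg.matrix_power(P, 10))

# Steady-state probabilities: pi = pi P with sum(pi) = 1, obtained here as the
# normalized left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi /= pi.sum()
print(pi)   # approximately [2/3, 1/3]: long-run fractions of Dry and Wet days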

Page 8:

Hidden Markov Chain Models

Let $X_n$ be a Markov chain with one step Transition Probability Matrix $P = [p_{ij}]$.

Let S be a set of signals. A signal from S is emitted each time the Markov chain enters a state.

If the Markov chain enters state j, then the signal s is emitted with probability $p(s \mid j)$, with $\sum_{s \in S} p(s \mid j) = 1$:

$P(S_1 = s \mid X_1 = j) = p(s \mid j)$

$P(S_n = s \mid X_1, S_1, \ldots, X_{n-1}, S_{n-1}, X_n = j) = p(s \mid j)$

Page 9:

The above model, in which the sequence of signals $S_1, S_2, \ldots$ is observed while the sequence of the underlying Markov chain states $X_1, X_2, \ldots$ is unobserved, is called a hidden Markov chain model.

[Figure: at each time step the chain moves from state i to state j with transition probability $p_{ij}$, and each state emits a signal; state i emits signal t with probability $p(t \mid i)$ and state j emits signal s with probability $p(s \mid j)$.]

Page 10:

Examples:

Production Process

States: Good state (1), Poor state (2). Signals: each item produced is of acceptable (a) or unacceptable (u) quality.

Good state (1): acceptable with probability .99, unacceptable with probability .01.
Poor state (2): acceptable with probability .96, unacceptable with probability .04.

$$P = \begin{pmatrix} .9 & .1 \\ 0 & 1 \end{pmatrix} \quad \text{(rows and columns: states 1, 2)}$$

Page 11:

Moods of the Professor
State: mood of the Professor (C, O, U). Signal: grades awarded (high, average, ...).

Condition of a Patient subject to Therapy
State: patient improving or deteriorating. Signal: red cell count high or low.

Signal Processing
State: signal sent (0 or 1). Signal: signal received as 0 or received as 1.

Page 12:

Let $\mathbf{S}^n = (S_1, S_2, \ldots, S_n)$ be the random vector of the first n signals.

For a fixed sequence of signals $\mathbf{s}^n = (s_1, s_2, \ldots, s_n)$, let

$F_n(j) = P(\mathbf{S}^n = \mathbf{s}^n, X_n = j), \qquad n = 1, 2, 3, \ldots$

It can be shown that

$F_n(j) = p(s_n \mid j) \sum_i F_{n-1}(i)\, p_{ij} \qquad (1)$

Now starting with

$F_1(i) = P(X_1 = i, S_1 = s_1) = P(X_1 = i)\, p(s_1 \mid i) \qquad (2)$
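A short sketch of the recursion (1)-(2) above; the names forward, P, emit and init and the 0-based state indexing are illustrative choices, not notation from the slides (emit[j][s] plays the role of $p(s \mid j)$ and init[j] of $P(X_1 = j)$).

def forward(P, emit, init, signals):
    """Return F_n = (F_n(0), ..., F_n(N-1)) for the observed signal sequence."""
    n_states = len(init)
    # (2): F_1(i) = P(X_1 = i) * p(s_1 | i)
    F = [init[i] * emit[i][signals[0]] for i in range(n_states)]
    for s in signals[1:]:
        # (1): F_n(j) = p(s_n | j) * sum_i F_{n-1}(i) * p_ij
        F = [emit[j][s] * sum(F[i] * P[i][j] for i in range(n_states))
             for j in range(n_states)]
    return F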

Page 13:

We can recursively determine $F_2(i), F_3(i), \ldots$ up to $F_n(i)$ using (1), starting from (2), which will determine $P(\mathbf{S}^n = \mathbf{s}^n)$.

Note

$P(\mathbf{S}^n = \mathbf{s}^n) = \sum_j F_n(j)$

We can also compute the above using backward recursion, using

$B_k(i) = P(S_{k+1} = s_{k+1}, \ldots, S_n = s_n \mid X_k = i)$
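The slide stops at the definition of $B_k(i)$; the standard recursion it satisfies, $B_n(i) = 1$ and $B_k(i) = \sum_j p_{ij}\, p(s_{k+1} \mid j)\, B_{k+1}(j)$, is supplied here rather than taken from the slide. A sketch with the same illustrative names as before:

def backward(P, emit, signals):
    """Return B_1 = (B_1(0), ..., B_1(N-1)) for the observed signal sequence."""
    n_states = len(P)
    B = [1.0] * n_states                          # B_n(i) = 1
    for s in reversed(signals[1:]):               # s_n, s_{n-1}, ..., s_2
        # B_k(i) = sum_j p_ij * p(s_{k+1} | j) * B_{k+1}(j)
        B = [sum(P[i][j] * emit[j][s] * B[j] for j in range(n_states))
             for i in range(n_states)]
    return B

# Consistency check: P(S^n = s^n) = sum_i P(X_1 = i) p(s_1 | i) B_1(i) = sum_j F_n(j).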

Page 14:

Example:

Let

$p(a \mid 1) = .99, \quad p(u \mid 1) = .01, \quad p(a \mid 2) = .96, \quad p(u \mid 2) = .04,$

$$P = \begin{pmatrix} .9 & .1 \\ 0 & 1 \end{pmatrix}$$

Let $P(X_1 = 1) = .8$ and let the first 3 items produced be a, u, a, so that $s_1 = a,\ s_2 = u,\ s_3 = a$. Then

$F_1(1) = P(X_1 = 1)\, p(a \mid 1) = (.8)(.99) = .792$

$F_1(2) = P(X_1 = 2)\, p(a \mid 2) = (.2)(.96) = .192$

Page 15:

Similarly, calculating $F_2(i)$ and $F_3(i)$ using (1) gives

$P(X_3 = 1 \mid \mathbf{S}^3 = (a, u, a)) = \dfrac{F_3(1)}{F_3(1) + F_3(2)} = .364$
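The numbers above can be checked by applying the recursion (1) directly; the code below is only an illustrative sketch reproducing the .792, .192 and .364 values.

P = [[0.9, 0.1],
     [0.0, 1.0]]                        # states 1 (good) and 2 (poor), indexed 0 and 1
emit = [{'a': 0.99, 'u': 0.01},         # p(a | 1), p(u | 1)
        {'a': 0.96, 'u': 0.04}]         # p(a | 2), p(u | 2)
init = [0.8, 0.2]                       # P(X_1 = 1) = .8

F = [init[i] * emit[i]['a'] for i in range(2)]      # F_1 = (.792, .192)
for s in ['u', 'a']:                                # remaining signals s_2, s_3
    F = [emit[j][s] * sum(F[i] * P[i][j] for i in range(2)) for j in range(2)]

print(F[0] / (F[0] + F[1]))             # P(X_3 = 1 | a, u, a) ~ 0.364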

Predicting the states

Suppose the first n observed signals are $\mathbf{s}^n = (s_1, \ldots, s_n)$. We wish to predict the first n states of the Markov chain using these data.

Page 16:

Case 1

We wish to maximize the expected number of states that are correctly predicted.

For each k = 1, 2, ..., n, we calculate $P(X_k = j \mid \mathbf{S}^n = \mathbf{s}^n)$ and choose that j which maximizes the above as the predictor of $X_k$.
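One standard way to evaluate $P(X_k = j \mid \mathbf{S}^n = \mathbf{s}^n)$ combines the forward and backward quantities, using the identity $P(X_k = j \mid \mathbf{S}^n = \mathbf{s}^n) = F_k(j) B_k(j) / P(\mathbf{S}^n = \mathbf{s}^n)$; the identity is standard but not spelled out on the slides. A sketch, with the same illustrative names as before:

def state_posteriors(P, emit, init, signals):
    """Return the matrix [P(X_k = j | S^n = s^n)] for k = 1..n and all states j."""
    n, n_states = len(signals), len(init)
    # Forward pass: F[k] holds (F_{k+1}(0), ..., F_{k+1}(N-1)).
    F = [[init[i] * emit[i][signals[0]] for i in range(n_states)]]
    for s in signals[1:]:
        F.append([emit[j][s] * sum(F[-1][i] * P[i][j] for i in range(n_states))
                  for j in range(n_states)])
    # Backward pass, built from B_n(i) = 1 down to B_1(i).
    B = [[1.0] * n_states]
    for s in reversed(signals[1:]):
        B.append([sum(P[i][j] * emit[j][s] * B[-1][j] for j in range(n_states))
                  for i in range(n_states)])
    B.reverse()
    prob_signals = sum(F[-1])                     # P(S^n = s^n) = sum_j F_n(j)
    return [[F[k][j] * B[k][j] / prob_signals for j in range(n_states)]
            for k in range(n)]

# The Case 1 predictor of X_k is the j maximizing row k of this matrix.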

Page 17:

Case 2

A different problem arises if we regard the sequence of states as a single entity. For instance, in signal processing, while $X_1, X_2, \ldots, X_n$ may be the actual message sent, $S_1, S_2, \ldots, S_n$ would be what is received. Thus the objective is to predict the actual message in its entirety.

Page 18:

Let $\mathbf{X}^k = (X_1, X_2, \ldots, X_k)$. Our problem is to find the sequence of states $i_1, i_2, \ldots, i_n$ that maximizes

$P(\mathbf{X}^n = (i_1, \ldots, i_n) \mid \mathbf{S}^n = \mathbf{s}^n) = \dfrac{P(\mathbf{X}^n = (i_1, \ldots, i_n),\ \mathbf{S}^n = \mathbf{s}^n)}{P(\mathbf{S}^n = \mathbf{s}^n)}.$

Since the denominator does not depend on $i_1, \ldots, i_n$, this is the same as maximizing the joint probability in the numerator.

To solve the above we let

$V_k(j) = \max_{i_1, \ldots, i_{k-1}} P(\mathbf{X}^{k-1} = (i_1, \ldots, i_{k-1}),\ X_k = j,\ \mathbf{S}^k = \mathbf{s}^k)$

Page 19:

We can show using probabilistic arguments that

$V_k(j) = p(s_k \mid j)\, \max_i p_{ij} V_{k-1}(i)$

Starting with

$V_1(j) = P(X_1 = j, S_1 = s_1) = P(X_1 = j)\, p(s_1 \mid j),$

we can recursively determine $V_n(j)$ for each j. This procedure is known as the Viterbi Algorithm.
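A minimal sketch of this recursion, with backpointers added to recover the maximizing state sequence (the backpointer bookkeeping is standard but not written out on the slides); the same illustrative names P, emit and init are used as in the earlier sketches.

def viterbi(P, emit, init, signals):
    """Most probable state sequence (i_1, ..., i_n) given the observed signals."""
    n_states = len(init)
    V = [init[j] * emit[j][signals[0]] for j in range(n_states)]    # V_1(j)
    back = []                                                       # argmax bookkeeping
    for s in signals[1:]:
        # V_k(j) = p(s_k | j) * max_i p_ij * V_{k-1}(i)
        prev = [max(range(n_states), key=lambda i: P[i][j] * V[i])
                for j in range(n_states)]
        V = [emit[j][s] * P[prev[j]][j] * V[prev[j]] for j in range(n_states)]
        back.append(prev)
    # Trace back from the state maximizing V_n(j).
    j = max(range(n_states), key=lambda j: V[j])
    path = [j]
    for prev in reversed(back):
        j = prev[j]
        path.append(j)
    return list(reversed(path))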