§ 4.9 Markov Chains (contd.)
Probability vector: A vector in R^n whose entries are non-negative and add up to 1 is called a probability vector.
Stochastic matrix: This is a square matrix whose columns are probability vectors.
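These two definitions can be checked numerically. A minimal sketch (the function names are my own, and a small tolerance absorbs floating-point error):

```python
import numpy as np

def is_probability_vector(v, tol=1e-9):
    """Entries non-negative and summing to 1."""
    v = np.asarray(v, dtype=float)
    return bool(np.all(v >= -tol) and abs(v.sum() - 1.0) <= tol)

def is_stochastic(P, tol=1e-9):
    """Square matrix whose columns are probability vectors."""
    P = np.asarray(P, dtype=float)
    return (P.shape[0] == P.shape[1]
            and all(is_probability_vector(P[:, j], tol) for j in range(P.shape[1])))

print(is_probability_vector([0.25, 0.75]))        # True
print(is_stochastic([[0.0, 0.8], [1.0, 0.2]]))    # True: each column sums to 1
```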
Markov chain: A Markov chain is an infinite collection of probability vectors x0, x1, x2, ... of size n and an n x n stochastic matrix P such that P x0 = x1, P x1 = x2, ..., P x_i = x_{i+1}, ... (x0 is called the initial state).
Example: take any 2 x 2 stochastic matrix P and any probability vector x0. Then you get a Markov chain x0, P x0, P^2 x0, P^3 x0, P^4 x0, ...
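A sketch of generating the first few states of such a chain; the matrix P and initial state x0 below are illustrative choices, not necessarily the ones in the handwritten example:

```python
import numpy as np

P = np.array([[0.0, 0.8],
              [1.0, 0.2]])   # columns sum to 1, so P is stochastic
x0 = np.array([1.0, 0.0])    # a probability vector

# States of the chain: x0, P x0, P^2 x0, ...
x = x0
for k in range(5):
    print("x%d =" % k, x)
    x = P @ x                # next state
```

Each state remains a probability vector, since multiplying by a stochastic matrix preserves non-negative entries summing to 1.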
Notation: P^n = multiply the matrix P with itself n times.
Steady states: If P is a stochastic matrix (all columns are probability vectors), then a steady state vector for P is a probability vector x (i.e. its entries are non-negative and add up to 1) such that

P x = x.

In other words, x is a steady state vector precisely if it is an eigenvector of P with eigenvalue 1 and is also a probability vector.
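The condition P x = x can be verified directly. In this sketch the helper name is my own and the matrix is an assumed illustrative example:

```python
import numpy as np

def is_steady_state(P, x, tol=1e-9):
    """True if x is a probability vector with P x = x."""
    x = np.asarray(x, dtype=float)
    is_prob = bool(np.all(x >= -tol) and abs(x.sum() - 1.0) <= tol)
    return is_prob and bool(np.allclose(np.asarray(P) @ x, x, atol=tol))

# Illustrative stochastic matrix (assumed values):
P = np.array([[0.0, 0.8],
              [1.0, 0.2]])
print(is_steady_state(P, [4/9, 5/9]))   # True:  P x = x and entries sum to 1
print(is_steady_state(P, [0.5, 0.5]))   # False: P x = (0.4, 0.6) != x
```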
Regular stochastic matrices: A stochastic matrix P is regular if some power of P contains only strictly positive entries.
E.g.: a stochastic matrix can be regular even when not all of its entries are strictly positive, as long as some power of it has all entries strictly positive; and a stochastic matrix all of whose entries are strictly positive is automatically regular (take the first power).
Non-example: P = ( 1  0 )
                 ( 0  1 ).

Then for all n > 0, P^n = ( 1  0 )
                          ( 0  1 ),

since when you multiply the identity matrix by itself you just get back the identity matrix. So no power of P has all strictly positive entries.
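Regularity can be tested by computing successive powers. In this sketch the function name and the cap on the power are my own choices:

```python
import numpy as np

def is_regular(P, max_power=50):
    """True if some power P^k (k <= max_power) has all strictly positive entries."""
    P = np.asarray(P, dtype=float)
    Q = P.copy()
    for _ in range(max_power):
        if np.all(Q > 0):
            return True
        Q = Q @ P      # move on to the next power of P
    return False

print(is_regular(np.array([[0.0, 0.8],
                           [1.0, 0.2]])))  # True: the square already has positive entries
print(is_regular(np.eye(2)))               # False: every power of I is I itself
```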
Converging vectors: A sequence of vectors (of the same size) x1, x2, x3, ..., xk, ... converges to a vector q if, as k → ∞, the entries of xk get arbitrarily close to the corresponding entries of q.
E.g.: Consider the sequence of vectors (1; 0), (1/2; 0), (1/3; 0), ..., (1/k; 0), ... This sequence converges to (0; 0) because 1/k → 0 as k → ∞.
Amazing Theorem: Let P be an n x n regular stochastic matrix. Then
① P has a UNIQUE steady state vector q.
② If x0 is any initial state and x_{k+1} = P x_k for k = 0, 1, 2, ..., then the Markov chain (x_k) converges to q as k → ∞.
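The theorem can be illustrated numerically: starting the chain from several different initial states, every run approaches the same vector. The matrix below is an assumed regular stochastic example:

```python
import numpy as np

# Assumed regular stochastic matrix (illustrative):
P = np.array([[0.0, 0.8],
              [1.0, 0.2]])

def run_chain(P, x0, steps=200):
    """Iterate x_{k+1} = P x_k for the given number of steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = P @ x
    return x

# Three different initial states...
a = run_chain(P, [1.0, 0.0])
b = run_chain(P, [0.0, 1.0])
c = run_chain(P, [0.3, 0.7])
# ...all end up at (approximately) the same steady state vector.
print(a, b, c)
```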
Upshot: The initial state of a Markov chain (x_k) has no bearing on the long-term behavior of the chain, as long as the corresponding matrix P is a regular stochastic matrix.
Exercise: Is P = ( 0  0.8 )
                 ( 1  0.2 )
a regular stochastic matrix? If yes, find its unique steady state vector.
Solution: Step 1: To check whether P is regular stochastic we need to find a power of P whose entries are all positive, or show that no such power exists.

Let's check P^2:

P^2 = ( 0  0.8 ) ( 0  0.8 )
      ( 1  0.2 ) ( 1  0.2 )

    = ( (0)(0) + (0.8)(1)    (0)(0.8) + (0.8)(0.2) )
      ( (1)(0) + (0.2)(1)    (1)(0.8) + (0.2)(0.2) )

    = ( 0.8  0.16 )
      ( 0.2  0.84 ).

All entries of P^2 are strictly positive, so P is regular.
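Assuming the transition matrix reads P = (0  0.8; 1  0.2), the hand computation of P^2 can be double-checked with NumPy:

```python
import numpy as np

P = np.array([[0.0, 0.8],
              [1.0, 0.2]])

# P^2 = P @ P; matrix_power repeats the multiplication for us.
P2 = np.linalg.matrix_power(P, 2)
print(P2)               # approximately [[0.8  0.16]
                        #                [0.2  0.84]]
print(np.all(P2 > 0))   # True -> P is regular
```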
Step 2: To find the steady state vector, we need to find the unique probability vector in the eigenspace E_1.

P - 1·I_2 = ( 0 - 1    0.8     ) = ( -1    0.8 ).
            ( 1        0.2 - 1 )   (  1   -0.8 )

Then E_1 = Nul(P - 1·I_2). Row reducing the augmented linear system:

( -1    0.8 | 0 )  →  ( -1   0.8 | 0 )
(  1   -0.8 | 0 )     (  0    0  | 0 )

Free variable: x2. Basic variable: x1.

So -x1 + 0.8 x2 = 0, i.e. x1 = 0.8 x2, and

E_1 = { ( 0.8 x2 ; x2 ) : -∞ < x2 < ∞ }.

But we need a probability vector in E_1, so 0.8 x2 + x2 = 1.

⇒ x2 (0.8 + 1) = 1 ⇒ x2 = 1/1.8 = 10/18 = 5/9.

Then x1 = 0.8 · (5/9) = 4/9. So the unique steady state vector is ( 4/9 ; 5/9 ).
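The steady state can also be recovered numerically as the eigenvector of P for eigenvalue 1, rescaled so its entries sum to 1 (again assuming P = (0  0.8; 1  0.2)):

```python
import numpy as np

P = np.array([[0.0, 0.8],
              [1.0, 0.2]])

# Eigen-decomposition; pick the eigenvector whose eigenvalue is closest to 1.
w, V = np.linalg.eig(P)
v = V[:, np.argmin(np.abs(w - 1.0))].real

# Rescale so the entries sum to 1, turning it into a probability vector.
q = v / v.sum()
print(q)  # approximately [0.4444 0.5556], i.e. (4/9, 5/9)
```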