CS 70 Su23: Lecture 24 (6B)
Markov Chains II: hitting time, probability of state A before state B
A quick recap
Markov chain (it’s a directed graph), defined by: a set of states (vertices), transition probabilities on the edges (the outgoing probabilities from each state sum to 1), and a starting state (or a distribution over starting states).
Markov chains are memoryless (“amnesic”): the transition probabilities depend only on the current state, not on how you got there.
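As a concrete illustration (a minimal sketch; the states and probabilities here are made up, not from the lecture), one way to represent and simulate such a chain:

```python
import random

# A Markov chain as a dict mapping each state to its outgoing edge
# probabilities. Each row sums to 1. (Numbers are made up.)
chain = {
    "A": {"A": 0.5, "B": 0.5},
    "B": {"A": 0.3, "C": 0.7},
    "C": {"C": 1.0},
}

def step(state):
    """One transition: the next state depends only on the current state."""
    targets, probs = zip(*chain[state].items())
    return random.choices(targets, weights=probs)[0]

# Simulate five steps from A.
s = "A"
for _ in range(5):
    s = step(s)
```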
Hitting time
Given a starting state in a Markov Chain, what is the expected number of steps (transitions) until we reach some other state?
Suppose that we start at A. How many steps, on average, will it take to get to C?
[Diagram: a Markov chain with states A, B, and C]
First-step equations
Random variables are our friends. Let’s define:
T_Y(X) = the number of steps to reach state Y, starting from state X
If you only care about one of the states, you can simplify this (for example): T(X), when the target Y is implied.
We are interested in computing E[T(A)], for some starting state A (to reach C).
You will also see this written as a function (the notes use β).
First-step equations
What is T(C)? (It’s 0: we are already at C.)
Example 1: first-step equations (with numbers)
What is T(A)?
What is T(B)?
[Diagram: states A, B, C with transition probabilities ⅕ (A→A), ⅘ (A→B), ½ (B→A), ½ (B→C); remaining edge labels: 7/10, 3/10]
Example 1: first-step equations (with numbers)
T(A) = 1 + ⅕T(A) + ⅘T(B)
T(B) = 1 + ½T(A) + ½T(C)
= 1 + ½T(A) (since T(C) = 0)
T(A) = 1 + ⅕T(A) + ⅘(1 + ½T(A))
= 1 + ⅕T(A) + ⅘ + ⅖T(A)
= 9/5 + ⅗T(A)
⅖T(A) = 9/5
T(A) = 9/2
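We can double-check this with a tiny linear solve (a sketch, not part of the lecture): the two first-step equations above, in matrix form.

```python
import numpy as np

# Check Example 1 by solving the two equations as a linear system:
#   (1 - 1/5) T(A) - 4/5 T(B) = 1
#        -1/2 T(A) +     T(B) = 1
A = np.array([[1 - 1/5, -4/5],
              [-1/2,     1.0]])
b = np.array([1.0, 1.0])
T_A, T_B = np.linalg.solve(A, b)
print(T_A, T_B)  # 4.5 3.25, matching T(A) = 9/2
```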
First-step equations
In general, the expected number of steps to reach state Y, starting from state X, is:
T(X) = 1 + Σ_Z Pr[X→Z] · T(Z), with T(Y) = 0
In other words, we have to take one step somewhere, and with some probability we end up at a different state, which takes some expected number of steps to get to state Y (if you start where you want to go, it takes 0 steps to get there).
(again, if Y is implied, you can omit it)
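These equations are linear in the unknowns T(·), so they can be solved mechanically. A generic sketch (my own helper, not from the notes):

```python
import numpy as np

# Build and solve the first-step equations
#   T(x) = 1 + sum_y Pr[x -> y] * T(y),  T(target) = 0
# P maps each state to a dict of outgoing probabilities.
def hitting_times(P, target):
    others = [s for s in P if s != target]
    idx = {s: i for i, s in enumerate(others)}
    A = np.eye(len(others))
    b = np.ones(len(others))
    for x in others:
        for y, pr in P[x].items():
            if y != target:  # T(target) = 0 drops out of the sum
                A[idx[x], idx[y]] -= pr
    T = np.linalg.solve(A, b)
    return {target: 0.0, **{s: T[idx[s]] for s in others}}

# Example 1's chain: recovers T(A) = 4.5, T(B) = 3.25.
P = {"A": {"A": 1/5, "B": 4/5}, "B": {"A": 1/2, "C": 1/2}, "C": {"C": 1.0}}
print(hitting_times(P, "C"))
```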
Example 2: more numbers
We want to reach state 1. What is T₁(2)?
T(2) = 1 + ⅓T(1) + ⅔T(3)
= 1 + ⅔T(3) (since T(1) = 0)
T(3) = 1 + ½T(1) + ½T(3)
= 1 + ½T(3)
T(3) = 2
T(2) = 7/3
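Another way to check (a sketch, not from the lecture): simulate the chain and average. Only the transitions out of states 2 and 3 matter here, since the walk stops as soon as it reaches state 1.

```python
import random

# Estimate T(2): walk until we hit state 1, average the step counts.
P = {2: [(1, 1/3), (3, 2/3)],
     3: [(1, 1/2), (3, 1/2)]}

def hitting_time(start, target, trials=100_000):
    total = 0
    for _ in range(trials):
        state, steps = start, 0
        while state != target:
            nxt, w = zip(*P[state])
            state = random.choices(nxt, weights=w)[0]
            steps += 1
        total += steps
    return total / trials

print(hitting_time(2, 1))  # ≈ 7/3 ≈ 2.33
```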
Example 3: Geometric… again
How can we represent a geometric distribution as a Markov Chain?
T(0) = 1 + pT(1) + (1 - p)T(0)
= 1 + T(0) - pT(0) (since T(1) = 0)
pT(0) = 1
T(0) = 1/p
[Diagram: states 0H and 1H; heads (p) moves 0H→1H, tails (1 - p) stays at 0H]
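As a sanity check (a standard geometric-series identity, not from the slides), computing the expectation directly from the geometric distribution gives the same answer:

```latex
\[
  \mathbb{E}[T]
  = \sum_{k=1}^{\infty} k \, p \, (1-p)^{k-1}
  = p \cdot \frac{1}{\bigl(1 - (1-p)\bigr)^{2}}
  = \frac{1}{p}
\]
```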
Example 4: Geometric… again (again)
Now what if we want to flip until we get 2 heads in a row?
[Diagram: states 0H, 1H, 2H; heads (p) advances 0H→1H→2H, tails (1 - p) returns to 0H]
Example 4: Geometric… again (again)
T(0) = 1 + pT(1) + (1 - p)T(0)
T(1) = 1 + pT(2) + (1 - p)T(0)
= 1 + T(0) - pT(0) (since T(2) = 0)
T(0) = 1 + p(1 + T(0) - pT(0)) + (1 - p)T(0)
T(0) = 1 + p + pT(0) - p²T(0) + T(0) - pT(0)
p²T(0) = 1 + p
T(0) = (1 + p)/p²
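A quick symbolic check of that algebra (a sketch using sympy; the variable names are mine, not the lecture’s):

```python
import sympy as sp

# Solve the two first-step equations for Example 4 symbolically.
p = sp.symbols('p', positive=True)
T0, T1 = sp.symbols('T0 T1')

eqs = [
    sp.Eq(T0, 1 + p * T1 + (1 - p) * T0),  # T(0) = 1 + pT(1) + (1 - p)T(0)
    sp.Eq(T1, 1 + p * 0 + (1 - p) * T0),   # T(1), using T(2) = 0
]
sol = sp.solve(eqs, [T0, T1])
print(sp.simplify(sol[T0]))  # (p + 1)/p**2
```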
Return time
We start at state X. What is the expected number of steps until we return to state X?
In other words, we have to take one step somewhere, and with some probability we end up at a different state, which takes some expected number of steps to get back to state X:
R(X) = 1 + Σ_Y Pr[X→Y] · T(Y)
where T(Y) is the expected time to reach X from Y (so T(X) = 0).
Example 5: the numbers from before
We start in state 1. What is R(1)?
R(1) = 1 + ¼T(1) + ½T(2) + ¼T(3)
= 1 + 0 + ½ · 7/3 + ¼ · 2 (using T(2) = 7/3 and T(3) = 2 from Example 2)
= 6/6 + 7/6 + 3/6
= 16/6
= 8/3
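The same Monte Carlo approach as in Example 2 can check this (a sketch; the transitions out of state 1 are the ones from the equation above):

```python
import random

# Estimate R(1): walk from state 1 until we come back to it, and
# average the number of steps over many trials.
P = {1: [(1, 1/4), (2, 1/2), (3, 1/4)],
     2: [(1, 1/3), (3, 2/3)],
     3: [(1, 1/2), (3, 1/2)]}

def return_time(start, trials=100_000):
    total = 0
    for _ in range(trials):
        state, steps = start, 0
        while True:
            nxt, w = zip(*P[state])
            state = random.choices(nxt, weights=w)[0]
            steps += 1
            if state == start:
                break
        total += steps
    return total / trials

print(return_time(1))  # ≈ 8/3 ≈ 2.67
```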
Probability of state A before state B
Given a starting state, what is the probability we reach some state before some other state?
Suppose that we start at A. What is the probability of reaching C before B?
Probability of state A before state B
Let P(X) = the probability that you reach state A before state B from state X (for some specified states A and B).
Example 6: probability of C before B
Returning to our example:
Let’s say we are interested in whether we visit C before B.
What is P(C)? (It’s 1: we are already at C.)
What is P(B)? (It’s 0: we hit B before C.)
Example 6: probability of C before B
Returning to our example:
What is P(A)?
P(A) = Pr[A→A] · P(A) + Pr[A→B] · P(B) + Pr[A→C] · P(C)
= Pr[A→A] · P(A) + Pr[A→C] · P(C) (since P(B) = 0)
Probability of state A before state B
In general, the probability of reaching state A before state B, starting from state X, is:
P(X) = Σ_Y Pr[X→Y] · P(Y), with P(A) = 1 and P(B) = 0
(again, if A and B are implied, you can omit them)
Example 7: gamble responsibly
You are playing a game where you flip a coin that lands heads with probability p. If it lands heads, you win $1. Otherwise, you lose $1. The game ends when you reach $N or run out of money at $0. Starting with $n, what is the probability that you earn $N before you run out of money?
Let’s write out the equations:
P(0) = 0, P(N) = 1
P(n) = p · P(n + 1) + (1 - p) · P(n - 1), for 0 < n < N
With the magic of algebra (for p ≠ ½), we have P(n) = (1 - ((1 - p)/p)^n) / (1 - ((1 - p)/p)^N)
(For a fair coin, p = ½, this simplifies to P(n) = n/N.)
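As a final sanity check (a sketch; p = 0.6 and N = 10 are arbitrary choices of mine), we can solve the first-step equations for P(0), …, P(N) numerically and compare against the closed form:

```python
import numpy as np

# Solve P(0) = 0, P(N) = 1, and
#   P(n) - p*P(n+1) - (1-p)*P(n-1) = 0   for 0 < n < N,
# then compare with the closed-form answer (valid for p != 1/2).
p, N = 0.6, 10
A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0], A[N, N], b[N] = 1.0, 1.0, 1.0   # boundary conditions
for n in range(1, N):
    A[n, n], A[n, n + 1], A[n, n - 1] = 1.0, -p, -(1 - p)
P_vec = np.linalg.solve(A, b)

r = (1 - p) / p
closed = (1 - r ** np.arange(N + 1)) / (1 - r ** N)
print(np.allclose(P_vec, closed))  # True
```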