[Thrun] So now let's return to hidden Markov models.
Those are really the subject of this class.
Let's again use the rainy and sunny example just to keep it simple.
These are the transition probabilities as before.
Let's assume for now that the initial probability of rain is 0.5;
hence, the probability of sun at time 0 is 0.5.
The key modification in going to a hidden Markov model is that this state is actually hidden.
I cannot see whether it's raining or it's sunny.
Instead I get to observe something else.
Suppose I can be happy or grumpy
and happiness or grumpiness is being caused by the weather.
So rain might make me happy or grumpy,
and sunshine makes me happy or grumpy
but with vastly different probabilities.
If it's sunny, I'm just mostly happy, 0.9.
There's a 0.1 chance I might still be grumpy for some other reason.
If it's rainy, I'm only happy with 0.4 probability and with 0.6 I'm grumpy.
In fact, living in California, I can attest that these probabilities are actually not far off.
I love the sun over here.
Suppose I observe that I'm happy on day 1.
A question that we can ask now is what is the so-called posterior probability
for it raining on day 1 and what's the posterior probability for it being sunny on day 1?
What's the probability of rain on day 1 given that I observed that I was happy on day 1?
This is being answered using Bayes rule,
so this is the probability of being happy given that it rains
times the probability that it rains over the probability of being happy.
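Written out symbolically (using R1, S1, and H1 for rain, sun, and happiness on day 1, notation I'm introducing just for this sketch), the Bayes rule computation described here, with the denominator expanded by total probability, is:

```latex
P(R_1 \mid H_1) = \frac{P(H_1 \mid R_1)\,P(R_1)}{P(H_1)},
\qquad
P(H_1) = P(H_1 \mid R_1)\,P(R_1) + P(H_1 \mid S_1)\,P(S_1)
```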
We know the probability of rain on day 1 based on our Markov state transition model.
In fact, let's just calculate it.
The probability of rain on day 1 is the probability it was rainy on day 0
times the self-transition probability from rain to rain,
plus the probability it was sunny on day 0 times the probability that sun led to rain over here.
If you plug in all these numbers, you obtain 0.4,
as you can easily verify.
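As a quick check, the total-probability calculation can be sketched in Python. The transition probabilities P(rain to rain) = 0.6 and P(sun to rain) = 0.2 are assumptions on my part, chosen to be consistent with the 0.4 result, since the transition diagram itself is only shown on screen.

```python
# Prediction step: probability of rain on day 1 by total probability.
p_rain0, p_sun0 = 0.5, 0.5       # initial belief from the lecture

# Assumed transition probabilities (consistent with the 0.4 result):
p_rain_given_rain = 0.6          # rain -> rain self-transition
p_rain_given_sun = 0.2           # sun -> rain transition

# Total probability: 0.5 * 0.6 + 0.5 * 0.2 = 0.4
p_rain1 = p_rain0 * p_rain_given_rain + p_sun0 * p_rain_given_sun
print(p_rain1)
```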
So we know this guy over here is 0.4.
This value over here is 0.4 again, but it's a different 0.4: the emission probability.
The probability of being happy on a rainy day is 0.4.
The denominator resolves to 0.4 times 0.4
plus the corresponding product for sun at time 1,
where the prior is 0.6 and the happiness probability is 0.9.
Putting it all together, the entire expression evaluates to approximately 0.229.
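The whole Bayes rule computation can be sketched in a few lines of Python. The emission probabilities are the ones stated in the lecture; the prior P(rain on day 1) = 0.4 is the result of the total-probability calculation worked out above.

```python
# Posterior P(rain on day 1 | happy on day 1) via Bayes rule.
p_rain1, p_sun1 = 0.4, 0.6       # prior from the prediction step
p_happy_given_rain = 0.4         # emission probabilities from the lecture
p_happy_given_sun = 0.9

# Normalizer by total probability: P(happy) = 0.4*0.4 + 0.9*0.6 = 0.70
p_happy = p_happy_given_rain * p_rain1 + p_happy_given_sun * p_sun1

# Bayes rule: P(rain | happy) = P(happy | rain) * P(rain) / P(happy)
posterior = p_happy_given_rain * p_rain1 / p_happy
print(round(posterior, 3))  # 0.229
```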
Let's interpret the 0.229 in the context of the question we asked.
We know that at time 0 it was raining with half a chance.
If you look at the state transition diagram, it's more likely to be sunny afterwards
because it's more likely to flip from rain to sun than sun to rain.
In fact, we worked out that the probability of rain at a time step later was only 0.4,
so it was 0.6 sunny.
But now that I saw myself being happy, my probability of rain was further lowered
from 0.4 to 0.229.
And the reason the probability went down is that happiness
is much more likely to occur on a sunny day than on a rainy day.
When you work this in using Bayes rule and total probability,
you find that just the fact that I was happy at time 1
makes the belief that it was rainy go down from 0.4 to 0.229.
This is a wonderful example of applying Bayes rule
in this relatively complicated hidden Markov model.