

The Hidden Markov Model


Markov chain
• Useful when we need to compute the probability of a sequence of observable events.

But what if the events are hidden?

A hidden Markov model (HMM) allows us to talk about both observed events (like words that we see in the input) and hidden events (like part-of-speech tags) that we think of as causal factors in our probabilistic model.
The Hidden Markov Model
• A hidden Markov model is a tool for representing probability distributions over sequences of observations.
• It is a statistical model with hidden states.
• A set of states, each of which has a limited number of transitions and emissions.
• Each transition between states has an assigned probability.
• Each model starts from the start state and ends in the end state.
Hidden Markov Model

For example, let E be the evidence (the observed variable) and X be the hidden variable:
X → E (the hidden state X generates the evidence E)
Markov Models
Assume there are three types of weather:

• Rainy
• Sunny
• Foggy

Weather prediction asks: what will the weather be tomorrow, based on observations of the past?

The weather at day n, qn, depends on the known weather of the past days (qn-1, qn-2, ...).
Markov Models
We want to find:

P(qn | qn-1, qn-2, ..., q1)

That is: given the past weather, what is the probability of each possible weather today?

For example, if we knew the weather for the last three days was:
{ sunny, sunny, rainy }

the probability that tomorrow (day 4) would be foggy is:

P(q4 = Foggy | q3 = Rainy, q2 = Sunny, q1 = Sunny)

In a first-order Markov model, the weather tomorrow depends only on today:

P(q4 = Foggy | q3 = Rainy)
Markov Model Assumption

If the weather yesterday was rainy and today is foggy, what is the probability that tomorrow will be sunny? By the first-order Markov assumption, only today's weather (foggy) matters, and from the transition table the answer is:

0.6
The Markov Model

States: S1 = Rainy, S2 = Cloudy, S3 = Sunny

State transition probabilities (from the diagram):

          Rainy  Cloudy  Sunny
Rainy      0.4    0.3    0.3
Cloudy     0.2    0.6    0.2
Sunny      0.1    0.1    0.8

Question: What is the probability that the weather for the next 7 days will be Sun-Sun-Rain-Rain-Sun-Cloudy-Sun, when today is sunny?

P(O | Model) = P(S3, S3, S3, S1, S1, S3, S2, S3 | Model)
= P(S3) · P(S3 | S3) · P(S3 | S3) · P(S1 | S3) · P(S1 | S1) · P(S3 | S1) · P(S2 | S3) · P(S3 | S2)
= π3 · a33 · a33 · a31 · a11 · a13 · a32 · a23
= 1 · (0.8)(0.8)(0.1)(0.4)(0.3)(0.1)(0.2) = 1.536 × 10⁻⁴
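The chain-rule product above can be sketched in Python. The transition matrix is read off the slide's state diagram (states indexed 0 = Rainy, 1 = Cloudy, 2 = Sunny), and the values are consistent with the slide's own product of factors:

```python
# Transition matrix from the state diagram:
# rows = current state, columns = next state (0=Rainy, 1=Cloudy, 2=Sunny).
A = [
    [0.4, 0.3, 0.3],  # from Rainy
    [0.2, 0.6, 0.2],  # from Cloudy
    [0.1, 0.1, 0.8],  # from Sunny
]

def sequence_probability(states, A, pi):
    """P(q1, ..., qT) = pi[q1] * product of A[q_{t-1}][q_t]."""
    p = pi[states[0]]
    for prev, cur in zip(states, states[1:]):
        p *= A[prev][cur]
    return p

# Today is sunny (pi[Sunny] = 1), then Sun-Sun-Rain-Rain-Sun-Cloudy-Sun.
pi = [0.0, 0.0, 1.0]
seq = [2, 2, 2, 0, 0, 2, 1, 2]
p = sequence_probability(seq, A, pi)
print(f"{p:.4e}")  # 1.5360e-04
```

This matches the slide's 1.536 × 10⁻⁴.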


The Markov Model
The initial state probability matrix:

π = πi =  0.5
          0.2
          0.3

The state transition probability matrix:

A = aij =  0.6 0.2 0.2
           0.5 0.3 0.2
           0.4 0.1 0.5

Question: What is the probability of 5 consecutive up days (state 1 = up)?

P(O | Model) = P(1, 1, 1, 1, 1 | Model) = π1 · a11 · a11 · a11 · a11
= 0.5 (0.6)(0.6)(0.6)(0.6) = 0.0648
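The same computation as a short sketch (state 0 here plays the role of state 1, "up", in the slide's numbering; the other state labels are assumed from context):

```python
# Initial distribution and transition matrix from the slide
# (state 0 = up, states 1 and 2 = the other two market conditions).
pi = [0.5, 0.2, 0.3]
A = [
    [0.6, 0.2, 0.2],
    [0.5, 0.3, 0.2],
    [0.4, 0.1, 0.5],
]

# P(up, up, up, up, up) = pi[up] * P(up|up)^4
p = pi[0]
for _ in range(4):  # four up->up transitions after the first up day
    p *= A[0][0]
print(round(p, 4))  # 0.0648
```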
Observations
• An observation is the data which is known and can be observed.

Example: two HIDDEN states, 'Rainy' and 'Sunny'; three OBSERVATIONS, 'Walk', 'Shop', and 'Clean'.


Five important components of HMM
• Initial probability distribution: defines each hidden variable in its initial condition at time t = 0 (the initial hidden state).
• One or more hidden states.
• Transition probability distribution: the transition matrix gives the hidden-state-to-hidden-state transition probabilities.
• A sequence of observations.
• Emission probabilities: a sequence of observation likelihoods, also called emission probabilities.
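The five components can be collected into a minimal container. The Rainy/Sunny example below uses the state and observation names from the slides, but the probability values are assumed for illustration only:

```python
from dataclasses import dataclass

@dataclass
class HMM:
    states: list        # hidden states
    pi: list            # initial probability distribution over states
    A: list             # transition probabilities: A[i][j] = P(state j | state i)
    observations: list  # observation alphabet
    B: list             # emission probabilities: B[i][k] = P(observation k | state i)

# Rainy/Sunny with Walk/Shop/Clean; numeric values are invented examples.
hmm = HMM(
    states=['Rainy', 'Sunny'],
    pi=[0.6, 0.4],
    A=[[0.7, 0.3],
       [0.4, 0.6]],
    observations=['Walk', 'Shop', 'Clean'],
    B=[[0.1, 0.4, 0.5],   # emissions from Rainy
       [0.6, 0.3, 0.1]],  # emissions from Sunny
)
```

Each row of A and B, and the vector pi, must sum to 1, since each is a probability distribution.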
Components of HMM
The Hidden Markov Model
An HMM is specified by the following components: a set of N states; a transition probability matrix A; a sequence of T observations; a sequence of observation likelihoods (emission probabilities) B; and an initial probability distribution π over states.
Three fundamental problems in HMM
The situation...
•Casino has two dice, one fair (F) and one loaded (L)

•Probabilities for the fair die: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6

•Probabilities for the loaded die: P(1) = P(2) = P(3) = P(4) = P(5) = 1/10; P(6) = 1/2

•Before each roll, the casino player switches from the fair die to the loaded die (or vice
versa) with probability 1/20

The game...
•You bet $1

•You roll (always with the fair die)

•Casino player rolls (maybe with the fair die, maybe with the loaded die)

•Player who rolls the highest number wins $2


Dishonest Casino HMM

Transition probabilities: P(F → F) = P(L → L) = 0.95; P(F → L) = P(L → F) = 0.05

Emission probabilities:
P(1 | F) = 1/6    P(1 | L) = 1/10
P(2 | F) = 1/6    P(2 | L) = 1/10
P(3 | F) = 1/6    P(3 | L) = 1/10
P(4 | F) = 1/6    P(4 | L) = 1/10
P(5 | F) = 1/6    P(5 | L) = 1/10
P(6 | F) = 1/6    P(6 | L) = 1/2

Three fundamental problems in HMM
•Evaluation: What is the probability of a sequence of outputs of an HMM?

•Decoding: Given a sequence of outputs of an HMM, what is the most probable sequence of states that the HMM went through to produce the output?

•Learning: Given a sequence of outputs of an HMM, how do we estimate the parameters of the model?
Evaluation Question
Suppose the casino player rolls the following sequence...

1245526462146146136136661
6646616366163661636165156
15115146123562344

How likely is this sequence given our model of how the casino operates?

Probability = 1.3 × 10⁻³⁵. (For comparison, (1/6)⁶⁷ ≈ 7.3 × 10⁻⁵³.)
Decoding Question

1245526462146146136136661
6646616366163661636165156
15115146123562344

What portion of the sequence was generated with the fair die (FAIR), and what portion with the loaded die (LOADED)?
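The decoding problem is standardly solved with the Viterbi algorithm, which finds the single most probable hidden path. A minimal sketch on the casino model follows; log probabilities are used to avoid numerical underflow on long sequences (the initial distribution is again an assumed 1/2-1/2):

```python
# Viterbi decoding on the dishonest-casino HMM (log-space).
import math

states = ['F', 'L']
log_pi = {'F': math.log(0.5), 'L': math.log(0.5)}
log_A = {'F': {'F': math.log(0.95), 'L': math.log(0.05)},
         'L': {'F': math.log(0.05), 'L': math.log(0.95)}}
log_B = {'F': {r: math.log(1 / 6) for r in '123456'},
         'L': {**{r: math.log(1 / 10) for r in '12345'},
               '6': math.log(1 / 2)}}

def viterbi(rolls):
    """Return the most probable F/L state string for the observed rolls."""
    v = {s: log_pi[s] + log_B[s][rolls[0]] for s in states}
    back = []
    for o in rolls[1:]:
        ptrs, nv = {}, {}
        for s in states:
            prev = max(states, key=lambda t: v[t] + log_A[t][s])
            ptrs[s] = prev
            nv[s] = v[prev] + log_A[prev][s] + log_B[s][o]
        back.append(ptrs)
        v = nv
    best = max(states, key=v.get)
    path = [best]
    for ptrs in reversed(back):
        path.append(ptrs[path[-1]])
    return ''.join(reversed(path))

print(viterbi('666666666666'))  # a long run of sixes decodes as all-loaded
```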
Learning Question

1245526462146146136136661
6646616366163661636165156
15115146123562344

How "loaded" is the loaded die (e.g., estimated P(6 | L) = 0.64)? How "fair" is the fair die? How often does the casino player change from fair to loaded, and back? That is, what are the parameters of the model?
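If the hidden states were known (supervised data), the parameters could be estimated by simple counting: maximum likelihood. The unsupervised case, where only the rolls are observed, requires the Baum-Welch (EM) algorithm instead; the toy sketch below shows only the counting idea, on an invented hand-labelled example:

```python
# Supervised (counting-based) parameter estimation for an HMM.
# The rolls and their "true" die labels below are invented for illustration.
from collections import Counter

rolls  = '66162666'
labels = 'LLFFLLLL'  # hypothetical die used for each roll

# Emission estimate: P(6 | L) = count(6 emitted while L) / count(L)
n_L = labels.count('L')
n_6_L = sum(1 for r, s in zip(rolls, labels) if s == 'L' and r == '6')
print(n_6_L / n_L)

# Transition estimate: P(L -> F) = count(L followed by F) / count(L -> anything)
trans = Counter(zip(labels, labels[1:]))
n_from_L = sum(v for (a, _), v in trans.items() if a == 'L')
print(trans[('L', 'F')] / n_from_L)
```

With real data and no labels, Baum-Welch replaces these hard counts with expected counts computed from the forward and backward probabilities.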
