Cis262 HMM
[Figure 4.1 (diagram): start → Cold with probability 0.45, start → Hot with probability 0.55; transitions Cold → Cold 0.7, Cold → Hot 0.3, Hot → Cold 0.25, Hot → Hot 0.75; emissions Cold → N 0.8, Cold → D 0.2, Hot → N 0.3, Hot → D 0.7.]
Figure 4.1: Example of an HMM modeling the “drinking behavior” of a professor at the
University of Pennsylvania.
Also, from each of the states Cold and Hot, we have emission probabilities of producing the output N or D, and these probabilities also sum to 1.
The matrix
$$A = \begin{pmatrix} 0.7 & 0.3 \\ 0.25 & 0.75 \end{pmatrix}$$
describes the transition probabilities, and the vector
$$\pi = \begin{pmatrix} 0.45 \\ 0.55 \end{pmatrix}$$
gives the initial state probabilities.
where σ(q_t) is some index from the set {1, . . . , n}, for t = 1, . . . , T.
S = (q_1, q_2, q_3, q_4, q_5, q_6) = (Cold, Cold, Hot, Cold, Hot, Hot),
we have
O = (N, D, N, N, N, D).
O = (O_1, O_2, . . . , O_T),
with O_t ∈ O for t = 1, . . . , T.
where ω(O_t) is some index from the set {1, . . . , m}, for t = 1, . . . , T.
[Figure 4.2 (diagram): start → Cold 0.4, start → Hot 0.6; transitions Cold → Cold 0.6, Cold → Hot 0.4, Hot → Cold 0.3, Hot → Hot 0.7; emissions over ring sizes Cold → S 0.7, Cold → M 0.2, Cold → L 0.1, Hot → S 0.1, Hot → M 0.4, Hot → L 0.5.]
Figure 4.2: Example of an HMM modeling the temperature in terms of tree growth rings.
4.1. DEFINITION OF A HIDDEN MARKOV MODEL (HMM) 131
[Figure 4.3 (diagram): start → Cold with probability 0.45, start → Hot with probability 0.55; transitions Cold → Cold 0.7, Cold → Hot 0.3, Hot → Cold 0.25, Hot → Hot 0.75; emissions Cold → N 0.8, Cold → D 0.2, Hot → N 0.3, Hot → D 0.7.]
Figure 4.3: Example of an HMM modeling the “drinking behavior” of a professor at the
University of Pennsylvania.
$$\Pr(S, O) = \pi(i_1)B(i_1, \omega_1)\prod_{t=2}^{T} A(i_{t-1}, i_t)B(i_t, \omega_t).$$
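This product is easy to evaluate directly. The sketch below computes Pr(S, O) for the state sequence S = (Cold, Cold, Hot, Cold, Hot, Hot) and observation sequence O = (N, D, N, N, N, D) from the earlier example, under the indexing assumption (not fixed by the text) that state 0 = Cold, state 1 = Hot, output 0 = N, output 1 = D.

```python
# Joint probability Pr(S, O) = pi(i_1) B(i_1, w_1) * prod_{t=2..T} A(i_{t-1}, i_t) B(i_t, w_t)
# for the drinking HMM of Figure 4.1. Indexing is an assumption: 0 = Cold/N, 1 = Hot/D.
pi = [0.45, 0.55]
A = [[0.7, 0.3], [0.25, 0.75]]   # state transition matrix
B = [[0.8, 0.2], [0.3, 0.7]]     # emission (observation) matrix

def joint_prob(states, outputs):
    # First factor: initial state probability times first emission.
    p = pi[states[0]] * B[states[0]][outputs[0]]
    # Remaining factors: transition times emission, for t = 2, ..., T.
    for t in range(1, len(states)):
        p *= A[states[t - 1]][states[t]] * B[states[t]][outputs[t]]
    return p

S = [0, 0, 1, 0, 1, 1]   # Cold, Cold, Hot, Cold, Hot, Hot
O = [0, 1, 0, 0, 0, 1]   # N, D, N, N, N, D
p = joint_prob(S, O)     # ≈ 4.28652e-05
```

As expected for a length-6 sequence, this particular path has very small probability; the Viterbi algorithm below is what finds the path maximizing this product without enumerating all n^T paths.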
Initially, we set
score(j, 1) = π(j)B(j, ω_1), j = 1, 2,
and since ω_1 = 1 we get score(1, 1) = 0.45 × 0.8 = 0.36
and score(2, 1) = 0.55 × 0.3 = 0.165.
Then
score(1, 2) = max{tscore(1, 1), tscore(2, 1)}
= max{0.2016, 0.0330} = 0.2016,
and
score(2, 2) = max{0.0324, 0.0371} = 0.0371,
and pred(1, 2) = 1 and pred(2, 2) = 2.
Since ω_3 = 1, we get
score(1, 3) = max{0.1129, 0.0074} = 0.1129,
and
score(2, 3) = max{0.0181, 0.0084} = 0.0181,
and pred(1, 3) = 1 and pred(2, 3) = 1.
Since ω_4 = 2, we get
score(1, 4) = max{0.0158, 0.0009} = 0.0158,
and
score(2, 4) = max{0.0237, 0.0095} = 0.0237,
and pred(1, 4) = 1 and pred(2, 4) = 1.
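The hand computation above can be checked with a short Viterbi implementation. This is a sketch for the two-state drinking HMM, assuming the observation sequence (N, N, N, D) with indices 0 = Cold/N and 1 = Hot/D.

```python
# Viterbi algorithm for the drinking HMM (assumed indexing: 0 = Cold, 1 = Hot; 0 = N, 1 = D).
pi = [0.45, 0.55]
A = [[0.7, 0.3], [0.25, 0.75]]
B = [[0.8, 0.2], [0.3, 0.7]]

def viterbi(obs):
    n, T = len(pi), len(obs)
    # score[j][t]: highest probability of any state path ending in state j at time t.
    score = [[0.0] * T for _ in range(n)]
    pred = [[0] * T for _ in range(n)]
    for j in range(n):
        score[j][0] = pi[j] * B[j][obs[0]]
    for t in range(1, T):
        for j in range(n):
            tscores = [score[k][t - 1] * A[k][j] * B[j][obs[t]] for k in range(n)]
            pred[j][t] = max(range(n), key=lambda k: tscores[k])
            score[j][t] = tscores[pred[j][t]]
    # Backtrack from the best final state using the pred links.
    path = [max(range(n), key=lambda j: score[j][-1])]
    for t in range(T - 1, 0, -1):
        path.append(pred[path[-1]][t])
    path.reverse()
    return path, max(score[j][-1] for j in range(n))

path, best = viterbi([0, 0, 0, 1])   # observations N, N, N, D
```

With these numbers the best final score is score(2, 4) ≈ 0.0237 and backtracking through the pred links yields the state sequence (Cold, Cold, Cold, Hot), consistent with pred(1, 4) = 1 and pred(2, 4) = 1 above.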
i = 1, . . . , n.
• B = (B(i, j)) is an n × m matrix called the state observation probability matrix (also called confusion matrix), with
$$B(i, j) \geq 0, \quad 1 \leq i \leq n,\; 1 \leq j \leq m, \quad \text{and} \quad \sum_{j=1}^{m} B(i, j) = 1, \quad i = 1, \ldots, n.$$
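The row-sum constraints on A and B (every row is a probability distribution) are easy to verify programmatically; here is a minimal check, using the matrices of the drinking HMM:

```python
# Check that A (n x n) and B (n x m) are row-stochastic, as the definition requires:
# all entries nonnegative, and each row summing to 1.
A = [[0.7, 0.3], [0.25, 0.75]]   # state transition matrix
B = [[0.8, 0.2], [0.3, 0.7]]     # state observation (confusion) matrix

def is_row_stochastic(M, tol=1e-12):
    return all(
        all(x >= 0 for x in row) and abs(sum(row) - 1.0) < tol
        for row in M
    )

ok = is_row_stochastic(A) and is_row_stochastic(B)
```

A small tolerance is used because floating-point sums such as 0.7 + 0.3 need not equal 1.0 exactly.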
142 CHAPTER 4. HIDDEN MARKOV MODELS (HMMS)
$$\Pr(S, O) = \pi(i_1)B(i_1, \omega_1)\prod_{t=2}^{T} A(i_{t-1}, i_t)B(i_t, \omega_t)$$
is maximal.
$$\Pr(S, O) = \pi(i_1)B(i_1, \omega_1)\prod_{t=2}^{T} A(i_{t-1}, i_t)B(i_t, \omega_t)$$
is maximal.
4.2. THE VITERBI ALGORITHM AND THE FORWARD ALGORITHM 149
[Diagram: each state σ^{-1}(k), for k = 1, . . . , n, at time t − 1 contributes a weight tscore(k, j) along an edge into the state q = σ^{-1}(j) at time t.]
begin
for j = 1 to n do
score(j, 1) = π(j)B(j, ω_1)
endfor;
for t = 2 to T do
for j = 1 to n do
for k = 1 to n do
tscore(k) = score(k, t − 1)A(k, j)B(j, ω_t)
endfor;
score(j, t) = ∑_k tscore(k)
endfor
endfor;
tprob = ∑_j score(j, T)
end
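The pseudocode above translates almost line for line into Python. The sketch below runs the forward algorithm on the drinking HMM with the length-4 observation sequence (N, N, N, D) used in the Viterbi example (assumed indexing: 0 = Cold/N, 1 = Hot/D).

```python
# Forward algorithm: tprob = Pr(O), summing (rather than maximizing) over all state paths.
pi = [0.45, 0.55]
A = [[0.7, 0.3], [0.25, 0.75]]
B = [[0.8, 0.2], [0.3, 0.7]]

def forward(obs):
    n = len(pi)
    # score(j, 1) = pi(j) B(j, w_1)
    score = [pi[j] * B[j][obs[0]] for j in range(n)]
    for t in range(1, len(obs)):
        # score(j, t) = sum_k score(k, t-1) A(k, j) B(j, w_t)
        score = [
            sum(score[k] * A[k][j] for k in range(n)) * B[j][obs[t]]
            for j in range(n)
        ]
    return sum(score)   # tprob = sum_j score(j, T)

tprob = forward([0, 0, 0, 1])   # Pr(N, N, N, D)
```

Note that tprob ≈ 0.0720 exceeds the best Viterbi score 0.0237 for the same sequence, as it must: the forward algorithm sums the probabilities of all state paths producing O, while Viterbi keeps only the single best one.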
[Figure 4.4 (diagram): start → Cold 0.13, start → Hot 0.87; transitions Cold → Cold 0.33, Cold → Hot 0.67, Hot → Cold 0.1, Hot → Hot 0.9; emissions Cold → N 0.95, Cold → D 0.05, Hot → N 0.8, Hot → D 0.2.]
Figure 4.4: Example of an HMM modeling the “drinking behavior” of a professor at Harvard.