NLP Lecture 01-10: Hidden Markov Models (HMMs)
Md Mynoddin
26 Nov 2024
Md Mynoddin (Assistant Professor (CSE), RMSTU) · Hidden Markov Models (HMMs) · 26 Nov 2024
Introduction to Hidden Markov Models (HMMs)
Definition:
A Hidden Markov Model is a statistical model in which a sequence of observable events is generated by a sequence of hidden states.
Key Characteristics:
Hidden States: States are not directly observable.
Markov Property: The next state depends only on the current state.
Observations: Each state generates an observable output.
Widely used in NLP, Speech Recognition, and Bioinformatics.
Components of an HMM
An HMM is specified by λ = (A, B, π):
States: a set of N hidden states.
Observations: a set of M observable symbols.
Transition probabilities A: a_ij = P(q_{t+1} = s_j | q_t = s_i).
Emission probabilities B: b_j(o_t) = P(o_t | q_t = s_j).
Initial distribution π: π_i = P(q_1 = s_i).
Example of an HMM
Consider a weather prediction problem:
States: Hidden states are Rainy and Sunny.
Observations: Observables are Walk, Shop, and Clean.
Transition Probabilities (A):
A = | 0.7  0.3 |
    | 0.4  0.6 |
(rows and columns ordered Rainy, Sunny; e.g., row 1 gives P(Rainy → Rainy) = 0.7 and P(Rainy → Sunny) = 0.3)
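The weather model above can be written down directly as NumPy arrays. The transition matrix A is taken from the slide; the emission matrix B and initial distribution π are not given here, so the values below are illustrative placeholders (the ones commonly used with this example), not part of the lecture:

```python
import numpy as np

# Hidden states and observation symbols from the weather example.
states = ["Rainy", "Sunny"]               # indices 0, 1
observations = ["Walk", "Shop", "Clean"]  # indices 0, 1, 2

# Transition matrix from the slide: A[i, j] = P(next state j | current state i).
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Emission matrix and initial distribution: ILLUSTRATIVE values, not from the slide.
B = np.array([[0.1, 0.4, 0.5],    # Rainy: P(Walk), P(Shop), P(Clean)
              [0.6, 0.3, 0.1]])   # Sunny: P(Walk), P(Shop), P(Clean)
pi = np.array([0.6, 0.4])         # P(start Rainy), P(start Sunny)

# Sanity check: each row of A and B, and pi itself, is a probability distribution.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
assert np.isclose(pi.sum(), 1.0)
```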
Md Mynoddin (Assistant Professor (CSE), RMSTU)Hidden Markov Models (HMMs) 26 Nov 2024 4/9
Three Fundamental Problems of HMMs
Likelihood: given the model λ and an observation sequence O, compute P(O | λ) (Forward Algorithm).
Decoding: find the most likely hidden-state sequence for O (Viterbi Algorithm).
Learning: estimate the parameters λ = (A, B, π) that maximise P(O | λ) (Baum-Welch Algorithm).
Forward Algorithm
Computes the likelihood P(O | λ) by summing over all hidden-state paths with dynamic programming:
Initialisation: α_1(j) = π_j b_j(o_1).
Induction: α_t(j) = [Σ_i α_{t-1}(i) a_ij] b_j(o_t).
Termination: P(O | λ) = Σ_j α_T(j).
Complexity: O(N²T), versus O(T · N^T) for naive enumeration of all paths.
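The forward recursion can be sketched in a few lines of NumPy. The function below assumes the model is given as arrays A (transitions), B (emissions), and pi (initial distribution); the weather parameters used to exercise it are illustrative, with B and pi not taken from the slides:

```python
import numpy as np

def forward(A, B, pi, obs):
    """Forward algorithm: returns P(obs | model) in O(N^2 T) time."""
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                       # initialisation
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]   # induction step
    return alpha[-1].sum()                             # termination

# Weather example: A from the slide; B and pi are ILLUSTRATIVE assumptions.
A = np.array([[0.7, 0.3], [0.4, 0.6]])                 # Rainy, Sunny
B = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]])       # Walk, Shop, Clean
pi = np.array([0.6, 0.4])

p = forward(A, B, pi, [0, 1, 2])   # likelihood of Walk, Shop, Clean
```

The `@` product in the induction step computes Σ_i α_{t-1}(i) a_ij for every state j at once, which is exactly the sum in the recurrence.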
Viterbi Algorithm
Finds the single most likely hidden-state sequence (decoding) by replacing the sum in the forward recursion with a max:
δ_t(j) = max_i [δ_{t-1}(i) a_ij] b_j(o_t),
keeping backpointers ψ_t(j) = argmax_i δ_{t-1}(i) a_ij, which are followed backwards from the best final state to recover the path.
Complexity: O(N²T).
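The max-and-backtrack structure of Viterbi decoding can be sketched as follows. As before, the transition matrix comes from the weather example on the earlier slide, while B and pi are illustrative assumptions:

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Viterbi algorithm: most likely hidden-state path, O(N^2 T) time."""
    N, T = A.shape[0], len(obs)
    delta = np.zeros((T, N))           # best path score ending in each state
    psi = np.zeros((T, N), dtype=int)  # backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A       # scores[i, j]: come from i, land in j
        psi[t] = scores.argmax(axis=0)           # best predecessor of each state j
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    # Backtrack from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# Weather example: A from the slide; B and pi are ILLUSTRATIVE assumptions.
A = np.array([[0.7, 0.3], [0.4, 0.6]])             # Rainy, Sunny
B = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]])   # Walk, Shop, Clean
pi = np.array([0.6, 0.4])

best = viterbi(A, B, pi, [0, 1, 2])   # observe Walk, Shop, Clean
# → [1, 0, 0], i.e. Sunny, Rainy, Rainy under these assumed parameters
```

The only change from the forward algorithm is `max`/`argmax` in place of the sum, which is why the two share the same O(N²T) complexity.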
Applications of HMMs
NLP: part-of-speech tagging and named-entity recognition.
Speech recognition: mapping acoustic frames to phoneme and word sequences.
Bioinformatics: gene finding and biological sequence analysis.
Summary
HMMs are powerful tools for modeling sequences with hidden states.
Key components: states, observations, transitions, emissions, and
initial probabilities.
Solving HMM problems:
Likelihood estimation: Forward Algorithm.
Decoding: Viterbi Algorithm.
Parameter learning: Baum-Welch (EM) Algorithm.
Widely used in NLP, speech recognition, and bioinformatics.