Contact Session-6
Probability, Bayesian Networks and Hidden Markov Models
By BITS Team AI
BITS Pilani | Pilani | Dubai | Goa | Hyderabad
Facts
Delhi is the capital of India
You can reach Bangalore Airport from MG Road within 90 mins if you go by route A.
Would you call that a fact?
Are you sure it would always be true?
Uncertainty
You can reach Bangalore Airport from MG Road within 90 mins if you go by route A.
There is uncertainty in this information due to partial observability and non-determinism.
Agents should handle such uncertainty.
Product rule: P(A ∧ B) = P(A | B) P(B) = P(B | A) P(A)
Example
In a factory there are 100 units of a certain product, 5 of which are defective. We pick three units from the 100 units at random. What is the probability that none of them is defective?
Let A_i be the event that the i-th unit picked is not defective, for i = 1, 2, 3. By the product rule,
P(A_1 ∧ A_2 ∧ A_3) = P(A_1) P(A_2 | A_1) P(A_3 | A_1 ∧ A_2).
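A short Python sketch of that computation; the fractions follow directly from the example above (95 non-defective units out of 100), nothing else is assumed:

```python
# Probability that none of the three picked units is defective, using the
# product rule: P(A1 ∧ A2 ∧ A3) = P(A1) · P(A2 | A1) · P(A3 | A1 ∧ A2).
p_a1 = 95 / 100              # first pick is non-defective
p_a2_given_a1 = 94 / 99      # second pick non-defective, given the first was
p_a3_given_a1a2 = 93 / 98    # third pick non-defective, given the first two were

p_none_defective = p_a1 * p_a2_given_a1 * p_a3_given_a1a2
print(f"P(no defective unit in 3 picks) = {p_none_defective:.4f}")  # ≈ 0.8560
```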
Probability Distribution
Instead, we define the probability as a function of the values the variable can take:
P(Weight = x) = Normal(x | mean = 60, std = 10)
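A minimal sketch of evaluating that density with only the standard library; the mean 60 and standard deviation 10 come from the line above, the sample points are illustrative:

```python
import math

def normal_pdf(x, mean=60.0, std=10.0):
    """Density of Normal(mean, std) at x; for a continuous variable this is
    a probability density, not a probability mass."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

print(normal_pdf(60))  # ≈ 0.0399, the peak of the bell curve at the mean
print(normal_pdf(75))  # ≈ 0.0130, weights far from the mean get smaller densities
```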
[Diagram: a Bayesian network sketch for the "late for BGLR airport" scenario, with random variables such as Procession, Weather @ Delhi, Weekend, Road block and Late for BGLR airport, annotated with the construction workflow: find the dependencies among RVs, find the conditional independences, and use ML to get the best linearization among RVs.]
Example Bayesian Net #3
Identify the R.V.s
Find the dependencies among RVs
Find the conditional independences
Use ML to get the best linearization among RVs
Construct the Bayes Net
Encode the local dependencies by CPTs
[Diagram: the network over Festival, Political Rally, Procession, Weather @ Delhi, Weekend, Road block, Cars on route and Late for BGLR airport, with a conditional probability table (CPT) attached to each node, e.g. P(P | F, Y), P(C | R, W), P(L | C); the individual probability entries are not legible in this extract.]
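As a sketch of the last step (encoding local dependencies with CPTs), here is a small pure-Python illustration of how a Bayes net factorises the joint probability. The structure mirrors only a fragment of the network above (Festival and Rally as parents of Procession, which influences Road block, which influences Late), and every probability value is an illustrative placeholder since the slide's CPT entries are not legible here:

```python
# Hypothetical fragment: Festival (F) and Rally (Y) -> Procession (P),
# P -> RoadBlock (R), R -> Late (L).  All numbers below are placeholders.
p_f = {True: 0.1, False: 0.9}    # P(F)
p_y = {True: 0.2, False: 0.8}    # P(Y)
p_p_given_fy = {                 # P(P = True | F, Y)
    (True, True): 0.9, (True, False): 0.6,
    (False, True): 0.7, (False, False): 0.05,
}
p_r_given_p = {True: 0.8, False: 0.1}   # P(R = True | P)
p_l_given_r = {True: 0.7, False: 0.1}   # P(L = True | R)

def joint(f, y, p, r, l):
    """P(F=f, Y=y, P=p, R=r, L=l) via the Bayes-net factorisation
    P(F) * P(Y) * P(P|F,Y) * P(R|P) * P(L|R)."""
    prob = p_f[f] * p_y[y]
    prob *= p_p_given_fy[(f, y)] if p else 1 - p_p_given_fy[(f, y)]
    prob *= p_r_given_p[p] if r else 1 - p_r_given_p[p]
    prob *= p_l_given_r[r] if l else 1 - p_l_given_r[r]
    return prob

# e.g. festival, no rally, a procession, a road block, and late for the airport
print(joint(True, False, True, True, True))  # 0.1 * 0.8 * 0.6 * 0.8 * 0.7 = 0.02688
```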
Partial Observability
The agent maintains a belief state representing the current possible world states.
Transition Model: using the belief state and the transition model, the agent can predict how the world might evolve in the next time step.
Sensor Model: with the observed percepts and the sensor model, the agent can update the belief state.
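A small sketch of that predict/update cycle for a two-state weather world; the transition and sensor numbers are illustrative placeholders, not values from the slides:

```python
# Belief-state maintenance under partial observability: predict with the
# transition model, then update with the sensor model.
states = ["Rainy", "Sunny"]

transition = {                       # P(next state | current state)
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Rainy": 0.3, "Sunny": 0.7},
}
sensor = {                           # P(percept | state)
    "Rainy": {"umbrella": 0.9, "no_umbrella": 0.1},
    "Sunny": {"umbrella": 0.2, "no_umbrella": 0.8},
}

def predict(belief):
    """Transition model: how the world might evolve in the next time step."""
    return {s: sum(belief[prev] * transition[prev][s] for prev in states)
            for s in states}

def update(belief, percept):
    """Sensor model: reweight the belief by the percept and renormalise."""
    weighted = {s: belief[s] * sensor[s][percept] for s in states}
    total = sum(weighted.values())
    return {s: w / total for s, w in weighted.items()}

belief = {"Rainy": 0.5, "Sunny": 0.5}          # initial ignorance
belief = update(predict(belief), "umbrella")   # one predict/update cycle
print(belief)                                  # belief shifts towards Rainy
```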
Degree of belief
Earlier, states were captured as facts in logic, each of which is either True or False.
Here, the state takes one of the values S1 = Sunny, S2 = Rainy, S3 = Cloudy, and the agent assigns a degree of belief to each.
State sequence notation: q1, q2, q3, q4, q5, ..., where qi ∈ {Sunny, Rainy, Cloudy}.
Markov Property
The next state depends only on the current state: P(q_t | q_1, ..., q_{t-1}) = P(q_t | q_{t-1}).
Example
Given that today is Sunny, what’s the probability that tomorrow is
Sunny and the next day Rainy?
Example 2
Assume that yesterday's weather was Rainy and today is Cloudy. What is the probability that tomorrow will be Sunny?
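The transition matrix itself is not visible in this extract, so the values below are assumed placeholders; the sketch only shows how the Markov property reduces both questions to products of one-step transition probabilities:

```python
# Assumed (placeholder) transition matrix a[i][j] = P(tomorrow = j | today = i);
# the actual values come from the lecture's weather Markov chain.
a = {
    "Sunny":  {"Sunny": 0.8, "Rainy": 0.05, "Cloudy": 0.15},
    "Rainy":  {"Sunny": 0.2, "Rainy": 0.6,  "Cloudy": 0.2},
    "Cloudy": {"Sunny": 0.2, "Rainy": 0.3,  "Cloudy": 0.5},
}

# Example 1: P(tomorrow = Sunny, day after = Rainy | today = Sunny)
#          = P(Sunny | Sunny) * P(Rainy | Sunny)   (chain rule + Markov property)
print(a["Sunny"]["Sunny"] * a["Sunny"]["Rainy"])

# Example 2: yesterday was Rainy, today is Cloudy.  By the Markov property
# yesterday is irrelevant: P(tomorrow = Sunny | today = Cloudy) = a[Cloudy][Sunny]
print(a["Cloudy"]["Sunny"])
```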
What is a Hidden Markov Model (HMM)?
How to build a second-order HMM?
• Second-order HMM
• The current state depends only on the previous two states
• Example: a trigram model over POS tags (see the sketch below)
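A minimal sketch of estimating those second-order (trigram) tag transitions from counts; the tiny tag sequence below is made up purely for illustration:

```python
# Second-order (trigram) transitions over POS tags: P(t_i | t_{i-2}, t_{i-1})
# estimated from counts.
from collections import defaultdict

tags = ["DET", "NOUN", "VERB", "DET", "NOUN", "VERB", "DET", "ADJ", "NOUN"]

trigram_counts = defaultdict(int)
history_counts = defaultdict(int)
for t1, t2, t3 in zip(tags, tags[1:], tags[2:]):
    trigram_counts[(t1, t2, t3)] += 1
    history_counts[(t1, t2)] += 1     # how often the 2-tag history occurs

def p_trigram(t3, t1, t2):
    """Maximum-likelihood estimate of P(t3 | t1, t2)."""
    if history_counts[(t1, t2)] == 0:
        return 0.0
    return trigram_counts[(t1, t2, t3)] / history_counts[(t1, t2)]

print(p_trigram("NOUN", "VERB", "DET"))  # P(NOUN | VERB, DET) = 0.5 here
```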
Markov Chain for Weather
What is the probability of 4 consecutive warm days?
The observation sequence is warm-warm-warm-warm and the corresponding state sequence is 3-3-3-3.
P(3, 3, 3, 3) = π3 · a33 · a33 · a33 = 0.2 × (0.6)³ = 0.0432
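The same computation in Python, using the initial probability 0.2 and the self-transition probability 0.6 from the line above:

```python
pi_warm = 0.2   # initial probability of the warm state (state 3)
a_ww = 0.6      # self-transition probability warm -> warm (a33)

# four warm days = one initial choice of warm plus three warm -> warm transitions
print(pi_warm * a_ww ** 3)  # 0.0432
```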
Hidden Markov Models
It is a sequence model.
Assigns a label or class to each unit in a sequence, thus
mapping a sequence of observations to a sequence
of labels.
Probabilistic sequence model: given a sequence of units
(e.g. words, letters, morphemes, sentences), compute a
probability distribution over possible sequences of labels
and choose the best label sequence.
This is a kind of generative model.
Initial probabilities (π):
        Hot    Cold
<S>     0.8    0.2

Transition probabilities (A):
        Hot    Cold
Hot     0.7    0.3
Cold    0.4    0.6

Score every candidate hidden state sequence for the observation sequence (1, 3, 1), e.g.
HHH: P(H) · P(1|H) · P(H|H) · P(3|H) · P(H|H) · P(1|H)
HHC, HCC, CCC, HCH, ...

Best sequence: Hot → Cold → Cold
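A small brute-force version of this enumeration; π and A come from the tables above, but the emission table B is not shown in this extract, so its values are placeholders and the winning sequence they produce need not match the slide's Hot → Cold → Cold:

```python
from itertools import product

pi = {"Hot": 0.8, "Cold": 0.2}                      # initial probabilities (slide)
A = {"Hot": {"Hot": 0.7, "Cold": 0.3},              # transition probabilities (slide)
     "Cold": {"Hot": 0.4, "Cold": 0.6}}
B = {"Hot": {1: 0.2, 2: 0.4, 3: 0.4},               # emission probabilities
     "Cold": {1: 0.5, 2: 0.4, 3: 0.1}}              # (placeholder values)

obs = [1, 3, 1]                                     # observation sequence

best_seq, best_p = None, 0.0
for seq in product(["Hot", "Cold"], repeat=len(obs)):
    # joint probability of this hidden state sequence and the observations
    p = pi[seq[0]] * B[seq[0]][obs[0]]
    for t in range(1, len(obs)):
        p *= A[seq[t - 1]][seq[t]] * B[seq[t]][obs[t]]
    print(seq, p)
    if p > best_p:
        best_seq, best_p = seq, p

print("Best sequence:", " -> ".join(best_seq))
```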