Quantifying Uncertainty: Week 5
Week 5
Quantifying Uncertainty
LEARNING OUTCOMES
At the end of this session, students will be able to:
o LO3 : Demonstrate how to achieve a goal through a sequence of actions, called planning
o LO4 : Apply various techniques to an agent acting under uncertainty, and show how to
process natural language and other perceptual signals so that an agent can interact
intelligently with the real world
LEARNING OBJECTIVES
1. Acting Under Uncertainty
2. Basic Probability Notation
3. Inference Using Full Joint Distributions
4. Independence
5. Probability and Bayes’ Theorem
6. Summary
ACTING UNDER UNCERTAINTY
o An agent may need to handle uncertainty, whether due to partial
observability, nondeterminism, or a combination of the two. An
agent may never know for certain what state it is in or where it
will end up after a sequence of actions.
o The agent’s knowledge can at best provide only a degree of
belief in the relevant sentences.
o The main tool for dealing with degrees of belief is probability theory.
o Probability provides a way of summarizing the uncertainty that
comes from laziness and ignorance, thereby solving the
qualification problem.
ACTING UNDER UNCERTAINTY
Subjective probability:
o Probabilities relate propositions to the agent's own state of
knowledge
e.g., P(A25 | no reported accidents) = 0.06
o These are not assertions about the world
o For example, if you have six persons available for singles tennis,
then the number of possible pairings is C(6, 2) = 6!/(2!·4!) = 15
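The pairing count can be checked directly with Python's standard library:

```python
from math import comb

# Number of ways to choose 2 players out of 6 for a singles match
pairings = comb(6, 2)  # 6! / (2! * 4!)
print(pairings)  # 15
```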
BASIC PROBABILITY NOTATION
Basic Probability
o If we identify the set of all possible outcomes as the
"sample space" and denote it by S, and label the desired
event as E, then the probability of event E (with equally
likely outcomes) can be written:
P(E) = |E| / |S|
• The probability of a disjunction of events is given by the
inclusion-exclusion principle:
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
BASIC PROBABILITY NOTATION
Example:
o 100 children were asked, "What is the meaning of the traffic
light?", and the answers were:
75 children knew the meaning of the red lamp
35 children knew the meaning of the yellow lamp
50 children knew the meaning of both
o Therefore:
P(red ∨ yellow) = P(red) + P(yellow) − P(red ∧ yellow)
P(red ∨ yellow) = 0.75 + 0.35 − 0.5 = 0.6
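The inclusion-exclusion arithmetic in this example can be verified with a short Python snippet:

```python
# Counts from the survey of 100 children
red, yellow, both, total = 75, 35, 50, 100

# P(red or yellow) = P(red) + P(yellow) - P(red and yellow)
p = (red + yellow - both) / total
print(p)  # 0.6
```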
INFERENCE USING FULL JOINT DISTRIBUTIONS
o A complete specification of the state of the world
about which the agent is uncertain is called an atomic event
o E.g., if the world consists of only two Boolean variables
Cavity and Toothache, then there are 4 distinct atomic events;
adding a third Boolean variable Catch gives 8
o Summing the six entries of the 8-entry joint distribution over
Cavity, Toothache, and Catch in which cavity or toothache holds:
P(cavity ∨ toothache)
= 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28
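The summation above can be reproduced over an explicit joint table. The eight entries below are illustrative textbook-style values consistent with the six terms in the sum; the remaining two entries are assumptions, not shown in these slides:

```python
# Full joint distribution over (cavity, toothache, catch).
# The last two entries (0.144, 0.576) are assumed so the table sums to 1.
joint = {
    (True,  True,  True):  0.108,
    (True,  True,  False): 0.012,
    (True,  False, True):  0.072,
    (True,  False, False): 0.008,
    (False, True,  True):  0.016,
    (False, True,  False): 0.064,
    (False, False, True):  0.144,
    (False, False, False): 0.576,
}

# P(cavity or toothache): sum every atomic event where either holds
p = sum(pr for (cavity, toothache, _), pr in joint.items() if cavity or toothache)
print(round(p, 3))  # 0.28
```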
INFERENCE USING FULL JOINT DISTRIBUTIONS
Inference by Enumeration
o Start with the joint probability distribution:
o Then the required summation of joint entries is done by summing out the
hidden variables:
o P(Y | E = e) = α P(Y, E = e) = α Σh P(Y, E = e, H = h)
o The terms in the summation are joint entries because Y, E and H together
exhaust the set of random variables
o Obvious problems:
1. Worst-case time complexity O(d^n), where d is the largest variable
arity and n is the number of variables
2. Space complexity O(d^n) to store the joint distribution
3. How to find the numbers for the O(d^n) entries?
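Inference by enumeration can be sketched in a few lines of Python over a hypothetical 8-entry joint table (the variable order and probability values are assumptions, not taken from these slides):

```python
# Hypothetical full joint table; key order: (cavity, toothache, catch)
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def query_cavity_given(joint, toothache):
    """P(Cavity | Toothache = toothache), summing out the hidden variable Catch."""
    dist = {}
    for cavity in (True, False):
        # sum over the hidden variable h (catch) of P(cavity, toothache, h)
        dist[cavity] = sum(joint[(cavity, toothache, catch)] for catch in (True, False))
    alpha = 1 / sum(dist.values())  # normalization constant
    return {v: alpha * p for v, p in dist.items()}

result = query_cavity_given(joint, toothache=True)
print({v: round(p, 3) for v, p in result.items()})  # {True: 0.6, False: 0.4}
```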
INDEPENDENCE
o P(Toothache, Catch, Cavity, Weather)
has 2 × 2 × 2 × 4 = 32 entries
o For example, how are
P(toothache, catch, cavity, cloudy) and P(toothache, catch, cavity)
related? If the weather is independent of the dental variables, then
P(toothache, catch, cavity, cloudy) = P(cloudy) P(toothache, catch, cavity)
o Bayes’ theorem example:
P(chickenpox | spots)
= (0.8 × 0.4) / ((0.8 × 0.4) + (0.3 × 0.7) + (0.9 × 0.5))
= 0.32 / 0.98 ≈ 0.327
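The arithmetic behind this worked figure can be reproduced directly (the likelihood-prior products are taken as given; the underlying scenario is not specified in these slides):

```python
# Bayes' theorem: posterior = likelihood * prior / evidence
numerator = 0.8 * 0.4                              # P(spots | chickenpox) * P(chickenpox)
denominator = 0.8 * 0.4 + 0.3 * 0.7 + 0.9 * 0.5    # sum over the competing causes

posterior = numerator / denominator
print(round(posterior, 3))  # 0.327
```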
o or in distribution form:
P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch, Cavity) P(Weather)
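A minimal sketch of independence in distribution form, with assumed illustrative numbers: if Weather is independent of the dental variables, the 32-entry joint can be stored as an 8-entry table plus a 4-entry table (12 numbers instead of 32):

```python
# P(Toothache, Catch, Cavity) — hypothetical 8-entry table (assumed values)
p_dental = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}
# P(Weather) — assumed illustrative values
p_weather = {"sunny": 0.6, "rain": 0.1, "cloudy": 0.29, "snow": 0.01}

def p_joint(toothache, catch, cavity, weather):
    # Independence: P(toothache, catch, cavity, weather)
    #             = P(toothache, catch, cavity) * P(weather)
    return p_dental[(toothache, catch, cavity)] * p_weather[weather]

print(round(p_joint(True, True, True, "cloudy"), 5))  # 0.108 * 0.29 = 0.03132
```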