Ch-5 Uncertain Knowledge and Reasoning


Introduction to AI

Chapter-5 Uncertain Knowledge and Reasoning

Topics
• Reasoning under Uncertainty
• Planning
• Quantifying Uncertainty
• Probabilistic Reasoning
• Probabilistic Reasoning over Time
• Making Simple Decisions
Logic and Uncertainty

• Major problem with logical-agent approaches:
• Agents almost never have access to the whole truth about their environments.
• Very often, even in simple worlds, there are important questions for which there is no Boolean answer.
• In that case, an agent must reason under uncertainty.
• Uncertainty also arises because of an agent's incomplete or incorrect understanding of its environment.
Logic and Uncertainty
Uncertainty is bad for agents based on logic.

Example: Catching a Flight
Let action At = leave for the airport t minutes before the flight.
Question: Will At get me there on time?

Problems:
• Partial observability (road state, other drivers' plans, etc.)
• Noisy sensors (traffic reports)
• Uncertainty in action outcomes (flat tire, etc.)
• Complexity of modeling and predicting traffic

A purely logical approach leads to conclusions that are too weak for effective decision making:
• A25 will get me there on time if there is no accident on the bridge, it doesn't rain, my tires remain intact, etc.
• A∞ guarantees to get me there in time, but who lives forever?
Reasoning under Uncertainty
• A rational agent is one that makes rational decisions
(in order to maximize its performance measure)
• A rational decision depends on:
• the relative importance of various goals
• the likelihood they will be achieved
• the degree to which they will be achieved

Sources of Uncertainty

Probabilistic assertions summarize the effects of:
• Laziness: failure to enumerate exceptions, qualifications, etc.
• Ignorance: lack of relevant facts, explicit theories, observability, etc.
• Randomness: inherently random behavior
Planning: Making Decisions under Uncertainty

Probabilities: Suppose the agent believes the following:
P(A25 gets me there on time) = 0.04
P(A90 gets me there on time) = 0.80
P(A120 gets me there on time) = 0.99
P(A1440 gets me there on time) = 0.9999

Which action should the agent choose?
• It depends on the agent's preferences for missing the flight vs. time spent waiting.

The agent should choose the action that maximizes the expected utility:
argmax_At [ P(At succeeds) U(At succeeds) + P(At fails) U(At fails) ]

• Utility theory is used to represent and infer preferences.
• Decision theory = probability theory + utility theory.
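The maximum-expected-utility choice can be sketched in Python. The success probabilities are taken from the slide; the utility values are illustrative assumptions, since the original example does not specify them.

```python
# Sketch: maximum expected utility over the airport-departure actions.
# Success probabilities come from the slide; the utility values are
# illustrative assumptions (they are not given in the original example).

P_success = {25: 0.04, 90: 0.80, 120: 0.99, 1440: 0.9999}

U_success = {25: 100, 90: 90, 120: 85, 1440: 0}  # assumed: longer waits are worse
U_FAIL = -500                                    # assumed penalty for missing the flight

def expected_utility(t):
    """EU(At) = P(At succeeds) U(At succeeds) + P(At fails) U(At fails)."""
    p = P_success[t]
    return p * U_success[t] + (1 - p) * U_FAIL

best = max(P_success, key=expected_utility)
print(best)  # → 120: under these utilities, A120 best trades risk against waiting
```

With a different penalty for missing the flight, the argmax shifts: a very large penalty pushes the agent toward earlier departures despite the wasted waiting time.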
Quantifying Uncertainty: Degrees of Belief
• In several real-world domains the agent's knowledge can only provide a degree of belief in the relevant sentences.
• The main tool for handling degrees of belief is Probability Theory.
• The use of probability summarizes the uncertainty that stems from our laziness or ignorance about the domain.
Probability Theory
• Probability Theory makes the same ontological commitments as First-order Logic:
• Every sentence φ is either true or false.
• The degree of belief that φ is true is a number P between 0 and 1:
• P(φ) = 1 → φ is certainly true
• P(φ) = 0 → φ is certainly not true
• P(φ) = 0.65 → φ is true with a 65% chance
Random Variables

We describe the (uncertain) state of the world using random variables. Just like variables in CSPs, random variables take on values in a domain D:
• Denoted by capital letters
• Domain values must be mutually exclusive and exhaustive

Examples:
• R: Is it raining? → R ∈ {True, False}
• W: What's the weather? → W ∈ {Sunny, Cloudy, Rainy, Snow}
• Die: What is the outcome of rolling two dice? → Die ∈ {(1,1), (1,2), …, (6,6)}
• V: What is the speed of a car (in MPH)? → V ∈ [0, 200]

A random variable is a function that maps from the domain of possible worlds Ω (called the sample space) to the real numbers R, written as X: Ω → R.
Events and Propositions

Probabilistic statements are defined over events: world states or sets of states. Events are described using propositions:
• "It is raining" → R = True
• "The weather is either cloudy or snowy" → W = "Cloudy" ∨ W = "Snowy"
• "The sum of the two dice rolls is 11" → Die ∈ {(5,6), (6,5)}
• "My car is going between 30 and 50 miles per hour" → 30 ≤ V ≤ 50

Notation:
• For random variables: P(X = x), or P(x) for short, is the probability of the event that random variable X takes on the value x.
• For propositions: P(A = true), or P(a), is the probability of the set of possible worlds in which proposition A holds.
Axioms of Probability
Probability Theory is governed by the following axioms:
1. All probabilities are real values between 0 and 1:
   for all φ, 0 ≤ P(φ) ≤ 1
2. Valid propositions have probability 1:
   P(True) = P(α ∨ ¬α) = 1
3. The probability of a disjunction is defined as follows:
   P(α ∨ β) = P(α) + P(β) − P(α ∧ β)
Atomic Events
• Atomic event: a complete specification of the state of the world, i.e., a complete assignment of domain values to all random variables.
• Atomic events are mutually exclusive and exhaustive.
• E.g., if the world consists of only two Boolean variables Cavity and Toothache, then there are 4 distinct atomic events:
  Cavity = false ∧ Toothache = false
  Cavity = false ∧ Toothache = true
  Cavity = true ∧ Toothache = false
  Cavity = true ∧ Toothache = true
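The enumeration of atomic events above can be sketched with `itertools.product`: each atomic event is one complete assignment of values to all variables. The variable names follow the slide's example.

```python
from itertools import product

# Enumerate all atomic events for the two Boolean variables of the example.
variables = {"Cavity": [False, True], "Toothache": [False, True]}

# One atomic event = one complete assignment of domain values to all variables.
atomic_events = [dict(zip(variables, values))
                 for values in product(*variables.values())]

for event in atomic_events:
    print(event)
# 4 atomic events: mutually exclusive and exhaustive by construction
```

For n Boolean variables this yields 2^n atomic events, which is why working directly with full joint distributions quickly becomes impractical.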
Joint Probability Distributions
• A joint distribution is an assignment of probabilities to every possible atomic event:

  Atomic event                          P
  Cavity = false ∧ Toothache = false    0.8
  Cavity = false ∧ Toothache = true     0.1
  Cavity = true ∧ Toothache = false     0.05
  Cavity = true ∧ Toothache = true      0.05

• We use the notation P(Cavity, Toothache) for the joint probability distribution.
• The probabilities of all possible atomic events sum to 1.
Marginal Probability Distributions
• Sometimes we are only interested in one variable. This is called the marginal distribution P(Y).

P(Cavity, Toothache):
  Cavity = false ∧ Toothache = false    0.8
  Cavity = false ∧ Toothache = true     0.1
  Cavity = true ∧ Toothache = false     0.05
  Cavity = true ∧ Toothache = true      0.05

Marginal distributions:
  P(Cavity):    Cavity = false → ?      Cavity = true → ?
  P(Toothache): Toothache = false → ?   Toothache = true → ?
Marginal Probability Distributions
• Suppose we have the joint distribution P(X, Y) and we want to find the marginal distribution P(X):

  P(X = x) = P((X = x ∧ Y = y1) ∨ ⋯ ∨ (X = x ∧ Y = yn))
           = P(x, y1) + ⋯ + P(x, yn)     (the atomic events are mutually exclusive)
           = Σ (i = 1 to n) P(x, yi)

• General rule: to find P(X = x), sum the probabilities of all atomic events where X = x. This is called "summing out" or marginalization.
Marginal Probability Distributions
• From the joint distribution P(Cavity, Toothache) we can compute the marginal distributions by summing out:

P(Cavity, Toothache):
  Cavity = false ∧ Toothache = false    0.8
  Cavity = false ∧ Toothache = true     0.1
  Cavity = true ∧ Toothache = false     0.05
  Cavity = true ∧ Toothache = true      0.05

Marginal distributions:
  P(Cavity):    Cavity = false → 0.8 + 0.1 = 0.9       Cavity = true → 0.05 + 0.05 = 0.1
  P(Toothache): Toothache = false → 0.8 + 0.05 = 0.85  Toothache = true → 0.1 + 0.05 = 0.15
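The "summing out" computation above can be sketched directly on the joint table from the slide. The dictionary keys are (cavity, toothache) value pairs, and `marginal` is an illustrative helper name.

```python
# Sketch: marginalization ("summing out") on the joint distribution
# from the slide. Keys are (cavity, toothache) value pairs.

joint = {
    (False, False): 0.8,   # no cavity, no toothache
    (False, True):  0.1,
    (True,  False): 0.05,
    (True,  True):  0.05,
}

def marginal(joint, axis):
    """Sum the probabilities of all atomic events sharing a value at `axis`."""
    result = {}
    for event, p in joint.items():
        result[event[axis]] = result.get(event[axis], 0.0) + p
    return result

p_cavity = marginal(joint, 0)     # ≈ {False: 0.9, True: 0.1}
p_toothache = marginal(joint, 1)  # ≈ {False: 0.85, True: 0.15}
```

Each marginal sums to 1, as it must, since the atomic events are mutually exclusive and exhaustive.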
Conditional Probability
• Probability of cavity given toothache: P(Cavity = true | Toothache = true)
• For any two events a and b:
  P(a | b) = P(a ∧ b) / P(b) = P(a, b) / P(b)

[Venn diagram: circles P(a) and P(b) overlapping in the region P(a ∧ b)]
Conditional Probability: P(a | b) = P(a ∧ b) / P(b)

Joint distribution P(Cavity, Toothache):
  Cavity = false ∧ Toothache = false    0.8
  Cavity = false ∧ Toothache = true     0.1
  Cavity = true ∧ Toothache = false     0.05
  Cavity = true ∧ Toothache = true      0.05

Marginal distributions:
  P(Cavity):    Cavity = false → 0.9      Cavity = true → 0.1
  P(Toothache): Toothache = false → 0.85  Toothache = true → 0.15

• What is P(Cavity = true | Toothache = false)?
  0.05 / 0.85 ≈ 0.059
• What is P(Cavity = false | Toothache = true)?
  0.1 / 0.15 ≈ 0.667
Conditional Distributions: P(a | b) = P(a ∧ b) / P(b)

P(Cavity, Toothache):
  Cavity = false ∧ Toothache = false    0.8
  Cavity = false ∧ Toothache = true     0.1
  Cavity = true ∧ Toothache = false     0.05
  Cavity = true ∧ Toothache = true      0.05

A conditional distribution is a distribution over the values of one variable given fixed values of other variables:

P(Cavity | Toothache = true):      P(Cavity | Toothache = false):
  Cavity = false     0.667           Cavity = false     0.941
  Cavity = true      0.333           Cavity = true      0.059

P(Toothache | Cavity = true):      P(Toothache | Cavity = false):
  Toothache = false  0.5             Toothache = false  0.889
  Toothache = true   0.5             Toothache = true   0.111
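The conditional tables above can be reproduced from the joint distribution by applying P(a | b) = P(a, b) / P(b): sum the evidence probability, then renormalize. `conditional_cavity` is an illustrative helper name, not from the slide.

```python
# Sketch: conditional distribution P(Cavity | Toothache = t) computed from
# the joint table on the slide via P(a | b) = P(a, b) / P(b).

joint = {
    (False, False): 0.8,
    (False, True):  0.1,
    (True,  False): 0.05,
    (True,  True):  0.05,
}

def conditional_cavity(toothache):
    """Distribution over Cavity given a fixed Toothache value."""
    # P(b): marginal probability of the evidence (the fixed Toothache value).
    p_evidence = sum(p for (c, t), p in joint.items() if t == toothache)
    # P(a, b) / P(b) for each value of Cavity.
    return {c: joint[(c, toothache)] / p_evidence for c in (False, True)}

print(conditional_cavity(True))   # ≈ {False: 0.667, True: 0.333}
print(conditional_cavity(False))  # ≈ {False: 0.941, True: 0.059}
```

Dividing by the evidence probability is what makes each conditional distribution sum to 1.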
Bayes' Rule
• The product rule gives us two ways to factor a joint distribution:
  P(A, B) = P(A | B) P(B) = P(B | A) P(A)
• Therefore:
  P(A | B) = P(B | A) P(A) / P(B)
  where P(A | B) is the posterior probability and P(A) is the prior probability.
  [Rev. Thomas Bayes (1702-1761)]
• Why is this useful?
  • We can get a diagnostic probability P(cavity | toothache) from a causal probability P(toothache | cavity).
  • We can update our beliefs based on evidence.
  • It is an important tool for probabilistic inference.
Example: Getting Married in the Desert

Marie is getting married tomorrow, at an outdoor ceremony in the desert. In recent years, it has rained only 5 days each year (5/365 ≈ 0.014). Unfortunately, the weatherman has predicted rain for tomorrow. When it actually rains, the weatherman correctly forecasts rain 90% of the time. When it doesn't rain, he incorrectly forecasts rain 10% of the time. What is the probability that it will rain on Marie's wedding?

P(A | B) = P(B | A) P(A) / P(B)
Example: Getting Married in the Desert (Solution)

P(Rain | Predict) = P(Predict | Rain) P(Rain) / P(Predict)
                  = P(Predict | Rain) P(Rain) / [P(Predict | Rain) P(Rain) + P(Predict | ¬Rain) P(¬Rain)]
                  = (0.9 × 0.014) / (0.9 × 0.014 + 0.1 × 0.986)
                  ≈ 0.111

The weather forecast updates our belief from 0.014 to 0.111.
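The calculation can be checked numerically. Variable names are illustrative; using the exact prior 5/365 (rather than the rounded 0.014) reproduces the slide's 0.111.

```python
# Sketch: the wedding example via Bayes' rule, using the exact prior 5/365.

p_rain = 5 / 365            # prior P(Rain)
p_predict_given_rain = 0.9  # P(Predict | Rain): correct rain forecasts
p_predict_given_dry = 0.1   # P(Predict | ¬Rain): false alarms

# P(Predict) by the law of total probability.
p_predict = (p_predict_given_rain * p_rain
             + p_predict_given_dry * (1 - p_rain))

# Bayes' rule: P(Rain | Predict) = P(Predict | Rain) P(Rain) / P(Predict).
posterior = p_predict_given_rain * p_rain / p_predict

print(round(posterior, 3))  # → 0.111
```

The breast cancer example that follows is the same computation with different numbers: 0.8 × 0.01 / (0.8 × 0.01 + 0.096 × 0.99) ≈ 0.0776.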
Example: Breast Cancer Screening

1% of women at age forty who participate in routine screening have breast cancer. 80% of women with breast cancer will get a positive mammography. 9.6% of women without breast cancer will also get a positive mammography. A woman in this age group had a positive mammography in a routine screening. What is the probability that she actually has breast cancer?
Example: Breast Cancer Screening (Solution)

P(Cancer | Positive) = P(Positive | Cancer) P(Cancer) / P(Positive)
                     = P(Positive | Cancer) P(Cancer) / [P(Positive | Cancer) P(Cancer) + P(Positive | ¬Cancer) P(¬Cancer)]
                     = (0.8 × 0.01) / (0.8 × 0.01 + 0.096 × 0.99)
                     ≈ 0.0776

The positive test updates our belief from 0.01 to 0.0776.
