Lecture 27


Reasoning under Uncertainty

Instructor: Dr. Durgesh Singh


CSE Discipline, PDPM IIITDM, Jabalpur -482005
Reasoning under uncertainty

▪ Agents in the real world need to handle uncertainty, whether due to partial observability, nondeterminism, or adversaries.
▪ An agent may never know for sure what state it is in now or
where it will end up after a sequence of actions.
Nature of Uncertain Knowledge

▪ Let us try to write rules for dental diagnosis using propositional logic, so that we can see how the logical approach breaks down. Consider the following simple rule:
Toothache ⇒ Cavity
▪ The problem is that this rule is wrong.
▪ Not all patients with toothaches have cavities; some of them
have gum disease, swelling, or one of several other problems:
Toothache ⇒ Cavity ∨ GumProblem ∨ Swelling ∨ ……..
Nature of Uncertain Knowledge

▪ In order to make the rule true, we would have to add an almost unlimited list of possible problems. We could instead try turning the rule into a causal rule:
Cavity ⇒ Toothache
But this rule is also not right; not all cavities cause pain. A toothache and a cavity are not always connected, so the judgement may go wrong.
Nature of Uncertain Knowledge

▪ This is typical of the medical domain, as well as most other judgmental domains: law, business, design, automobile repair, gardening, dating, and so on.
▪ The agent’s knowledge can at best provide only a degree of
belief in the relevant sentences.
▪ Our main tool for dealing with degrees of belief is probability
theory.
▪ A logical agent believes each sentence to be true or false or has
no opinion, whereas a probabilistic agent may have a numerical
degree of belief between 0 (for sentences that are certainly
false) and 1 (certainly true).
Basic Probability Notation

▪ Random variables are typically divided into three kinds, depending on the type of the domain:
▪ Boolean random variables, such as Cavity, have the domain {true, false} (or {1, 0}).
▪ Discrete random variables take on values from a countable domain. For example, the domain of Weather might be {sunny, rainy, cloudy, snow}.
▪ Continuous random variables (bounded or unbounded) take on values from the real numbers, e.g., Temp = 21.4, or propositions such as Temp < 21.4.
Atomic events or sample points
▪ Atomic event: A complete specification of the state of the world
about which the agent is uncertain
▪ E.g., if the world consists of only two Boolean variables Cavity
and Toothache, then there are 4 distinct atomic events:
Cavity = false ∧ Toothache = false
Cavity = false ∧ Toothache = true
Cavity = true ∧ Toothache = false
Cavity = true ∧ Toothache = true
▪ Atomic events are mutually exclusive and exhaustive
▪ When two events are mutually exclusive, it means they cannot both occur at
the same time.
▪ When two events are exhaustive, it means that one of them must occur.
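The enumeration above can be sketched in code. This is an illustrative snippet (not from the slides) that generates the four atomic events for the two Boolean variables Cavity and Toothache:

```python
# Sketch: enumerate all atomic events for two Boolean random variables.
# Each atomic event fixes a value for every variable in the world, so with
# 2 Boolean variables there are 2**2 = 4 atomic events.
from itertools import product

atomic_events = list(product([False, True], repeat=2))  # (Cavity, Toothache)
for cavity, toothache in atomic_events:
    print(f"Cavity = {cavity}, Toothache = {toothache}")
```

Because the events are mutually exclusive and exhaustive, exactly one of these four holds in any state of the world.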
Axioms of Probability Theory

▪ All probabilities lie between 0 and 1:
– 0 ≤ P(A) ≤ 1
– P(true) = 1
– P(false) = 0
▪ The probability of a disjunction is:
P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
Prior probability

▪ The unconditional or prior probability associated with a proposition A is the degree of belief in it in the absence of any other information.
▪ It is written as P(A).
▪ For example, if the prior probability that I have a cavity is 0.1, then we would write
P(Cavity = true) = 0.1 or P(cavity) = 0.1
▪ P(A) can be used only when there is no other information.
▪ As soon as some new information is known, we must reason with the conditional probability of A given that new information.
Prior probability…
▪ Sometimes, we will want to talk about the probabilities of all the
possible values of a random variable.
▪ In that case, we will use an expression such as P(Weather), which
denotes a vector of values for the probabilities of each individual state
of the weather.
▪ Instead of writing these four equations
P(Weather = sunny) = 0.7
P(Weather = rain) = 0.2
P(Weather = cloudy) = 0.08
P(Weather = snow) = 0.02
we may simply write: P(Weather) = (0.7, 0.2, 0.08, 0.02) (note that the probabilities sum to 1).
▪ This statement defines a prior probability distribution for the random
variable Weather.
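The prior distribution above can be represented directly as a mapping from states to probabilities and checked for normalization. This is a minimal sketch using the slide's numbers:

```python
# The prior distribution P(Weather) from the slide, stored as a mapping
# from each state of the random variable to its probability.
weather_dist = {"sunny": 0.7, "rain": 0.2, "cloudy": 0.08, "snow": 0.02}

# A valid probability distribution must sum to 1.
total = sum(weather_dist.values())
print(total)
```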
Prior probability…
▪ Joint probability distribution for a set of random variables gives
the probability of every atomic event on those random variables
▪ P(Weather, Cavity) = a 4 × 2 matrix of values:

                  Weather = sunny   rainy   cloudy   snow
  Cavity = true             0.144    0.02    0.016   0.02
  Cavity = false            0.576    0.08    0.064   0.08

▪ A full joint distribution specifies the probability of every atomic event and is therefore a complete specification of one's uncertainty about the world in question.
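The joint distribution above can be stored as a table keyed by atomic events, and marginal probabilities recovered by summing out the other variable. This sketch uses the slide's 4 × 2 table:

```python
# Full joint distribution P(Weather, Cavity), values from the slide's table.
joint = {
    ("sunny", True): 0.144, ("rainy", True): 0.02,
    ("cloudy", True): 0.016, ("snow", True): 0.02,
    ("sunny", False): 0.576, ("rainy", False): 0.08,
    ("cloudy", False): 0.064, ("snow", False): 0.08,
}

# Marginalize (sum out) Weather to get P(Cavity = true).
p_cavity = sum(p for (w, c), p in joint.items() if c)

# Marginalize Cavity to get P(Weather = sunny).
p_sunny = sum(p for (w, c), p in joint.items() if w == "sunny")

print(p_cavity, p_sunny)
```

Since the joint covers every atomic event, its entries sum to 1, and any marginal or conditional probability can be computed from it.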
Conditional or posterior probability

▪ The notation used is P(a | b), where a and b are any propositions. This is read as "the probability of a, given that all we know is b."
For example,
P(cavity | toothache) = 0.8
indicates that if a patient is observed to have a toothache and no other information is yet available, then the probability of the patient's having a cavity will be 0.8.
Conditional or posterior probability

▪ Conditional probabilities can be defined in terms of unconditional probabilities:

P(a | b) = P(a ∧ b) / P(b)

which holds whenever P(b) > 0.

This equation can be rewritten as
P(a ∧ b) = P(a | b) P(b) (which is called the product rule)
Alternatively:
P(a ∧ b) = P(b | a) P(a)
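The definition of conditional probability can be exercised with concrete numbers. The joint and prior values below are assumptions chosen for illustration so that the result matches the slide's example P(cavity | toothache) = 0.8:

```python
# Sketch of the definition P(a | b) = P(a and b) / P(b).
# The joint and prior below are assumed values for illustration only,
# picked so the conditional comes out to 0.8 as in the slide's example.
p_cavity_and_toothache = 0.12  # assumed P(cavity and toothache)
p_toothache = 0.15             # assumed P(toothache); must be > 0

p_cavity_given_toothache = p_cavity_and_toothache / p_toothache
print(p_cavity_given_toothache)
```

Multiplying back, p_cavity_given_toothache * p_toothache recovers the joint probability, which is exactly the product rule.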
Chain Rule/Product Rule
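Applying the product rule repeatedly gives the chain rule, which factors a joint distribution into a product of conditionals, e.g. P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x1, x2). A minimal sketch with assumed (illustrative) conditional probabilities:

```python
# Chain rule sketch: P(x1, x2, x3) = P(x1) * P(x2 | x1) * P(x3 | x1, x2).
# All three factors below are assumed values for illustration only.
p_x1 = 0.5              # P(x1)
p_x2_given_x1 = 0.4     # P(x2 | x1)
p_x3_given_x1_x2 = 0.25 # P(x3 | x1, x2)

p_joint = p_x1 * p_x2_given_x1 * p_x3_given_x1_x2
print(p_joint)
```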
