Module No. 05 Uncertain Knowledge and Reasoning


Uncertainty
• Under uncertainty, many of the simplifications that are possible with
deductive inference are no longer valid.
• Uncertainty is unavoidable in everyday reasoning and in many real-world
domains.
Sources of uncertainty
• Imprecise knowledge: For example, the time that an event happened can
be known only approximately.
• Unreliable knowledge: For example, a measuring instrument can be biased
or defective.
Reasons for uncertainty
1. Laziness
2. Theoretical Ignorance
3. Practical Ignorance
Probability theory
1.1 Uncertain knowledge
∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity)
∀p Symptom(p, Toothache) ⇒
Disease(p, Cavity) ∨ Disease(p, GumDisease) ∨ …

∀p Disease(p, Cavity) ⇒ Symptom(p, Toothache)

- Laziness
- Theoretical ignorance
- Practical ignorance
• Probability theory → degree of belief or plausibility of a statement – a numerical
measure in [0, 1]
• Degree of truth (fuzzy logic) ≠ degree of belief (probability theory)
Definitions
• Unconditional or prior probability of A – the degree of belief in A in the absence of
any other information – P(A)
• A – random variable
• Probability distribution – P(A), P(A,B)
Example
P(Weather = Sunny) = 0.1
P(Weather = Rain) = 0.7
P(Weather = Snow) = 0.2
Weather – random variable
P(cavity=true) = 0.2   P(cavity=false) = 0.8

• P(Weather) = (0.1, 0.7, 0.2) – probability distribution
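As a quick sketch, a discrete probability distribution like P(Weather) can be stored in plain Python as a dict from values to probabilities (the variable name is illustrative):

```python
# P(Weather) as a dict: each value of the random variable maps to its probability.
weather = {"Sunny": 0.1, "Rain": 0.7, "Snow": 0.2}

# A valid distribution must assign probabilities that sum to 1.
assert abs(sum(weather.values()) - 1.0) < 1e-9

p_rain = weather["Rain"]  # P(Weather = Rain) = 0.7
```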


• Conditional probability – posterior – once the agent has obtained some evidence B
for A - P(A|B)
• P(Cavity | Toothache) = 0.8
Definitions - cont
• Axioms of probability
• The measure of the occurrence of an event (random variable)
A – a function P: S → R satisfying the axioms:
• 0 ≤ P(A) ≤ 1
• P(S) = 1 (or P(true) = 1 and P(false) = 0)
• P(A ∨ B) = P(A) + P(B) − P(A ∧ B)

• The three axioms above are also called Kolmogorov's axioms

P(A ∨ ¬A) = P(A) + P(¬A) − P(false) = P(true)

⇒ P(¬A) = 1 − P(A)
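The complement rule can be checked with a one-line helper (a minimal sketch; the function name is made up):

```python
def complement(p_a):
    """Given P(A), return P(~A) = 1 - P(A), following the derivation above."""
    assert 0.0 <= p_a <= 1.0  # first axiom: probabilities lie in [0, 1]
    return 1.0 - p_a

p_not_cavity = complement(0.2)  # if P(cavity=true) = 0.2, then P(cavity=false) = 0.8
```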
Product rule
Conditional probabilities can be defined in terms of
unconditional probabilities.
The conditional probability of A given that event B
occurs:
– P(A|B) = P(A ∧ B) / P(B)
This can also be written as:
– P(A ∧ B) = P(A|B) · P(B)
For probability distributions:
– P(A=a1 ∧ B=b1) = P(A=a1|B=b1) · P(B=b1)
– P(A=a1 ∧ B=b2) = P(A=a1|B=b2) · P(B=b2) …
– P(X,Y) = P(X|Y) · P(Y)
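The product rule can be illustrated over a small joint distribution (the table values and names below are made up for illustration):

```python
# Hypothetical joint distribution P(A, B) over two discrete variables.
p_joint = {("a1", "b1"): 0.12, ("a1", "b2"): 0.08,
           ("a2", "b1"): 0.18, ("a2", "b2"): 0.62}

def p_b(b):
    # Marginal P(B=b): sum the joint over every value of A.
    return sum(p for (_, bb), p in p_joint.items() if bb == b)

def p_a_given_b(a, b):
    # Conditional probability: P(A=a | B=b) = P(A=a, B=b) / P(B=b).
    return p_joint[(a, b)] / p_b(b)

# Product rule check: P(A=a1, B=b1) == P(A=a1 | B=b1) * P(B=b1).
assert abs(p_joint[("a1", "b1")] - p_a_given_b("a1", "b1") * p_b("b1")) < 1e-12
```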
Bayes' Theorem
hi – hypotheses (i = 1,…,k); e1,…,en – evidence
P(hi) – prior probability of hypothesis hi
P(hi | e1,…,en) – posterior probability of hi given the evidence
P(e1,…,en | hi) – likelihood of the evidence given hi

P(hi | e1,…,en) = P(e1,…,en | hi) · P(hi) / Σj=1..k P(e1,…,en | hj) · P(hj),  i = 1,…,k
Bayes’ Theorem - cont

• If e1,…,en are conditionally independent given hj, then

P(e1, e2,…, en | hj) = P(e1 | hj) · P(e2 | hj) · … · P(en | hj),  j = 1,…,k
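Combining Bayes' theorem with this conditional-independence assumption gives the classic naive-Bayes computation. The priors and likelihoods below are made-up numbers for illustration:

```python
import math

priors = {"h1": 0.6, "h2": 0.4}   # P(h_j), hypothetical values
likelihoods = {                    # P(e_i | h_j), hypothetical values
    "h1": [0.9, 0.7],              # P(e1|h1), P(e2|h1)
    "h2": [0.2, 0.5],              # P(e1|h2), P(e2|h2)
}

def posterior(priors, likelihoods):
    # Numerator of Bayes' theorem for each hypothesis:
    # P(e1,...,en | h_j) * P(h_j), with the likelihood factored
    # into a product thanks to conditional independence.
    nums = {h: math.prod(likelihoods[h]) * priors[h] for h in priors}
    z = sum(nums.values())  # denominator: sum over all hypotheses
    return {h: n / z for h, n in nums.items()}

post = posterior(priors, likelihoods)  # posteriors sum to 1
```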


Inferences
Joint probability distribution P(Cavity, Toothache):

             Toothache   ¬Toothache
  Cavity        0.04        0.06
  ¬Cavity       0.01        0.89

P(Cavity) = 0.04 + 0.06 = 0.1
P(Toothache) = 0.04 + 0.01 = 0.05
P(Cavity ∨ Toothache) = 0.04 + 0.06 + 0.01 = 0.11
P(A|B) = P(A ∧ B) / P(B)
P(Cavity | Toothache) = P(Cavity ∧ Toothache) / P(Toothache) = 0.04 / 0.05 = 0.8
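The same inference can be run mechanically over the 2×2 joint table (the numbers are the ones from the slide):

```python
# Joint distribution P(Cavity, Toothache): keys are (cavity, toothache).
p = {(True, True): 0.04, (True, False): 0.06,
     (False, True): 0.01, (False, False): 0.89}

p_cavity = p[(True, True)] + p[(True, False)]     # marginal, = 0.1
p_toothache = p[(True, True)] + p[(False, True)]  # marginal, = 0.05

# Inclusion-exclusion: P(Cavity or Toothache).
p_cavity_or_tooth = p_cavity + p_toothache - p[(True, True)]  # = 0.11

# Conditional: P(Cavity | Toothache) = P(Cavity and Toothache) / P(Toothache).
p_cavity_given_tooth = p[(True, True)] / p_toothache          # = 0.8
```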
Inferences

Full joint distribution P(Cavity, Toothache, Catch):

               Toothache           ¬Toothache
             Catch   ¬Catch      Catch   ¬Catch
  Cavity     0.108   0.012       0.072   0.008
  ¬Cavity    0.016   0.064       0.144   0.576

P(Cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
P(Toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
P(Cavity ∨ Toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28
P(Cavity | Toothache) = P(Cavity ∧ Toothache) / P(Toothache)
  = [P(Cavity ∧ Toothache ∧ Catch) + P(Cavity ∧ Toothache ∧ ¬Catch)] / P(Toothache)
  = (0.108 + 0.012) / 0.2 = 0.6
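These sums can be reproduced by enumerating the full joint table. Six of the eight entries appear in the slide's arithmetic; the two (cavity=false, toothache=false) entries below are assumed from the standard textbook version of this table so that everything sums to 1:

```python
# Full joint P(Cavity, Toothache, Catch): keys are (cavity, toothache, catch).
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,  # assumed cells
}

def marginal(cavity=None, toothache=None):
    # Sum out every variable that is not fixed (Catch is always summed out).
    return sum(p for (c, t, _), p in joint.items()
               if (cavity is None or c == cavity)
               and (toothache is None or t == toothache))

p_cavity = marginal(cavity=True)                           # 0.2
p_toothache = marginal(toothache=True)                     # 0.2
p_cavity_given_tooth = marginal(True, True) / p_toothache  # 0.12 / 0.2 = 0.6
```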
Bayesian networks
• Represent dependencies among random variables
• Give a compact specification of conditional probability
distributions
• Many random variables are conditionally
independent
• Simplifies computations
• Graphical representation
• DAG – causal relationships among random variables
• Allows inferences based on the network structure
Definition of Bayesian networks
A BN is a DAG in which each node is annotated with
quantitative probability information, namely:
• Nodes represent random variables (discrete or
continuous)
• Directed links X → Y: X has a direct influence on Y; X is
said to be a parent of Y
• Each node Xi has an associated conditional probability
table, P(Xi | Parents(Xi)), that quantifies the effect of
the parents on the node

Example: Weather, Cavity, Toothache, Catch

• Edges: Cavity → Toothache, Cavity → Catch; Weather has no links
(it is independent of the other variables)
Bayesian network - example
(Burglary network: B = Burglary, E = Earthquake, A = Alarm)

P(A|B) = P(A|B,E) · P(E|B) + P(A|B,¬E) · P(¬E|B)
       = P(A|B,E) · P(E) + P(A|B,¬E) · P(¬E)    (E is independent of B)
       = 0.95 · 0.002 + 0.94 · 0.998 = 0.94002
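The final computation can be checked numerically, conditioning on B and summing out E (the CPT numbers are the ones given in the example; variable names are illustrative):

```python
# P(A | B) by summing out E, which is independent of B.
p_a_given_be = {True: 0.95, False: 0.94}  # P(A | B, E=e) from the example
p_e = 0.002                                # prior P(E)

p_a_given_b = p_a_given_be[True] * p_e + p_a_given_be[False] * (1 - p_e)
# = 0.95 * 0.002 + 0.94 * 0.998 = 0.94002
```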
