UNIT-4 Uncertainty in Artificial Intelligence
(A1440 might reasonably be said to get me there on time, but I'd have to stay overnight in the airport …)
Making decisions under uncertainty
Suppose I believe the following:
P(A25 gets me there on time | …) = 0.04
P(A90 gets me there on time | …) = 0.70
P(A120 gets me there on time | …) = 0.95
P(A1440 gets me there on time | …) = 0.9999
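Which action to choose depends on the trade-off between missing the flight and time spent waiting at the airport. A minimal sketch of choosing by expected utility, using the probabilities above; the utility and waiting-cost numbers are hypothetical, purely for illustration:

# Sketch: pick the departure action with the highest expected utility.
# On-time probabilities are from above; the utilities are made up.
p_on_time = {"A25": 0.04, "A90": 0.70, "A120": 0.95, "A1440": 0.9999}
wait_cost = {"A25": 0, "A90": -10, "A120": -20, "A1440": -200}  # hypothetical
U_ON_TIME, U_MISS = 100, -500                                   # hypothetical

def expected_utility(action):
    p = p_on_time[action]
    return p * U_ON_TIME + (1 - p) * U_MISS + wait_cost[action]

best = max(p_on_time, key=expected_utility)
print(best, expected_utility(best))  # A120 wins under these made-up numbers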
∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity) ∨ Disease(p, GumDisease) ∨ Disease(p, Abscess)
∀p Disease(p, Cavity) ⇒ Symptom(p, Toothache)
But this rule is not right either; not all cavities cause pain.
Suppose we have a rule:
if toothache then problem is cavity
But not all patients have toothaches due to cavities, so we could set up rules like:
if toothache and not(gum disease) and not(filling) and ... then problem = cavity
This gets complicated; a better method is:
if toothache then problem is cavity with probability 0.8,
i.e., P(cavity | toothache) = 0.8
— the probability of a cavity is 0.8, given that a toothache is all that is known.
Probability basics
• A random variable takes values from a domain
◦ e.g., Weather is one of <sunny, rainy, cloudy, snow>
Axioms of probability
For any propositions A, B
i) All probabilities are between 0 and 1:
0 ≤ P(A) ≤ 1
ii) P(true) = 1, P(false) = 0
probability 1 for propositions believed to be absolutely true,
probability 0 for propositions believed to be absolutely false
iii) The probability of a disjunction is given by:
P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
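A quick numeric check of axiom (iii), on a hypothetical fair six-sided die with A = "even roll" and B = "roll greater than 3":

from fractions import Fraction

# Hypothetical check of P(A or B) = P(A) + P(B) - P(A and B)
# on a fair six-sided die: A = "even roll", B = "roll > 3".
omega = range(1, 7)
P = lambda event: Fraction(sum(1 for w in omega if event(w)), 6)

A = lambda w: w % 2 == 0          # {2, 4, 6}
B = lambda w: w > 3               # {4, 5, 6}

lhs = P(lambda w: A(w) or B(w))                  # {2, 4, 5, 6} -> 4/6
rhs = P(A) + P(B) - P(lambda w: A(w) and B(w))   # 3/6 + 3/6 - 2/6
print(lhs == rhs)                 # True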
Prior probability
• Prior or unconditional probabilities of propositions
For any proposition φ, sum the probabilities of the atomic events in which it is true:
P(φ) = Σω:ω⊨φ P(ω)
Q: What is the probability that Mr. Bheem has a toothache?
P(toothache) = 0.108 + 0.012 + 0.016 + 0.064
= 0.2, i.e., 20%
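A minimal sketch of this computation in Python. The four values above come from the standard three-variable Toothache/Catch/Cavity joint distribution; the remaining four entries below are assumed to be the usual textbook values, so that the table sums to 1:

# Sketch: compute a prior by summing atomic events in the full joint,
# assuming the standard Toothache/Catch/Cavity table.
joint = {
    # (toothache, catch, cavity): probability
    (True,  True,  True):  0.108,
    (True,  False, True):  0.012,
    (False, True,  True):  0.072,   # assumed textbook value
    (False, False, True):  0.008,   # assumed textbook value
    (True,  True,  False): 0.016,
    (True,  False, False): 0.064,
    (False, True,  False): 0.144,   # assumed textbook value
    (False, False, False): 0.576,   # assumed textbook value
}

def prob(event):
    """Sum the probabilities of the atomic events where `event` holds."""
    return sum(p for outcome, p in joint.items() if event(outcome))

# P(toothache): sum over all atomic events with toothache = True
print(prob(lambda o: o[0]))  # ~0.2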
Inference by enumeration
• Start with the joint probability distribution:
For example, the probability of no cavity given a toothache:
P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
= (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064)
= 0.08 / 0.2 = 0.4
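The same enumeration extends to conditional queries. Continuing the sketch above (reusing the joint table and the prob helper defined there):

# Sketch: conditional probabilities by enumeration over the full joint.
def cond_prob(event, given):
    """P(event | given) = P(event AND given) / P(given)."""
    return prob(lambda o: event(o) and given(o)) / prob(given)

toothache  = lambda o: o[0]
not_cavity = lambda o: not o[2]

# P(not cavity | toothache) = (0.016 + 0.064) / 0.2
print(cond_prob(not_cavity, toothache))  # ~0.4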
Conditional probability
Definition of conditional probability:
P(a | b) = P(a ∧ b) / P(b), if P(b) > 0
Independence
A and B are independent iff
P(A|B) = P(A) or P(B|A) = P(B) or P(A, B) = P(A) P(B)
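A small sanity check of the product-rule form of independence, on a hypothetical pair of fair coin flips:

from itertools import product

# Two fair coin flips are independent, so P(A, B) should equal
# P(A) * P(B) for every pair of outcomes; check one pair directly.
outcomes = list(product("HT", repeat=2))   # uniform sample space
p = {o: 0.25 for o in outcomes}

p_a  = sum(pr for o, pr in p.items() if o[0] == "H")   # P(first = H)
p_b  = sum(pr for o, pr in p.items() if o[1] == "H")   # P(second = H)
p_ab = p[("H", "H")]                                   # P(both = H)

print(p_ab == p_a * p_b)  # True: independent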
Bayes' Rule
Product rule: P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
Bayes' rule: P(a | b) = P(b | a) P(a) / P(b)
Useful for assessing diagnostic probability from causal probability
◦ P(Cause|Effect) = P(Effect|Cause) P(Cause) / P(Effect)
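A minimal sketch of diagnostic inference with Bayes' rule; the disease prior, symptom rate, and causal probability below are hypothetical numbers chosen only for illustration:

def bayes(p_effect_given_cause, p_cause, p_effect):
    """P(cause | effect) = P(effect | cause) * P(cause) / P(effect)."""
    return p_effect_given_cause * p_cause / p_effect

# Hypothetical numbers: a disease with prior 1/50000, a symptom the
# disease causes 70% of the time, and a 1% background symptom rate.
print(bayes(0.7, 1 / 50000, 0.01))  # 0.0014 -- a rare cause stays unlikely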
Bayesian networks
A simple, graphical notation for conditional independence
assertions and hence for compact specification of full joint
distributions
Syntax:
◦ a set of nodes, one per variable
◦ a directed, acyclic graph (link ≈ "directly influences")
◦ a conditional distribution for each node given its parents:
P(Xi | Parents(Xi))
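A minimal sketch of such a network in Python, assuming the standard burglary network (Burglary, Earthquake, Alarm, JohnCalls, MaryCalls) that the examples below refer to, with its usual textbook CPT values; the full joint is recovered by the chain rule P(X1, …, Xn) = Π P(Xi | Parents(Xi)):

# Sketch: a Bayesian network as a set of CPTs, assuming the standard
# burglary-network parameters (not given in these notes).
P_B = 0.001                       # P(Burglary)
P_E = 0.002                       # P(Earthquake)
P_A = {(True, True): 0.95,        # P(Alarm | Burglary, Earthquake)
       (True, False): 0.94,
       (False, True): 0.29,
       (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}   # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}   # P(MaryCalls | Alarm)

def net_joint(b, e, a, j, m):
    """Chain rule: P(b,e,a,j,m) = P(b) P(e) P(a|b,e) P(j|a) P(m|a)."""
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# P(john calls, mary calls, alarm, no burglary, no earthquake) ~= 0.000628
print(net_joint(False, False, True, True, True))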
Example
• Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No
P(A | J, M) = P(A | J)? P(A | J, M) = P(A)? No
P(B | A, J, M) = P(B | A)? Yes
P(B | A, J, M) = P(B)? No
P(E | B, A, J, M) = P(E | A)? No
P(E | B, A, J, M) = P(E | A, B)? Yes
Example contd.