Chapter 13
Outline
Uncertainty
Probability
Syntax and Semantics
Inference
Independence and Bayes' Rule
Uncertainty

Let action A_t = leave for airport t minutes before flight
Will A_t get me there on time?

Problems:
1) partial observability (road state, other drivers' plans, etc.)
2) noisy sensors (KCBS traffic reports)
3) uncertainty in action outcomes (flat tire, etc.)
4) immense complexity of modelling and predicting traffic

Hence a purely logical approach either
1) risks falsehood: "A_25 will get me there on time",
or 2) leads to conclusions that are too weak for decision making:
"A_25 will get me there on time if there's no accident on the bridge
and it doesn't rain and my tires remain intact, etc., etc."
(A_1440 might reasonably be said to get me there on time,
but I'd have to stay overnight in the airport . . .)
Probability

Probabilistic assertions summarize effects of
  laziness: failure to enumerate exceptions, qualifications, etc.
  ignorance: lack of relevant facts, initial conditions, etc.

Subjective or Bayesian probability:
Probabilities relate propositions to one's own state of knowledge
  e.g., P(A_25 | no reported accidents) = 0.06

These are not claims of a probabilistic tendency in the current situation
(but might be learned from past experience of similar situations)

Probabilities of propositions change with new evidence:
  e.g., P(A_25 | no reported accidents, 5 a.m.) = 0.15

(Analogous to logical entailment status KB ⊨ α, not truth.)
Making decisions under uncertainty

Suppose I believe the following:

P(A_25 gets me there on time | . . .) = 0.04
P(A_90 gets me there on time | . . .) = 0.70
P(A_120 gets me there on time | . . .) = 0.95
P(A_1440 gets me there on time | . . .) = 0.9999
Probability basics

Begin with a set Ω, the sample space
e.g., the 6 possible rolls of a die.
ω ∈ Ω is a sample point/possible world/atomic event

A probability space or probability model is a sample space
with an assignment P(ω) for every ω ∈ Ω s.t.
  0 ≤ P(ω) ≤ 1
  Σ_ω P(ω) = 1
e.g., P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6.

An event A is any subset of Ω
  P(A) = Σ_{ω ∈ A} P(ω)
E.g., P(die roll < 4) = P(1) + P(2) + P(3) = 1/6 + 1/6 + 1/6 = 1/2
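These definitions translate almost directly into code. A minimal Python sketch (the dict representation and the prob helper are mine, not part of the slides):

```python
from fractions import Fraction

# Sample space for a fair die: one probability P(omega) per sample point.
P = {omega: Fraction(1, 6) for omega in range(1, 7)}

assert all(0 <= p <= 1 for p in P.values())  # 0 <= P(omega) <= 1
assert sum(P.values()) == 1                  # probabilities sum to 1

def prob(event):
    """P(A) = sum of P(omega) over the sample points omega in A."""
    return sum(P[omega] for omega in event)

print(prob({omega for omega in P if omega < 4}))  # 1/2
```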
Random variables

A random variable is a function from sample points to some range, e.g., the
reals or Booleans
e.g., Odd(1) = true.

P induces a probability distribution for any r.v. X:
  P(X = x_i) = Σ_{ω: X(ω) = x_i} P(ω)
e.g., P(Odd = true) = P(1) + P(3) + P(5) = 1/6 + 1/6 + 1/6 = 1/2
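A sketch of the induced distribution, under the same fair-die model (induced_prob is an assumed helper name):

```python
from fractions import Fraction

P = {omega: Fraction(1, 6) for omega in range(1, 7)}  # fair die

def induced_prob(X, value):
    """P(X = value) = sum of P(omega) over {omega : X(omega) = value}."""
    return sum(p for omega, p in P.items() if X(omega) == value)

Odd = lambda omega: omega % 2 == 1  # a Boolean random variable
print(induced_prob(Odd, True))      # 1/2
```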
Propositions

Think of a proposition as the event (set of sample points)
where the proposition is true

Given Boolean random variables A and B:
  event a = set of sample points where A(ω) = true
  event ¬a = set of sample points where A(ω) = false
  event a ∧ b = points where A(ω) = true and B(ω) = true

Often in AI applications, the sample points are defined
by the values of a set of random variables, i.e., the
sample space is the Cartesian product of the ranges of the variables

With Boolean variables, sample point = propositional logic model
e.g., A = true, B = false, or a ∧ ¬b.

Proposition = disjunction of atomic events in which it is true
e.g., (a ∨ b) ≡ (¬a ∧ b) ∨ (a ∧ ¬b) ∨ (a ∧ b)
⇒ P(a ∨ b) = P(¬a ∧ b) + P(a ∧ ¬b) + P(a ∧ b)
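A sketch of the Cartesian-product view; the four atomic-event probabilities below are invented purely for illustration:

```python
# Sample points are (A, B) value pairs; probabilities are illustrative.
joint = {(True, True): 0.3, (True, False): 0.2,
         (False, True): 0.4, (False, False): 0.1}

# P(a or b) as a sum over the atomic events where the proposition holds.
p_a_or_b = sum(p for (a, b), p in joint.items() if a or b)
print(p_a_or_b)  # 0.9
```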
Prior probability

Prior or unconditional probabilities of propositions
e.g., P(Cavity = true) = 0.1 and P(Weather = sunny) = 0.72
correspond to belief prior to arrival of any (new) evidence

Probability distribution gives values for all possible assignments:
P(Weather) = ⟨0.72, 0.1, 0.08, 0.1⟩ (normalized, i.e., sums to 1)

Joint probability distribution for a set of r.v.s gives the
probability of every atomic event on those r.v.s (i.e., every sample point)

P(Weather, Cavity) = a 4 × 2 matrix of values:

  Weather =        sunny   rain   cloudy   snow
  Cavity = true    0.144   0.02   0.016    0.02
  Cavity = false   0.576   0.08   0.064    0.08

Every question about a domain can be answered by the joint
distribution because every event is a sum of sample points
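A possible encoding of this joint as a dict keyed by atomic events, answering a query by summing sample points:

```python
# P(Weather, Cavity): one entry per atomic event, from the table above.
joint = {("sunny", True): 0.144, ("rain", True): 0.02,
         ("cloudy", True): 0.016, ("snow", True): 0.02,
         ("sunny", False): 0.576, ("rain", False): 0.08,
         ("cloudy", False): 0.064, ("snow", False): 0.08}

# Any event is a sum of sample points, e.g. P(Cavity = true):
p_cavity = sum(p for (w, c), p in joint.items() if c)
print(round(p_cavity, 3))  # 0.2
```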
Probability for continuous variables

Express distribution as a parameterized function of value:
  P(X = x) = U[18, 26](x) = uniform density between 18 and 26
Here P is a density; it integrates to 1.

P(X = 20.5) = 0.125 really means
  lim_{dx→0} P(20.5 ≤ X ≤ 20.5 + dx) / dx = 0.125
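A numerical check of this limit reading of the density, assuming the U[18, 26] model above (uniform_cdf is an assumed helper):

```python
def uniform_cdf(x, lo=18.0, hi=26.0):
    """P(X <= x) for a uniform distribution on [lo, hi]."""
    return min(max((x - lo) / (hi - lo), 0.0), 1.0)

dx = 1e-6
# P(20.5 <= X <= 20.5 + dx) / dx approximates the density at 20.5.
p_interval = uniform_cdf(20.5 + dx) - uniform_cdf(20.5)
print(p_interval / dx)  # ~0.125, i.e. 1 / (26 - 18)
```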
Gaussian density

  P(x) = (1 / (σ√(2π))) e^(−(x−μ)² / (2σ²))
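A direct transcription of the formula (the function name is mine):

```python
import math

def gaussian_density(x, mu=0.0, sigma=1.0):
    """P(x) = (1 / (sigma * sqrt(2*pi))) * exp(-(x - mu)**2 / (2 * sigma**2))"""
    return (math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)))

print(round(gaussian_density(0.0), 4))  # 0.3989, the standard normal peak
```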
Conditional probability

Conditional or posterior probabilities
e.g., P(cavity|toothache) = 0.8
i.e., given that toothache is all I know
NOT "if toothache then 80% chance of cavity"

(Notation for conditional distributions:
P(Cavity|Toothache) = 2-element vector of 2-element vectors)

If we know more, e.g., cavity is also given, then we have
P(cavity|toothache, cavity) = 1

Note: the less specific belief remains valid after more evidence arrives,
but is not always useful

New evidence may be irrelevant, allowing simplification, e.g.,
P(cavity|toothache, 49ersWin) = P(cavity|toothache) = 0.8
This kind of inference, sanctioned by domain knowledge, is crucial
Conditional probability

Definition of conditional probability:

  P(a|b) = P(a ∧ b) / P(b)   if P(b) ≠ 0
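A sketch of the definition on the die example from earlier (cond_prob is an assumed name; Fractions keep the arithmetic exact):

```python
from fractions import Fraction

P = {omega: Fraction(1, 6) for omega in range(1, 7)}  # fair die

def prob(event):
    return sum(P[omega] for omega in event)

def cond_prob(event_a, event_b):
    """P(a|b) = P(a and b) / P(b), defined only when P(b) != 0."""
    p_b = prob(event_b)
    if p_b == 0:
        raise ValueError("P(b) = 0: conditional probability undefined")
    return prob(event_a & event_b) / p_b

# P(die roll < 4 | roll is odd) = P({1, 3}) / P({1, 3, 5}) = 2/3
print(cond_prob({1, 2, 3}, {1, 3, 5}))
```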
Inference by enumeration

Start with the joint distribution:

                   toothache            ¬toothache
                 catch    ¬catch      catch    ¬catch
  cavity         .108      .012       .072      .008
  ¬cavity        .016      .064       .144      .576

For any proposition φ, sum the probabilities of the atomic events
where it is true: P(φ) = Σ_{ω: ω ⊨ φ} P(ω)

E.g., P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
E.g., P(cavity ∨ toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28
E.g., P(¬cavity|toothache) = P(¬cavity ∧ toothache) / P(toothache)
                           = (0.016 + 0.064) / 0.2 = 0.4
20
Normalization
L
toothache
catch
toothache
catch catch
catch
.144 .576
.016 .064
cavity
.072 .008
.108 .012
cavity
21
22
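A sketch of enumeration plus normalization over this joint (the dict encoding and helper names are mine):

```python
# Atomic events of P(Toothache, Catch, Cavity), from the table above.
joint = {(True,  True,  True):  0.108, (True,  False, True):  0.012,
         (False, True,  True):  0.072, (False, False, True):  0.008,
         (True,  True,  False): 0.016, (True,  False, False): 0.064,
         (False, True,  False): 0.144, (False, False, False): 0.576}

def prob_of(pred):
    """P(phi): sum over the atomic events (t, c, cav) where pred holds."""
    return sum(p for event, p in joint.items() if pred(*event))

print(prob_of(lambda t, c, cav: t))         # P(toothache) = 0.2
print(prob_of(lambda t, c, cav: cav or t))  # P(cavity or toothache) = 0.28

# P(Cavity | toothache) by normalization over the hidden variable Catch:
unnorm = [prob_of(lambda t, c, cav: t and cav == v) for v in (True, False)]
alpha = 1 / sum(unnorm)
print([alpha * u for u in unnorm])          # ≈ [0.6, 0.4]
```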
Independence
A and B are independent iff
P(A|B) = P(A) or P(B|A) = P(B) or P(A, B) = P(A)P(B)
[Figure: {Cavity, Toothache, Catch, Weather} decomposes into
 {Cavity, Toothache, Catch} and {Weather}]

P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch, Cavity) P(Weather)

32 entries reduced to 12
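The Weather/Cavity joint from the prior-probability slide happens to factor exactly, so the P(A, B) = P(A)P(B) condition can be checked in code (a sketch; names are mine):

```python
import math

# P(Weather, Cavity) from the prior-probability slide.
joint = {("sunny", True): 0.144, ("rain", True): 0.02,
         ("cloudy", True): 0.016, ("snow", True): 0.02,
         ("sunny", False): 0.576, ("rain", False): 0.08,
         ("cloudy", False): 0.064, ("snow", False): 0.08}

# Marginals obtained by summing out the other variable.
p_weather = {w: sum(p for (w2, c), p in joint.items() if w2 == w)
             for w in ("sunny", "rain", "cloudy", "snow")}
p_cavity = {c: sum(p for (w, c2), p in joint.items() if c2 == c)
            for c in (True, False)}

# Independence holds iff every joint entry factors into the marginals.
independent = all(math.isclose(joint[w, c], p_weather[w] * p_cavity[c])
                  for (w, c) in joint)
print(independent)  # True: Weather and Cavity are independent in this table
```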
Conditional independence

P(Toothache, Cavity, Catch) has 2³ − 1 = 7 independent entries

If I have a cavity, the probability that the probe catches in it doesn't depend
on whether I have a toothache:
  (1) P(catch|toothache, cavity) = P(catch|cavity)

The same independence holds if I haven't got a cavity:
  (2) P(catch|toothache, ¬cavity) = P(catch|¬cavity)

Catch is conditionally independent of Toothache given Cavity:
  P(Catch|Toothache, Cavity) = P(Catch|Cavity)

Equivalent statements:
  P(Toothache|Catch, Cavity) = P(Toothache|Cavity)
  P(Toothache, Catch|Cavity) = P(Toothache|Cavity) P(Catch|Cavity)
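The dentist joint from the enumeration slides satisfies exactly this conditional independence; a sketch verifying statements (1) and (2) numerically:

```python
import math

# P(Toothache, Catch, Cavity) atomic events, as in the enumeration slides.
joint = {(True,  True,  True):  0.108, (True,  False, True):  0.012,
         (False, True,  True):  0.072, (False, False, True):  0.008,
         (True,  True,  False): 0.016, (True,  False, False): 0.064,
         (False, True,  False): 0.144, (False, False, False): 0.576}

def p(pred):
    return sum(q for event, q in joint.items() if pred(*event))

def cond(pred_a, pred_b):
    """P(a|b) = P(a and b) / P(b)."""
    return p(lambda *e: pred_a(*e) and pred_b(*e)) / p(pred_b)

for cav in (True, False):
    lhs = cond(lambda t, c, v: c, lambda t, c, v: t and v == cav)  # P(catch|toothache, Cavity=cav)
    rhs = cond(lambda t, c, v: c, lambda t, c, v: v == cav)        # P(catch|Cavity=cav)
    print(cav, math.isclose(lhs, rhs))  # True for both values of Cavity
```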
Bayes' Rule

Product rule P(a ∧ b) = P(a|b) P(b) = P(b|a) P(a)

⇒ Bayes' rule  P(a|b) = P(b|a) P(a) / P(b)

or in distribution form

  P(Y|X) = P(X|Y) P(Y) / P(X) = α P(X|Y) P(Y)

Useful for assessing diagnostic probability from causal probability:

  P(Cause|Effect) = P(Effect|Cause) P(Cause) / P(Effect)

E.g., let M be meningitis, S be stiff neck:

  P(m|s) = P(s|m) P(m) / P(s) = 0.8 × 0.0001 / 0.1 = 0.0008

Note: posterior probability of meningitis still very small!
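The meningitis calculation as code (a trivial sketch; the function name is mine):

```python
def bayes(p_effect_given_cause, p_cause, p_effect):
    """P(Cause|Effect) = P(Effect|Cause) P(Cause) / P(Effect)."""
    return p_effect_given_cause * p_cause / p_effect

# Meningitis example: P(m|s) = 0.8 * 0.0001 / 0.1
print(bayes(0.8, 0.0001, 0.1))  # 0.0008
```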
Bayes' Rule and conditional independence

P(Cavity|toothache ∧ catch)
  = α P(toothache ∧ catch|Cavity) P(Cavity)
  = α P(toothache|Cavity) P(catch|Cavity) P(Cavity)

This is an example of a naive Bayes model:

  P(Cause, Effect_1, . . . , Effect_n) = P(Cause) Π_i P(Effect_i|Cause)

[Figure: naive Bayes networks: Cavity with children Toothache and Catch;
 Cause with children Effect_1, . . . , Effect_n]
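A sketch of the naive Bayes computation for P(Cavity|toothache ∧ catch), with the conditionals read off the dentist joint (P(toothache|cavity) = 0.6, P(catch|cavity) = 0.9, etc.; names are mine):

```python
# Parameters read off the dentist joint distribution.
p_cavity = 0.2
p_tooth_given = {True: 0.6, False: 0.1}   # P(toothache | Cavity)
p_catch_given = {True: 0.9, False: 0.2}   # P(catch | Cavity)

# Unnormalized: P(Cavity) * P(toothache|Cavity) * P(catch|Cavity)
unnorm = {cav: (p_cavity if cav else 1 - p_cavity)
          * p_tooth_given[cav] * p_catch_given[cav]
          for cav in (True, False)}
alpha = 1 / sum(unnorm.values())
posterior = {cav: alpha * u for cav, u in unnorm.items()}
print(posterior)  # True ≈ 0.871, False ≈ 0.129
```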
Wumpus World
[Figure: 4 × 4 Wumpus grid; squares (1,1), (1,2), (2,1) explored and marked OK,
 with breezes (B) observed in (1,2) and (2,1)]
Using conditional independence

[Figure: grid partition for the query P(P_1,3|known, b):
 KNOWN = (1,1), (1,2), (2,1); FRINGE = (2,2), (3,1);
 QUERY = (1,3); OTHER = all remaining squares]
P(P_1,3|known, b)
  = α Σ_unknown P(P_1,3, unknown, known, b)
  = α Σ_fringe Σ_other P(b|known, P_1,3, fringe, other) P(P_1,3, known, fringe, other)
  = α Σ_fringe P(b|known, P_1,3, fringe) Σ_other P(P_1,3) P(known) P(fringe) P(other)
  = α P(known) P(P_1,3) Σ_fringe P(b|known, P_1,3, fringe) P(fringe) Σ_other P(other)
  = α′ P(P_1,3) Σ_fringe P(b|known, P_1,3, fringe) P(fringe)

using conditional independence of b from other given known, P_1,3, and
fringe, absolute independence of the pit variables, and Σ_other P(other) = 1.
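A sketch of this final expression, enumerating the four fringe assignments directly (the 0.2 pit prior matches the standard pit model; the adjacency and consistency helpers are mine):

```python
from itertools import product

P_PIT = 0.2  # prior probability of a pit in any square

# Breeze observed at (1,2) and (2,1); fringe squares per the figure above.
breezy = [(1, 2), (2, 1)]
fringe_squares = [(2, 2), (3, 1)]

def adjacent(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1

def consistent(pits):
    """P(b | known, P_1,3, fringe): 1 if every breezy square touches a pit."""
    return all(any(adjacent(sq, p) for p in pits) for sq in breezy)

unnorm = {}
for p13 in (True, False):
    total = 0.0
    for f in product((True, False), repeat=2):  # fringe assignments
        pits = [(1, 3)] * p13 + [s for s, v in zip(fringe_squares, f) if v]
        p_fringe = 1.0
        for v in f:
            p_fringe *= P_PIT if v else 1 - P_PIT
        total += consistent(pits) * p_fringe
    unnorm[p13] = (P_PIT if p13 else 1 - P_PIT) * total

alpha = 1 / sum(unnorm.values())
print({k: round(alpha * v, 2) for k, v in unnorm.items()})
# {True: 0.31, False: 0.69}
```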
[Figure: the three fringe models consistent with b when P_1,3 = true,
 with weights 0.04, 0.16, 0.16, and the two consistent models when
 P_1,3 = false, with weights 0.04, 0.16]

P(P_1,3|known, b) ≈ ⟨0.31, 0.69⟩
P(P_2,2|known, b) ≈ ⟨0.86, 0.14⟩
Summary
Probability is a rigorous formalism for uncertain knowledge
Joint probability distribution specifies probability of every atomic event
Queries can be answered by summing over atomic events
For nontrivial domains, we must find a way to reduce the joint size
Independence and conditional independence provide the tools