Acting Under Uncertainty: Basic Concepts
Why bother?
Two examples
Bayes’ rule
Bayes’ nets
e.g. a coin. Let’s use Throw as the random variable denoting the
outcome when we toss the coin.
The set of possible outcomes for a random variable is called its domain.
e.g. if our world consists of only two Boolean random variables, a and b, then the
world has four possible atomic events:

P(a ∧ b), P(a ∧ ¬b), P(¬a ∧ b), P(¬a ∧ ¬b)
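As a small sketch (the variable names are illustrative, not from the lecture), the atomic events for two Boolean random variables can be enumerated programmatically:

```python
from itertools import product

# Enumerate the atomic events (complete truth assignments) for two
# Boolean random variables a and b.
atomic_events = list(product([True, False], repeat=2))
for a, b in atomic_events:
    print(f"a={a}, b={b}")

# There are four atomic events in total, one per assignment.
print(len(atomic_events))  # 4
```

With n Boolean variables the same enumeration yields 2^n atomic events, which is why working with the full joint distribution quickly becomes impractical.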
So conditional probabilities reflect the fact that some events make other
events more (or less) likely.

If one event doesn’t affect the likelihood of another event, they are said
to be independent, and therefore

P(a | b) = P(a)
E.g. if you roll a 6 on a die, it doesn’t make it more or less likely that you
will roll a 6 on the next throw. The rolls are independent.
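The die example can be checked by simulation. The sketch below (assumed setup, not from the lecture) estimates P(six) and P(six | previous roll was a six); independence predicts both are close to 1/6:

```python
import random

random.seed(0)

# Simulate pairs of die rolls and check that rolling a 6 on the first
# throw does not change the chance of a 6 on the next throw.
N = 100_000
first = [random.randint(1, 6) for _ in range(N)]
second = [random.randint(1, 6) for _ in range(N)]

# Unconditional estimate of P(second roll is a 6)
p_six = sum(s == 6 for s in second) / N

# Estimate of P(second roll is a 6 | first roll was a 6)
after_six = [s for f, s in zip(first, second) if f == 6]
p_six_given_six = sum(s == 6 for s in after_six) / len(after_six)

print(p_six, p_six_given_six)  # both close to 1/6 ≈ 0.1667
```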
AI Principles, Lecture on Reasoning Under Uncertainty
Combining Probabilities: the product rule
How can we work out the likelihood of two events occurring
together, given their base and conditional probabilities?
P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)

Equating the two right-hand sides,

P(a | b) P(b) = P(b | a) P(a)

and dividing both sides by P(b):

P(a | b) = P(b | a) P(a) / P(b)
This is known as Bayes’ rule.
If we model how likely observable effects are given hidden causes (how
likely toothache is given a cavity)
Then Bayes’ rule allows us to use that model to infer the likelihood of the
hidden cause (and thus answer our question)
A doctor knows the probability of a stiff neck (s) given meningitis (m):

P(s | m) = 0.5

She also knows that the probability in the general population of someone having a
stiff neck at any time is 1/20, i.e. P(s) = 0.05, and that the prior probability of
meningitis is

P(m) = 0.00002

Using Bayes’ rule she can calculate the probability the patient has meningitis:

P(m | s) = P(s | m) P(m) / P(s) = (0.5 × 0.00002) / 0.05 = 0.0002
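A minimal sketch of this calculation in Python, using the figures given above (P(s | m) = 0.5, P(m) = 0.00002, P(s) = 1/20):

```python
# Bayes' rule: P(m | s) = P(s | m) * P(m) / P(s)
p_s_given_m = 0.5      # P(stiff neck | meningitis)
p_m = 0.00002          # prior probability of meningitis
p_s = 1 / 20           # probability of a stiff neck in general

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)  # ≈ 0.0002
```

Even though meningitis causes a stiff neck half the time, the posterior probability is tiny, because the prior P(m) is so small.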
But sometimes it’s harder to find out P(effect|cause) for all causes
independently than it is simply to find out P(effect)
Note that Bayes’ rule here relies on the fact the effect must have arisen
because of one of the hypothesised causes. You can’t reason directly
about causes you haven’t imagined.
Bayes’ rule: combining evidence
Suppose we have several pieces of evidence we want to
combine:
• John rings and Mary rings
• I have toothache and the dental probe catches on my tooth
How do we do this?
P(cavity | toothache ∧ catch) ∝ P(toothache ∧ catch | cavity) P(cavity)
Toothache and catch are not independent, but they are independent
given the presence or absence of a cavity.
In other words we can use the knowledge that cavities cause toothache
and they cause the catch, but the catch and the toothache do not cause
each other (they have a single common cause).
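This factorisation can be sketched in code. The numbers below are made-up illustrative figures (not values from the lecture); the point is how conditional independence lets the joint likelihood split into a product:

```python
# Combining two pieces of evidence that are conditionally independent
# given the cause. All probabilities here are illustrative assumptions.

p_cavity = 0.2                                # prior P(cavity)
p_toothache_given = {True: 0.6, False: 0.1}   # P(toothache | cavity=?)
p_catch_given = {True: 0.9, False: 0.2}       # P(catch | cavity=?)

# Conditional independence given cavity lets the likelihood factorise:
# P(toothache ∧ catch | cavity) = P(toothache | cavity) * P(catch | cavity)
score_true = p_toothache_given[True] * p_catch_given[True] * p_cavity
score_false = p_toothache_given[False] * p_catch_given[False] * (1 - p_cavity)

# Normalise to obtain P(cavity | toothache ∧ catch)
posterior = score_true / (score_true + score_false)
print(round(posterior, 3))  # ≈ 0.871
```

Normalising over the two cases avoids ever needing P(toothache ∧ catch) directly, which is the practical payoff of the proportional form of Bayes’ rule.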
Bayes’ nets
This can be captured in a picture, where the network structure encodes the
conditional independence relationships
Or in a new equation:

P(toothache ∧ catch | cavity) = P(toothache | cavity) P(catch | cavity)
Some kinds of inference don’t seem to be obviously explainable using
probabilistic reasoning alone. It is not the case that statistical methods are the only way.