Unit-4 Uncertainty
01 Uncertainty
04 Bayes' Rule
05 Joint Distribution
07 Bayesian Network
Agents almost never have access to the whole truth
about their environment. Agents must, therefore, act
under uncertainty.
Acting Under Uncertainty
Handling Uncertain Knowledge
Agents cannot rely on pure logic alone (for example, in a medical diagnosis domain) for three main reasons:
• Laziness: it is too much work to list the complete set of antecedents or consequents needed to ensure an exceptionless rule, and too hard to use such rules.
• Theoretical ignorance: medical science has no complete theory for the domain.
• Practical ignorance: even if we know all the rules, we might be uncertain about a particular patient because not all the necessary tests have been or can be run.
Important Derivations from the Axioms of Probability
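A brief sketch of the standard axioms of probability, and one derivation commonly drawn from them (this follows the usual textbook presentation):
• 0 ≤ P(a) ≤ 1 for every proposition a
• P(true) = 1 and P(false) = 0
• P(a ∨ b) = P(a) + P(b) − P(a ∧ b)
Derivation: substituting b = ¬a into the third axiom gives P(a ∨ ¬a) = P(a) + P(¬a) − P(a ∧ ¬a); since P(a ∨ ¬a) = P(true) = 1 and P(a ∧ ¬a) = P(false) = 0, it follows that P(¬a) = 1 − P(a).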
Random Variables
• Random variables: A random variable is a real-valued function whose domain is the sample space of a random experiment.
• Random variables are typically divided into three kinds.
• Boolean random variables, such as Cavity, have the domain {true, false}.
• We will often abbreviate a proposition such as Cavity = true
simply by the lowercase name cavity. Similarly, Cavity = false
would be abbreviated by ¬cavity.
• Discrete random variables: A discrete random variable takes a finite number of distinct values.
• Continuous random variables take values from a continuous domain, such as the real numbers.
• Suppose you are going to the dentist for a regular checkup; then the prior probability P(cavity) = 0.2 might be of interest.
• If you go to the dentist because you have a toothache, it is P(cavity | toothache) = 0.8 that matters.
• When making decisions, an agent needs to condition on all the
evidence it has observed.
• It is also important to understand the difference between
conditioning and logical implication.
• The assertion that P(cavity | toothache) = 0.8 does not mean "Whenever toothache is true, conclude that cavity is true with probability 0.8"; rather, it means "Whenever toothache is true and we have no further information, conclude that cavity is true with probability 0.8."
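These conditional probabilities, and the calculations that follow, rest on the standard definition of conditional probability and the product rule obtained by rearranging it:
P(a | b) = P(a ∧ b) / P(b), whenever P(b) > 0
P(a ∧ b) = P(a | b) P(b)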
Joint Probability Distribution
• The joint probability distribution assigns a probability to every combination of values of a set of random variables; every probability statement about those variables can be answered from it.
Bayes' Rule
The more general case of Bayes’ rule for multivalued variables can be
written in the P notation as follows:
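The standard form, for each value of the variables X and Y, is:
P(Y | X) = P(X | Y) P(Y) / P(X)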
Inference From Joint Probability
Marginalization
• Start with the joint probability distribution:
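The full joint distribution table for the Toothache, Cavity, Catch world. The four entries not used in the calculation below are the standard textbook values, included here so the table sums to 1:

              toothache              ¬toothache
           catch    ¬catch        catch    ¬catch
 cavity    0.108    0.012         0.072    0.008
 ¬cavity   0.016    0.064         0.144    0.576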
P(cavity | toothache) = P(cavity ∧ toothache) / P(toothache)
= (0.108 + 0.012) / (0.108 + 0.012 + 0.016 + 0.064) = 0.12 / 0.2 = 0.6

P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
= (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064) = 0.08 / 0.2 = 0.4
Inference From Joint Probability
P(Cavity | toothache) = α P(Cavity, toothache)
= α [P(Cavity, toothache, catch) + P(Cavity, toothache, ¬catch)]
= α [<0.108, 0.016> + <0.012, 0.064>]
= α <0.12, 0.08> = <0.6, 0.4>
where α = 1 / P(toothache) is the normalization constant that makes the entries sum to 1.
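The same computation can be scripted. A minimal Python sketch (the dictionary layout and function name are just one possible encoding) that enumerates the joint table above and normalizes:

# Full joint distribution over (Cavity, Toothache, Catch).
# Keys are (cavity, toothache, catch) truth values; values are probabilities.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def prob_cavity_given_toothache():
    # Sum joint entries consistent with the evidence toothache = True,
    # split by the value of Cavity, then normalize.
    p_cavity = sum(p for (cav, tooth, _), p in joint.items() if tooth and cav)
    p_not_cavity = sum(p for (cav, tooth, _), p in joint.items() if tooth and not cav)
    alpha = 1.0 / (p_cavity + p_not_cavity)          # normalization constant
    return alpha * p_cavity, alpha * p_not_cavity    # <P(cavity|toothache), P(¬cavity|toothache)>

print(prob_cavity_given_toothache())  # (0.6, 0.4) up to floating-point rounding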
Independence
• A and B are independent iff
• P(A|B) = P(A) or P(B|A) = P(B) or P(A, B) = P(A) P(B)
Let us take an example to understand this. For variables X, Y, Z, A, B, if they are all independent:
P(X, Y, Z, A, B) = P(X) P(Y) P(Z) P(A) P(B)    (A)
If they are dependent, then we have to look at the dependency structure. For example, suppose X, Y, and Z each depend on A but not on B. Then the formula becomes:
P(X, Y, Z, A, B) = P(X|A) P(Y|A) P(Z|A) P(A) P(B)    (B)
If X, Y, and Z depend on both A and B (with A and B still independent of each other), the formula becomes:
P(X, Y, Z, A, B) = P(X|A, B) P(Y|A, B) P(Z|A, B) P(A) P(B)    (C)
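As a small illustration of formula (B), the following Python sketch multiplies out the factors for one assignment; the probability values are hypothetical, chosen only for illustration.

# Hypothetical numbers: P(A), P(B), and the conditionals P(X|A), P(Y|A), P(Z|A).
p_a, p_b = 0.3, 0.6
p_x_given_a, p_y_given_a, p_z_given_a = 0.9, 0.2, 0.5

# Formula (B): P(X, Y, Z, A, B) = P(X|A) * P(Y|A) * P(Z|A) * P(A) * P(B)
p_joint = p_x_given_a * p_y_given_a * p_z_given_a * p_a * p_b
print(p_joint)  # 0.0162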
Belief or Bayesian Network: Semantics
• A Bayesian network defines the full joint distribution as the product of the local conditional distributions, one per variable:

P(x1, ..., xn) = ∏ P(xi | parents(Xi))  (product over i = 1, ..., n)    (1)

• The joint distribution can also be rewritten by applying the product rule repeatedly:

P(x1, ..., xn) = P(xn | xn−1, ..., x1) P(xn−1 | xn−2, ..., x1) ... P(x2 | x1) P(x1) = ∏ P(xi | xi−1, ..., x1)

This identity is called the chain rule. It holds for any set of random variables. Comparing it with Equation (1), we see that the specification of the joint distribution is equivalent to the general assertion that, for every variable Xi in the network,

P(Xi | Xi−1, ..., X1) = P(Xi | Parents(Xi))    (2)

provided that Parents(Xi) ⊆ {Xi−1, ..., X1}.
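A short Python sketch of Equation (1): the probability of a complete assignment is the product, over all variables, of each variable's conditional probability given its parents. The tiny network (A is a parent of B and of C) and its CPT numbers are hypothetical, chosen only to make the function runnable.

# Hypothetical network: A has no parents; B and C each have parent A.
parents = {"A": [], "B": ["A"], "C": ["A"]}

# CPTs: map a tuple of parent values to P(variable = True | parents).
cpt = {
    "A": {(): 0.2},
    "B": {(True,): 0.7, (False,): 0.1},
    "C": {(True,): 0.4, (False,): 0.9},
}

def joint_probability(assignment):
    # P(x1, ..., xn) = product over i of P(xi | parents(Xi))
    p = 1.0
    for var, value in assignment.items():
        parent_values = tuple(assignment[par] for par in parents[var])
        p_true = cpt[var][parent_values]
        p *= p_true if value else 1.0 - p_true
    return p

print(joint_probability({"A": True, "B": True, "C": False}))  # 0.2 * 0.7 * 0.6 = 0.084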
A Method for Constructing Bayesian Networks