3170716
Unit-5:
Probabilistic Reasoning
1. Symbolic Reasoning: Symbolic logic deals with how symbols relate to each other. It assigns
symbols to verbal reasoning so that the validity of statements can be checked through a
mathematical process.
Propositions:
A : All spiders have eight legs.
B : Black widows are a type of spider.
C : Black widows have eight legs.
The ∧ symbol means “and,” and the ⇒ symbol means “implies.”
Conclusion: A ∧ B ⇒ C
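To make the idea of manipulating elementary symbols concrete, here is a minimal Python sketch of the spider example as rule-based inference; the predicate names Spider and EightLegs, the constant black_widow, and the forward_chain helper are illustrative choices, not notation from these notes.

```python
# Minimal sketch of the spider syllogism as symbolic manipulation.
# A is the rule "for all x, Spider(x) => EightLegs(x)", B is the fact
# Spider(black_widow); forward chaining derives C: EightLegs(black_widow).

rules = [("Spider", "EightLegs")]        # A
facts = {("Spider", "black_widow")}      # B

def forward_chain(rules, facts):
    """Repeatedly apply each rule to every matching fact (universal
    instantiation + modus ponens) until no new facts are derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            for predicate, arg in list(derived):
                if predicate == premise and (conclusion, arg) not in derived:
                    derived.add((conclusion, arg))
                    changed = True
    return derived

print(("EightLegs", "black_widow") in forward_chain(rules, facts))   # True, i.e. C holds
```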
Symbolic Reasoning
The reasoning is said to be symbolic when it can be performed by means of primitive
operations manipulating elementary symbols.
Usually, symbolic reasoning refers to mathematical logic, more precisely first-order (predicate)
logic and sometimes higher-order logics.
Statistical Reasoning
In the logic-based approaches described so far, we have assumed that everything is either
believed false or believed true.
However, it is often useful to represent the fact that we believe something is probably true, or
true with probability 0.65.
This is useful for dealing with problems where there is randomness and unpredictability (such
as in games of chance), and also for problems where we could work out exactly what is true if
we had sufficient information, but that information is not available.
To do all this in a principled way requires techniques for probabilistic reasoning.
Probability quantifies the uncertainty of the outcomes of a random variable or event.
Real-world applications are probabilistic in nature, and to represent the relationships between
multiple events, we need a Bayesian network.
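As a tiny illustration of what a Bayesian network encodes, the sketch below builds a two-node network Rain → WetGrass; the structure and every probability value are assumptions made for the example, not content from these notes.

```python
# Two-node Bayesian network: Rain -> WetGrass.  All numbers are assumed
# purely for illustration.  Each node stores a distribution conditioned on
# its parents, and the joint factorises as P(Rain) * P(WetGrass | Rain).

p_rain = {True: 0.2, False: 0.8}                       # P(Rain)
p_wet_given_rain = {True:  {True: 0.9, False: 0.1},    # P(WetGrass | Rain = True)
                    False: {True: 0.1, False: 0.9}}    # P(WetGrass | Rain = False)

def joint(rain, wet):
    """P(Rain = rain, WetGrass = wet) from the chain-rule factorisation."""
    return p_rain[rain] * p_wet_given_rain[rain][wet]

# Marginal P(WetGrass = True), obtained by summing the joint over Rain:
print(round(sum(joint(r, True) for r in (True, False)), 2))   # 0.2*0.9 + 0.8*0.1 = 0.26
```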
Review of Probability Theory
Marginal probability is the probability of an event, irrespective of other random variables.
Marginal Probability: The probability of an event irrespective of the outcomes of other random variables, e.g.
P(A).
The joint probability is the probability of two (or more) simultaneous events, often described in
terms of events A and B from two dependent random variables, e.g. X and Y. The joint
probability is usually written in terms of just the outcomes, e.g. P(A, B).
Joint Probability: Probability of two (or more) simultaneous events, e.g. P(A and B) or P(A, B).
The conditional probability is the probability of one event given the occurrence of another
event, often described in terms of events A and B from two dependent random variables, e.g.
X and Y.
Conditional Probability: Probability of one (or more) event given the occurrence of another event, e.g. P(A
given B) or P(A | B).
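A minimal Python sketch of the three definitions above, computed from a small joint distribution over two binary variables X and Y; all variable names and numbers are illustrative assumptions, chosen only so the table sums to 1.

```python
# Marginal, joint and conditional probability from a small joint table.

joint = {                      # P(X, Y)
    ("x1", "y1"): 0.30, ("x1", "y2"): 0.20,
    ("x2", "y1"): 0.10, ("x2", "y2"): 0.40,
}

def marginal_x(x):
    """Marginal P(X = x): sum the joint over every outcome of Y."""
    return sum(p for (xi, _), p in joint.items() if xi == x)

def conditional_y_given_x(y, x):
    """Conditional P(Y = y | X = x) = P(X = x, Y = y) / P(X = x)."""
    return joint[(x, y)] / marginal_x(x)

print(marginal_x("x1"))                    # marginal:    0.5
print(joint[("x1", "y1")])                 # joint:       0.3
print(conditional_y_given_x("y1", "x1"))   # conditional: 0.3 / 0.5 = 0.6
```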
Review of Probability Theory
The joint probability can be calculated from the conditional probability using the product rule:
P(A, B) = P(A | B) × P(B)
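A short numeric check of the product rule, reusing the illustrative numbers from the previous sketch (they are assumptions, not values from these notes):

```python
# Product rule: P(A, B) = P(A | B) * P(B), here with A = (Y = y1), B = (X = x1).

p_x1 = 0.5                       # P(B):     P(X = x1)
p_y1_given_x1 = 0.6              # P(A | B): P(Y = y1 | X = x1)
print(p_y1_given_x1 * p_x1)      # joint P(X = x1, Y = y1) = 0.3
```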
Dempster–Shafer Theory
Two mass functions m1 and m2, obtained from independent pieces of evidence, are combined
using Dempster's rule of combination:

m_3(Z) = \frac{\sum_{X \cap Y = Z} m_1(X)\, m_2(Y)}{1 - \sum_{X \cap Y = \emptyset} m_1(X)\, m_2(Y)}
For example, suppose m1 corresponds to our belief after observing fever:
m1({F, C, P}) = 0.6 and m1(Θ) = 0.4
and suppose m2 corresponds to our belief after observing runny nose:
m2({A, F, C}) = 0.8 and m2(Θ) = 0.2
Then we can compute their combination m3 using the following table.
m1 \ m2            {A,F,C} (0.8)      Θ (0.2)
{F,C,P} (0.6)      {F,C} (0.48)       {F,C,P} (0.12)
Θ (0.4)            {A,F,C} (0.32)     Θ (0.08)
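As a check on the table, here is a minimal Python sketch of Dempster's rule of combination; the combine helper and the frozenset representation are my own choices, and the usual reading of the symbols (A = allergy, F = flu, C = cold, P = pneumonia) is assumed, since the notes only give the letters. Running it reproduces the four masses above.

```python
from itertools import product

# Frame of discernment Θ and the two mass functions from the example.
THETA = frozenset("AFCP")
m1 = {frozenset("FCP"): 0.6, THETA: 0.4}   # belief after observing fever
m2 = {frozenset("AFC"): 0.8, THETA: 0.2}   # belief after observing runny nose

def combine(m1, m2):
    """Dempster's rule: m3(Z) sums m1(X)*m2(Y) over all X, Y with X ∩ Y = Z,
    then normalises by 1 minus the mass assigned to the empty set."""
    combined, conflict = {}, 0.0
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        z = x & y
        if z:
            combined[z] = combined.get(z, 0.0) + mx * my
        else:
            conflict += mx * my
    return {z: mass / (1.0 - conflict) for z, mass in combined.items()}

for focal, mass in combine(m1, m2).items():
    print(set(focal), round(mass, 2))
# {F,C}: 0.48, {F,C,P}: 0.12, {A,F,C}: 0.32, Θ: 0.08 -- matching the table
# (here the conflict term is 0, so no renormalisation is actually needed).
```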