Uncertain Knowledge

UNIT – 5
Knowledge Facets and Logic Inferences
Uncertain Knowledge

• Uncertain knowledge in artificial intelligence refers to information that is not definitive or fully reliable due to factors such as incomplete data, ambiguity, or inherent unpredictability. It poses significant challenges for reasoning, decision-making, and knowledge representation.
• Types of Uncertainty
1. Statistical uncertainty: arises from randomness or variability in data. For example, predicting the weather involves uncertainty due to changing atmospheric conditions.
2. Ambiguity: occurs when information can be interpreted in multiple ways. Natural language processing often deals with ambiguous phrases whose meaning depends on context.
3. Incomplete information: arises when not all relevant data is available. For instance, a medical diagnosis may be difficult without a complete patient history or test results.
4. Subjective uncertainty: relates to personal beliefs or opinions. For example, expert opinions on a specific treatment can vary widely.
Probabilistic Reasoning: Connection to Logic and Independence

• Probabilistic reasoning is a key aspect of artificial intelligence that combines principles from probability theory with logical reasoning. It allows systems to make inferences from uncertain information, which is particularly useful in real-world applications where data may be incomplete or ambiguous.

• Connection to Logic
1. Logical foundations: traditional logic (e.g., propositional logic, predicate logic) deals with binary truth values (true or false). In contrast, probabilistic reasoning assigns probabilities to statements, allowing for degrees of belief. For instance, instead of saying a patient "is sick" or "is not sick," probabilistic reasoning might quantify the belief as 70% likely to be sick.
• Independence
• Independence is a critical concept in probabilistic reasoning, impacting how we understand and manipulate probabilities. Two events A and B are independent if P(A, B) = P(A) · P(B), or equivalently P(A | B) = P(A): learning that B occurred does not change our belief about A. Independence assumptions let us factor large joint distributions into much smaller pieces.
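As a toy illustration (the coin-flip example is assumed here, not from the slides), we can verify independence numerically by checking P(A, B) = P(A) · P(B) over a small sample space:

```python
# Independence check: events A and B are independent
# iff P(A and B) == P(A) * P(B).
# Sample space: two fair coin flips, four equally likely outcomes.
outcomes = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]
p = 1 / len(outcomes)  # each outcome has probability 0.25

p_a = sum(p for o in outcomes if o[0] == "H")       # P(first flip is heads)
p_b = sum(p for o in outcomes if o[1] == "H")       # P(second flip is heads)
p_ab = sum(p for o in outcomes if o == ("H", "H"))  # P(both heads)

print(p_ab == p_a * p_b)  # True: the two flips are independent
```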
Bayes' Rule

• Bayes' Rule, or Bayes' Theorem, is a fundamental principle in probability theory that describes how to update the probability of a hypothesis based on new evidence. It provides a way to revise beliefs when given new data and is widely used in statistics, machine learning, and artificial intelligence.
• For a hypothesis H and evidence E:
P(H | E) = P(E | H) · P(H) / P(E)
where P(H) is the prior, P(E | H) is the likelihood, P(E) is the probability of the evidence, and P(H | E) is the posterior.
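As a small numeric illustration of the update (the disease-test numbers below are hypothetical assumptions, not from the slides):

```python
# Bayes' Rule: posterior = likelihood * prior / evidence.
# Hypothetical disease-test numbers (assumed for illustration):
p_h = 0.01              # prior P(disease)
p_e_given_h = 0.95      # P(positive test | disease)
p_e_given_not_h = 0.05  # P(positive test | no disease)

# Law of total probability: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

posterior = p_e_given_h * p_h / p_e
print(round(posterior, 3))  # 0.161: a positive test lifts a 1% prior to ~16%
```

Note how weak the posterior remains despite a sensitive test: the low prior dominates, which is exactly the kind of revision Bayes' Rule formalizes.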
Bayesian Networks
• A Bayesian belief network is a key technique for dealing with probabilistic events and for solving problems that involve uncertainty. We can define a Bayesian network as:
• "A Bayesian network is a probabilistic graphical model which represents a set of variables and their conditional dependencies using a directed acyclic graph."
• It is also called a Bayes network, belief network, decision network, or Bayesian model.
• Bayesian networks are probabilistic because they are built from probability distributions and use probability theory for prediction and anomaly detection.
• Real-world applications are probabilistic in nature, and to represent the relationships among multiple events we need a Bayesian network. It can be used in tasks including prediction, anomaly detection, diagnostics, automated insight, reasoning, time-series prediction, and decision making under uncertainty.
• A Bayesian network can be used to build models from data and expert opinions, and it consists of two parts:
• A directed acyclic graph (DAG)
• A table of conditional probabilities for each node
• The generalized form of a Bayesian network that represents and solves decision problems under uncertain knowledge is known as an influence diagram.
• Each node corresponds to a random variable, which can be continuous or discrete.
• Arcs (directed arrows) represent causal relationships or conditional dependencies between random variables. A directed link means that one node directly influences the other; if there is no directed link between two nodes, they are independent of each other.
• In the example diagram, A, B, C, and D are random variables represented by the nodes of the network graph.
• If node B is connected to node A by a directed arrow from A to B, then node A is called the parent of node B.
• Node C is independent of node A.
• A Bayesian network has two main components:
• The causal component (the graph structure)
• The actual numbers (the conditional probabilities)
• Each node Xi in the network has a conditional probability distribution P(Xi | Parents(Xi)), which quantifies the effect of the parents on that node.
• A Bayesian network is based on the joint probability distribution and conditional probability, so let's first understand the joint probability distribution:
• Joint probability distribution:
• If we have variables x1, x2, x3, ..., xn, then the probabilities of the different combinations of values of x1, x2, ..., xn are known as the joint probability distribution. By the chain rule, it can always be factored as P(x1, x2, ..., xn) = P(x1) · P(x2 | x1) · ... · P(xn | x1, ..., xn−1).
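To make the chain rule concrete, here is a minimal sketch that builds a joint distribution over two binary variables from a prior and a conditional (the numbers are illustrative assumptions, not from the slides):

```python
from itertools import product

# Chain rule: P(x1, x2) = P(x1) * P(x2 | x1).
# The probabilities below are illustrative assumptions.
p_x1 = {True: 0.3, False: 0.7}
p_x2_given_x1 = {True: {True: 0.9, False: 0.1},
                 False: {True: 0.2, False: 0.8}}

# Build the full joint distribution over (x1, x2).
joint = {(a, b): p_x1[a] * p_x2_given_x1[a][b]
         for a, b in product([True, False], repeat=2)}

print(round(sum(joint.values()), 10))  # 1.0: a valid joint sums to 1
```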
Explanation of a Bayesian network:
• Let's understand Bayesian networks through an example, by creating a directed acyclic graph:
• Example: Harry installed a new burglar alarm at his home to detect burglary. The alarm reliably detects a burglary but also responds to minor earthquakes. Harry has two neighbors, David and Sophia, who have taken responsibility for informing Harry at work when they hear the alarm. David always calls Harry when he hears the alarm, but he sometimes confuses it with the phone ringing and calls then too. Sophia, on the other hand, likes to listen to loud music, so she sometimes fails to hear the alarm. Here we would like to compute the probability of the burglar-alarm event.
• Problem:
• Calculate the probability that the alarm has sounded but neither a burglary nor an earthquake has occurred, and both David and Sophia have called Harry.
• Solution:
• The Bayesian network for this problem is given below. The network structure shows that Burglary and Earthquake are the parent nodes of Alarm and directly affect the probability of the alarm going off, while David's and Sophia's calls depend only on the alarm.
• The network also encodes our assumptions that the neighbors do not directly perceive the burglary, do not notice the minor earthquake, and do not confer before calling.
• The conditional distribution for each node is given as a conditional probability table, or CPT.
• Each row in a CPT must sum to 1 because the entries in a row represent an exhaustive set of cases for the variable.
• In a CPT, a boolean variable with k boolean parents has 2^k rows. Hence, if there are two parents, the CPT will contain 4 rows of probability values.

• List of all events occurring in this network:
• Burglary (B)
• Earthquake (E)
• Alarm (A)
• David calls (D)
• Sophia calls (S)
• We can write the event of the problem statement as the probability P[D, S, A, B, E] and expand it using the chain rule together with the conditional independences encoded by the network:
• P[D, S, A, B, E] = P[D | S, A, B, E] · P[S, A, B, E]
• = P[D | S, A, B, E] · P[S | A, B, E] · P[A, B, E]
• = P[D | A] · P[S | A, B, E] · P[A, B, E]
• = P[D | A] · P[S | A] · P[A | B, E] · P[B, E]
• = P[D | A] · P[S | A] · P[A | B, E] · P[B | E] · P[E]
• Since Burglary and Earthquake are independent, P[B | E] = P[B], so the joint factors as P[D | A] · P[S | A] · P[A | B, E] · P[B] · P[E].
• Let's take the observed probabilities for the Burglary and Earthquake components:
• P(B = True) = 0.002, the probability of a burglary.
• P(B = False) = 0.998, the probability of no burglary.
• P(E = True) = 0.001, the probability of a minor earthquake.
• P(E = False) = 0.999, the probability that no earthquake occurred.
• The conditional probabilities are given in the tables below:

• Conditional probability table for Alarm (A):
• The conditional probability of Alarm depends on Burglary and Earthquake (note that each row sums to 1):

B       E       P(A = True)   P(A = False)
True    True    0.94          0.06
True    False   0.95          0.05
False   True    0.31          0.69
False   False   0.001         0.999

• Conditional probability table for David calls (D):
• The conditional probability that David calls depends on Alarm:

A       P(D = True)   P(D = False)
True    0.91          0.09
False   0.05          0.95

• Conditional probability table for Sophia calls (S):
• The conditional probability that Sophia calls depends on her parent node, Alarm:

A       P(S = True)   P(S = False)
True    0.75          0.25
False   0.02          0.98

• From the formula for the joint distribution, we can write the problem statement as a probability expression:

• P(S, D, A, ¬B, ¬E) = P(S | A) · P(D | A) · P(A | ¬B, ¬E) · P(¬B) · P(¬E)
• = 0.75 · 0.91 · 0.001 · 0.998 · 0.999
• ≈ 0.00068045
• Hence, a Bayesian network can answer any query about the domain by using the joint distribution.
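The calculation above can be reproduced directly in Python; the dictionaries below simply transcribe the CPTs given in the tables (a minimal sketch, with variable names of my own choosing):

```python
# Transcribing the CPTs from the tables above.
p_b = {True: 0.002, False: 0.998}   # P(Burglary)
p_e = {True: 0.001, False: 0.999}   # P(Earthquake)
p_a = {(True, True): 0.94, (True, False): 0.95,   # P(A=True | B, E)
       (False, True): 0.31, (False, False): 0.001}
p_d = {True: 0.91, False: 0.05}     # P(D=True | A)
p_s = {True: 0.75, False: 0.02}     # P(S=True | A)

# P(S, D, A, ~B, ~E) = P(S|A) * P(D|A) * P(A|~B,~E) * P(~B) * P(~E)
p = p_s[True] * p_d[True] * p_a[(False, False)] * p_b[False] * p_e[False]
print(round(p, 8))  # 0.00068045
```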
Probabilistic Inference

• Definition:
• Probabilistic inference is the process of drawing conclusions about uncertain situations using probability theory. It involves updating the probability of a hypothesis or event based on new evidence, typically using Bayes' Theorem.
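As a sketch of inference on the burglar-alarm network above, we can compute a posterior such as P(Burglary | David calls, Sophia calls) by enumeration: sum the joint over the hidden variables (E and A), then normalize. The CPT values are those from the tables; the helper function and names are my own:

```python
from itertools import product

# CPT values transcribed from the burglar-alarm network above;
# the helper function and variable names are assumptions of this sketch.
p_b = {True: 0.002, False: 0.998}
p_e = {True: 0.001, False: 0.999}
p_a_true = {(True, True): 0.94, (True, False): 0.95,
            (False, True): 0.31, (False, False): 0.001}
p_d_true = {True: 0.91, False: 0.05}
p_s_true = {True: 0.75, False: 0.02}

def joint(b, e, a, d, s):
    """Full joint P(B=b, E=e, A=a, D=d, S=s) from the network factorization."""
    pa = p_a_true[(b, e)] if a else 1 - p_a_true[(b, e)]
    pd = p_d_true[a] if d else 1 - p_d_true[a]
    ps = p_s_true[a] if s else 1 - p_s_true[a]
    return p_b[b] * p_e[e] * pa * pd * ps

# Query P(B | D=True, S=True): sum out hidden E and A, then normalize.
unnorm = {b: sum(joint(b, e, a, True, True)
                 for e, a in product([True, False], repeat=2))
          for b in [True, False]}
posterior = unnorm[True] / (unnorm[True] + unnorm[False])
print(round(posterior, 4))
```

Even though the prior probability of burglary is only 0.002, both neighbors calling raises the posterior substantially, which is exactly the belief revision that probabilistic inference formalizes.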
