Bayes Rule, Bayesian Model


1.

Bayes Rule
 One of the most important rules for probabilistic reasoning in AI.
 Derived from the product rule:
o P(A ∧ B) = P(A|B)P(B) = P(B|A)P(A)
 From this product rule, Bayes' theorem is constructed as:

P(A|B) = P(B|A) P(A) / P(B)

 Here P(B|A) is the probability of the evidence given the hypothesis
 P(A) is the prior probability of the hypothesis
 P(B) is the prior probability of the evidence
 P(A|B) is the probability of hypothesis A conditional on a new piece of
evidence B

P(Cause | Effect) = P(Effect | Cause) P(Cause) / P(Effect)

Bayes' rule allows unknown probabilities to be computed from known conditional
probabilities, which are usually measured in the causal direction.

Example
Hypothesis of flu based on symptoms
Given
P(A) = prior probability of flu = 0.00001
P(B|A) = probability of the symptoms given flu = 0.95
P(B) = prior probability of the symptoms = 0.01 (headache or running nose, 1 in 100)

Compute P(A|B)
P(A|B) = (0.95 × 0.00001) / 0.01
= 0.00095 (less than one in a thousand)
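The flu calculation above can be checked with a few lines of code (a minimal sketch; the function name `bayes_posterior` and its parameter names are my own):

```python
def bayes_posterior(prior, likelihood, evidence):
    """Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / evidence

# Flu example: P(A) = 0.00001, P(B|A) = 0.95, P(B) = 0.01
posterior = bayes_posterior(prior=0.00001, likelihood=0.95, evidence=0.01)
print(round(posterior, 5))  # 0.00095
```

Note how the tiny prior dominates: even with a 95% likelihood, the posterior stays below one in a thousand.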

2. Bayes' theorem:
 Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning,
which determines the probability of an event with uncertain knowledge.
 In probability theory, it relates the conditional probability and marginal
probabilities of two random events.
 Bayes' theorem was named after the British mathematician Thomas Bayes.
 Bayesian inference is an application of Bayes' theorem and is fundamental
to Bayesian statistics.
 It is a way to calculate the value of P(B|A) with the knowledge of P(A|B).
 Bayes' theorem allows updating the probability prediction of an event by observing
new information from the real world.

Example: If the probability of cancer depends on a person's age, then using Bayes'
theorem we can determine the probability of cancer more accurately with the help of age.

Bayes' theorem can be derived using product rule and conditional probability of event A
with known event B:

From the product rule, we can write:

P(A ⋀ B) = P(A|B) P(B)

Similarly, the probability of event B with known event A:

P(A ⋀ B)= P(B|A) P(A)

Equating the right-hand sides of both equations, we get:

P(A|B) = P(B|A) P(A) / P(B) ........(a)

The above equation (a) is called Bayes' rule or Bayes' theorem. This equation is the basis
of most modern AI systems for probabilistic inference.

It shows the simple relationship between joint and conditional probabilities. Here,

P(A|B) is known as the posterior, which we need to calculate; it is read as the probability
of hypothesis A given that evidence B has occurred.

P(B|A) is called the likelihood: assuming the hypothesis is true, it is the probability of
observing the evidence.

P(A) is called the prior probability: the probability of the hypothesis before considering
the evidence.

P(B) is called the marginal probability: the probability of the evidence alone.

In equation (a), in general, we can write P(B) = P(A1)P(B|A1) + P(A2)P(B|A2) + ... +
P(An)P(B|An); hence Bayes' rule can be written as:

P(Ai|B) = P(B|Ai) P(Ai) / [P(A1)P(B|A1) + P(A2)P(B|A2) + ... + P(An)P(B|An)]

where A1, A2, A3, ........, An is a set of mutually exclusive and exhaustive events.
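This normalized form of Bayes' rule can be written as a short function (a sketch; the function name and the two-hypothesis example numbers are my own, chosen only for illustration):

```python
def posterior_over_hypotheses(priors, likelihoods):
    """Posteriors P(Ai|B) for mutually exclusive, exhaustive hypotheses A1..An.
    The evidence P(B) is the normalizer: sum over i of P(Ai) * P(B|Ai)."""
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

# Hypothetical example: a rare condition with prior 0.01, and a test that
# reports positive with probability 0.9 if present and 0.1 otherwise.
post = posterior_over_hypotheses([0.01, 0.99], [0.9, 0.1])
print([round(x, 4) for x in post])  # [0.0833, 0.9167]
```

Because the hypotheses are exhaustive and mutually exclusive, the returned posteriors always sum to 1.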
Applying Bayes' rule:
Bayes' rule allows us to compute the single term P(B|A) in terms of P(A|B), P(B), and P(A).
This is very useful when we have good estimates of these three terms and want to
determine the fourth one. Suppose we perceive the effect of some unknown cause and want
to compute that cause; then Bayes' rule becomes:

P(Cause | Effect) = P(Effect | Cause) P(Cause) / P(Effect)

Example-1:

Question: What is the probability that a patient has the disease meningitis, given a stiff
neck?

Given Data:

A doctor is aware that the disease meningitis causes a patient to have a stiff neck 80% of
the time. He is also aware of some more facts, which are given as follows:

o The known probability that a patient has meningitis is 1/30,000.
o The known probability that a patient has a stiff neck is 2%.

Let a be the proposition that the patient has a stiff neck and b the proposition that the
patient has meningitis, so we can calculate the following:

P(a|b) = 0.8

P(b) = 1/30000

P(a) = 0.02

Applying Bayes' rule: P(b|a) = P(a|b) P(b) / P(a) = (0.8 × 1/30000) / 0.02 ≈ 0.00133,
i.e., roughly 1 patient in 750 with a stiff neck actually has meningitis.
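The meningitis numbers can be plugged in directly (a sketch; the variable names are mine):

```python
p_stiff_given_men = 0.8   # P(a|b): stiff neck given meningitis
p_men = 1 / 30000         # P(b): prior probability of meningitis
p_stiff = 0.02            # P(a): prior probability of a stiff neck

# Bayes' rule: P(b|a) = P(a|b) * P(b) / P(a)
p_men_given_stiff = p_stiff_given_men * p_men / p_stiff
print(round(p_men_given_stiff, 5))  # 0.00133
```

Even though meningitis almost always produces a stiff neck, its tiny prior keeps the posterior near 1 in 750.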

Application of Bayes' theorem in Artificial intelligence:


Following are some applications of Bayes' theorem:

o It is used to calculate the probability of a robot's next step, given the steps already
executed.
o Bayes' theorem is helpful in weather forecasting.
o It can solve the Monty Hall problem.
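As an illustration of the last point, the Monty Hall problem can be solved with Bayes' rule directly (a sketch; the setup assumes the contestant picks door 1 and the host, who knows where the car is, opens door 3 to reveal a goat):

```python
priors = [1/3, 1/3, 1/3]  # car behind door 1, 2, or 3
# P(host opens door 3 | car location), given the contestant picked door 1:
likelihoods = [1/2,  # car behind 1: host picks door 2 or 3 at random
               1,    # car behind 2: host must open door 3
               0]    # car behind 3: host cannot open door 3

evidence = sum(p * l for p, l in zip(priors, likelihoods))
posteriors = [p * l / evidence for p, l in zip(priors, likelihoods)]
print([round(x, 3) for x in posteriors])  # [0.333, 0.667, 0.0]
```

The posterior for door 2 is 2/3, so switching doubles the contestant's chance of winning.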
