What Is The Bayes' Theorem?


What is the Bayes’ Theorem?

In statistics and probability theory, Bayes’ theorem (also known as Bayes’ rule) is a mathematical formula used to determine the conditional probability of events. Essentially, Bayes’ theorem describes the probability of an event based on prior knowledge of the conditions that might be relevant to the event.

 Bayes’ Theorem is used to revise previously calculated probabilities based on new information.
 Developed by Thomas Bayes in the 18th century.
 It is an extension of conditional probability.

Bayes’ Theorem
P(B_i | A) = P(A | B_i) P(B_i) / [ P(A | B_1) P(B_1) + P(A | B_2) P(B_2) + ... + P(A | B_k) P(B_k) ]
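As a quick illustration, here is a minimal plain-Python sketch of the formula above; the priors and likelihoods are made-up placeholder numbers, not values from these slides.

def bayes_posterior(priors, likelihoods, i):
    """Return P(B_i | A) from priors P(B_j) and likelihoods P(A | B_j)."""
    evidence = sum(p * l for p, l in zip(priors, likelihoods))  # P(A) by total probability
    return likelihoods[i] * priors[i] / evidence

# Three mutually exclusive hypotheses B_1, B_2, B_3 (illustrative numbers only):
priors = [0.5, 0.3, 0.2]        # P(B_i)
likelihoods = [0.1, 0.4, 0.7]   # P(A | B_i)
print(round(bayes_posterior(priors, likelihoods, 0), 3))  # P(B_1 | A) = 0.161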
Proof of Bayes’ Theorem
The probability of two events A and B happening, P(A∩B), is the probability of A, P(A), times the
probability of B given that A has occurred, P(B|A).
P(A ∩ B) = P(A)P(B|A) (1)

On the other hand, the probability of A and B is also equal to the probability of B times the
probability of A given B.
P(A ∩ B) = P(B)P(A|B) (2)

Equating the two yields:


P(B)P(A|B) = P(A)P(B|A) (3)

and thus

P(A|B) = P(A)P(B|A) / P(B) (4)

This equation, known as Bayes’ Theorem, is the basis of statistical inference.
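To make the proof concrete, the following plain-Python snippet checks equations (1), (2), and (4) numerically on a small made-up joint distribution over two binary events A and B (the numbers are purely illustrative).

# P(A=a, B=b) for a small made-up joint distribution
p_joint = {
    (True, True): 0.12, (True, False): 0.28,
    (False, True): 0.18, (False, False): 0.42,
}
p_a = sum(p for (a, b), p in p_joint.items() if a)   # P(A) = 0.40
p_b = sum(p for (a, b), p in p_joint.items() if b)   # P(B) = 0.30
p_ab = p_joint[(True, True)]                         # P(A ∩ B) = 0.12
p_b_given_a = p_ab / p_a                             # P(B|A)
p_a_given_b = p_ab / p_b                             # P(A|B)

assert abs(p_ab - p_a * p_b_given_a) < 1e-12                # equation (1)
assert abs(p_ab - p_b * p_a_given_b) < 1e-12                # equation (2)
assert abs(p_a_given_b - p_a * p_b_given_a / p_b) < 1e-12   # equation (4)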


Applications of Bayes’ Theorem
• In computer science, Bayes’ Theorem is used to enhance low-resolution imaging and in filtering applications such as spam and noise filters; in short, any situation involving probability-based decision making (a small sketch of the spam case follows this list).
• Bayes’ theorem is also helpful in many other fields, such as management, biochemistry, and business, for example in predicting the best option among several groups.
• In biochemistry, it helps identify diseases from various blood-sample tests; in fact, those results are based on probabilities.
• For project managers: every project manager wants to know whether the project they are working on will finish on time. So, as our example, suppose a project manager asks: what is the probability that my project will finish on time? There are only two possibilities here: either the project finishes on (or before) time or it does not, depending on factors such as the tools available, the number of workers, and the efficiency of those workers.
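As a rough sketch of the spam-filter idea mentioned above, the snippet below applies Bayes’ theorem to a single word; all the numbers are assumed for illustration and are not real filter statistics.

p_spam = 0.5               # assumed prior: P(spam)
p_word_given_spam = 0.7    # assumed P(word appears | spam)
p_word_given_ham = 0.1     # assumed P(word appears | not spam)

# P(word) by the law of total probability, then Bayes' theorem
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 3))   # 0.875: seeing the word raises the spam probability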
Advantages
• Bayesian methods and classical methods both have advantages and disadvantages, and there are some similarities. When the sample size is large, Bayesian inference often provides results for parametric models that are very similar to the results produced by frequentist methods.
• It provides a natural and principled way of combining prior information with data, within a solid decision-theoretic framework. You can incorporate past information about a parameter and form a prior distribution for future analysis. When new observations become available, the previous posterior distribution can be used as a prior. All inferences follow logically from Bayes’ theorem (see the sketch after this list).

• It provides inferences that are conditional on the data and are exact, without reliance
on asymptotic approximation. Small sample inference proceeds in the same manner as
if one had a large sample.

• It provides interpretable answers, such as “the true parameter has a probability of 0.95
of falling in a 95% credible interval.”
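The sketch below illustrates the "posterior becomes the next prior" idea and the credible-interval interpretation, using a Beta-Binomial model assumed purely for illustration (it requires SciPy and is not part of the original slides).

from scipy import stats

a, b = 1.0, 1.0                               # Beta(1, 1) prior: no initial information
for successes, trials in [(3, 10), (7, 10)]:  # two batches of new observations (made up)
    a += successes                            # conjugate update: posterior is Beta(a, b)
    b += trials - successes
    # the posterior from this batch serves as the prior for the next batch

posterior = stats.beta(a, b)
print(posterior.mean())            # posterior mean of the success rate (0.5 here: 10 of 20 trials)
print(posterior.interval(0.95))    # 95% credible interval for the success rate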
Disadvantages

• It does not tell you how to select a prior. There is no correct way to choose a prior.
Bayesian inferences require skills to translate subjective prior beliefs into a mathematically
formulated prior. If you do not proceed with caution, you can generate misleading results.

• It can produce posterior distributions that are heavily influenced by the priors. From a practical point of view, it can sometimes be difficult to convince subject-matter experts who do not agree with the validity of the chosen prior.

• It often comes with a high computational cost, especially in models with a large
number of parameters.
Bayes’ Theorem Example
 A drilling company has estimated a 40% chance of striking oil for their new well.
 A detailed test has been scheduled for more information. Historically, 60% of successful wells have had detailed tests, and 20% of unsuccessful wells have had detailed tests.
 Given that this well has been scheduled for a detailed test, what is the probability that the well will be successful?
Apply Bayes’ Theorem:

 Let S = successful well, U = unsuccessful well
 P(S) = 0.4, P(U) = 0.6 (prior probabilities)
 Define the detailed test event as D
 Conditional probabilities: P(D|S) = 0.6, P(D|U) = 0.2
 Goal is to find P(S|D)
Apply Bayes’ Theorem:

P(S|D) = P(D|S) P(S) / [ P(D|S) P(S) + P(D|U) P(U) ]
       = (0.6)(0.4) / [ (0.6)(0.4) + (0.2)(0.6) ]
       = 0.24 / 0.36 ≈ 0.667

So the revised probability of success, given that this well has been scheduled for a detailed test, is 0.667.
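For completeness, here is the same drilling-well arithmetic as a quick plain-Python check.

p_s, p_u = 0.4, 0.6      # priors: P(S), P(U)
p_d_given_s = 0.6        # P(D | S)
p_d_given_u = 0.2        # P(D | U)

p_d = p_d_given_s * p_s + p_d_given_u * p_u   # P(D) = 0.24 + 0.12 = 0.36
print(round(p_d_given_s * p_s / p_d, 3))      # P(S | D) = 0.667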
Bayes’ Theorem Example-2
 M&R Electronics is considering marketing a new model of television. In the past, 40% of new model televisions have been successful and 60% have been failures. Before introducing the new model, the marketing team conducts an extensive study and releases a report. In the past, 80% of successful new models received a favorable report, and 30% of the unsuccessful models received a favorable report. For the new model, the marketing team has issued a favorable report; what is the probability that it will be successful?
Apply Bayes’ Theorem:

 P(model successful) = 0.4, P(model not successful) = 0.6
 P(report favorable | model successful) = 0.8
 P(report favorable | model not successful) = 0.3
 P(model successful | report favorable)
= P(RF|MS) P(MS) / [ P(RF|MS) P(MS) + P(RF|MNS) P(MNS) ]
= (0.8)(0.4) / [ (0.8)(0.4) + (0.3)(0.6) ] = 0.32 / (0.32 + 0.18) = 0.64
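And the M&R Electronics arithmetic as the same kind of quick plain-Python check.

p_ms, p_mns = 0.4, 0.6     # priors: P(MS), P(MNS)
p_rf_given_ms = 0.8        # P(RF | MS)
p_rf_given_mns = 0.3       # P(RF | MNS)

p_rf = p_rf_given_ms * p_ms + p_rf_given_mns * p_mns   # P(RF) = 0.32 + 0.18 = 0.50
print(round(p_rf_given_ms * p_ms / p_rf, 2))           # P(MS | RF) = 0.64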


THANK YOU

Submitted By

Aryaman Tewari
