3-Bayesian Modelling - Inference and Bayesian NT

Based on the weather data provided, here are the steps to calculate the posterior probability of PlayTennis = No given Outlook = Sunny and Temperature = Mild:
1) Define the events: A = (PlayTennis = No); B = (Outlook = Sunny, Temperature = Mild).
2) Count the cases where each event occurs: P(A) = (cases where PlayTennis = No) / (total cases) = 3/14; P(B) = (cases where Outlook = Sunny and Temperature = Mild) / (total cases) = 2/14.
3) Count the cases where both events occur: P


Bayesian modeling

• A statistical approach that uses Bayes' theorem to update our beliefs about uncertain events or phenomena based on new evidence or data.

• A powerful framework for modeling and inference in many fields, including statistics, machine learning, and data analysis.

• A versatile and principled approach for dealing with uncertainty and making inferences probabilistically.

• Combines prior knowledge with observed data to update our beliefs and make informed decisions or predictions.
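As a minimal sketch of "combining prior knowledge with observed data", consider a Beta-Binomial update (this specific example is not from the notes; the coin-flip setting and all numbers are illustrative). The prior belief about a coin's heads probability is Beta(a, b); after observing some flips, the posterior is again a Beta distribution with updated parameters.

```python
# Illustrative sketch (assumed example, not from the notes): conjugate
# Beta-Binomial updating. Prior Beta(a, b) + observed heads/tails counts
# give posterior Beta(a + heads, b + tails).

def update_beta(a, b, heads, tails):
    """Return posterior Beta parameters after observing coin flips."""
    return a + heads, b + tails

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Start from a uniform prior Beta(1, 1), then observe 7 heads and 3 tails.
a, b = update_beta(1, 1, heads=7, tails=3)
print(a, b)             # posterior is Beta(8, 4)
print(beta_mean(a, b))  # posterior mean = 8/12 ≈ 0.667
```

The posterior mean (2/3) sits between the prior mean (1/2) and the observed frequency (7/10), which is exactly the prior-plus-data compromise the bullet describes.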

Bayesian modeling concepts


Bayes' Theorem: a mathematical formula that combines prior beliefs (prior probabilities) with observed data to calculate posterior probabilities.
Bayesian Inference: estimates the uncertainty associated with parameters by representing them as probability distributions.
Prior Distribution: expresses prior beliefs about parameters as a probability distribution.
Posterior Distribution: updated beliefs about parameters after incorporating observed data.
Likelihood Function: measures the data's likelihood under different parameter values.
Markov Chain Monte Carlo (MCMC): methods for approximating complex posterior distributions through sampling.
Bayesian Networks: graphical models representing dependencies among variables.
Subjective vs. Objective Bayes: subjective Bayes incorporates prior beliefs based on expert knowledge, while objective Bayes aims to use non-informative or minimally informative priors.
Bayesian ML: used in machine learning, particularly for Bayesian linear regression, Bayesian neural networks, and probabilistic programming languages such as Stan and Pyro.
Challenges:

• Computationally intensive, especially for high-dimensional problems.
• The choice of prior distributions can influence results, leading to discussions about prior sensitivity.

Applications:

• Widely applied in many fields, including statistics, physics, finance, epidemiology, and natural language processing.
• Used for tasks such as parameter estimation, hypothesis testing, and probabilistic modeling.
Bayes' theorem
• Fundamental concept in probability theory & statistics that
describes how to update our beliefs about an event or hypothesis
based on new evidence or data.

• Named after Thomas Bayes, an 18th-century statistician and theologian, who developed the theorem.

• Particularly valuable in situations where we want to estimate the probability of an event given prior knowledge and observed evidence.

General form of Bayes' theorem:

P(A∣B) = [P(B∣A) × P(A)] / P(B)
Explanation of the terms in Bayes' theorem:
• P(A∣B): the conditional probability of event A occurring given that event B has occurred. It represents our updated belief in A after considering the evidence provided by B.
• P(B∣A): the conditional probability of event B occurring given that event A has occurred. It represents the likelihood of observing B given that A is true, and quantifies how well A explains B.
• P(A): the prior probability of event A, representing our initial belief in A before considering any new evidence.
• P(B): the marginal probability of event B, representing the probability of B occurring regardless of whether A is true. It serves as a normalizing constant.

In simple terms, Bayes' theorem allows us to update our belief in the probability of A (the "hypothesis" or "event of interest") based on new information provided by B (the "evidence"). It helps us combine prior knowledge with observed data to make more informed decisions or predictions.
Here's a step-by-step explanation of how Bayes' theorem works:
1. Start with an initial belief represented by P(A), our prior probability.
2. Observe new evidence or data, represented by event B.
3. Calculate P(B∣A), the likelihood of observing the evidence B given that our initial belief A is true.
4. Apply Bayes' theorem to compute P(A∣B), our updated (posterior) probability of A given the evidence B.

In essence, Bayes' theorem is a tool for quantifying how our beliefs should change in light of new information. It is widely used in statistics, machine learning, and Bayesian modeling for tasks such as parameter estimation, hypothesis testing, and making predictions.
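The four steps above can be sketched directly in code. The diagnostic-test numbers below are hypothetical (not from the notes): a condition affecting 1% of a population, a test with a 95% true-positive rate and a 5% false-positive rate.

```python
# Bayes' theorem as a function: P(A|B) = P(B|A) * P(A) / P(B),
# with P(B) obtained from the law of total probability.
def posterior(prior_a, likelihood_b_given_a, likelihood_b_given_not_a):
    """Posterior P(A|B) from a prior and the two conditional likelihoods."""
    p_b = (likelihood_b_given_a * prior_a
           + likelihood_b_given_not_a * (1 - prior_a))
    return likelihood_b_given_a * prior_a / p_b

# Hypothetical numbers: A = "has condition", B = "test is positive".
p = posterior(prior_a=0.01,
              likelihood_b_given_a=0.95,      # true-positive rate
              likelihood_b_given_not_a=0.05)  # false-positive rate
print(round(p, 3))  # ≈ 0.161
```

Even with an accurate test, the low prior keeps the posterior around 16%, which illustrates why the prior P(A) matters so much in step 1.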
Inference and Bayesian Networks
Are related concepts in the field of probabilistic modeling and probabilistic graphical models.
Inference:
Process of drawing conclusions or making predictions based on a given model and observed
evidence or data.

Purpose:
To estimate unknown or unobserved variables, make predictions about future events, or
assess the likelihood of different outcomes given the available information.

Applications:
Used in various domains, including machine learning, statistics, AI, and decision analysis, to make informed decisions and predictions.

Ex: used in Bayesian modeling, Bayesian networks, and other probabilistic models.
Methods:
Common techniques include:

Exact Inference: aims to compute the exact probability distribution over variables. Examples include variable elimination and belief propagation in graphical models.

Approximate Inference: when exact inference is computationally infeasible, approximate methods such as Monte Carlo methods (e.g., Markov Chain Monte Carlo, MCMC) are employed to provide close approximations to the true probabilities.

Variational Inference: formulates inference as an optimization problem to approximate the posterior distribution.
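To make the exact-vs.-approximate distinction concrete, here is a small sketch of Monte Carlo inference by rejection sampling. The two-variable model (rain and wet grass) and all probabilities are made up for illustration; they are not from the notes.

```python
import random

random.seed(0)  # reproducible illustration

# Hypothetical model: Rain ~ Bernoulli(0.2); WetGrass depends on Rain.
P_RAIN = 0.2
P_WET_GIVEN_RAIN = 0.9
P_WET_GIVEN_DRY = 0.1

def sample():
    """Draw one (rain, wet) sample from the joint distribution."""
    rain = random.random() < P_RAIN
    wet = random.random() < (P_WET_GIVEN_RAIN if rain else P_WET_GIVEN_DRY)
    return rain, wet

def estimate_p_rain_given_wet(n=100_000):
    """Rejection sampling: keep only samples matching the evidence (wet)."""
    accepted = [rain for rain, wet in (sample() for _ in range(n)) if wet]
    return sum(accepted) / len(accepted)

# Exact answer via Bayes' theorem, for comparison:
exact = (P_WET_GIVEN_RAIN * P_RAIN
         / (P_WET_GIVEN_RAIN * P_RAIN + P_WET_GIVEN_DRY * (1 - P_RAIN)))
print(round(exact, 3))                 # ≈ 0.692
print(estimate_p_rain_given_wet())     # close to the exact value
```

Rejection sampling discards every sample that contradicts the evidence, which is why approximate methods become attractive only when exact computation (here, one line of Bayes' theorem) is infeasible.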
Bayesian Networks
Definition:
A Bayesian network (BN) is a probabilistic graphical model that represents a set of variables and their conditional dependencies in the form of a directed acyclic graph (DAG). Each node in the graph represents a random variable, and edges between nodes denote probabilistic relationships.
Purpose:
• Used to model uncertainty and dependencies among variables.
• Provide a structured way to represent & reason about complex
probabilistic relationships.
Components:
Nodes: Each node represents a random variable.
Edges: Directed edges between nodes indicate conditional dependencies.
Conditional Probability Tables (CPTs): each node has a CPT, which specifies the conditional probabilities of the node given its parent nodes.
Inference in Bayesian networks: performing inference to compute probabilities or make predictions.
Common inference tasks include:
- Marginalization: Computing marginal probabilities for specific variables.
- Conditional Probability: Calculating conditional probabilities given evidence.
- Maximum A Posteriori (MAP) Inference: Finding the most likely state of variables given evidence.
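The marginalization and conditional-probability tasks above can be sketched on a minimal two-node network, Outlook → PlayTennis. The CPT values below are hypothetical stand-ins (the notes' actual data table is not reproduced here), chosen only to echo the PlayTennis example.

```python
# Minimal Bayesian network with two nodes: Outlook -> PlayTennis.
# All probabilities are illustrative assumptions, not the notes' data.
p_outlook = {"Sunny": 5 / 14, "Overcast": 4 / 14, "Rainy": 5 / 14}
p_play_given_outlook = {  # CPT: P(PlayTennis = "Yes" | Outlook)
    "Sunny": 0.4, "Overcast": 1.0, "Rainy": 0.6,
}

def joint(outlook, play):
    """Joint probability P(Outlook, PlayTennis) via the chain rule."""
    p_yes = p_play_given_outlook[outlook]
    return p_outlook[outlook] * (p_yes if play == "Yes" else 1 - p_yes)

# Marginalization: P(PlayTennis = "No") = sum over Outlook of the joint.
p_no = sum(joint(o, "No") for o in p_outlook)
print(round(p_no, 3))  # 0.357

# Conditional probability given evidence: P(Outlook = "Sunny" | PlayTennis = "No").
p_sunny_given_no = joint("Sunny", "No") / p_no
print(round(p_sunny_given_no, 3))  # 0.6
```

Each query reduces to sums and ratios of joint probabilities built from the CPTs, which is exactly what exact-inference algorithms like variable elimination organize efficiently for larger graphs.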

Applications:
Widely used in various fields, including healthcare (medical diagnosis), finance (risk
assessment), NLP, & expert systems, where modeling uncertainty and dependencies is crucial.
Alternative names and terms for BNs:
1. Belief Network
2. Bayes Network
3. Probabilistic Graphical Model (PGM)
4. Directed Acyclic Graphical Model (DAG)
5. Causal Network
6. Influence Diagram
7. Statistical Dependency Network
8. Conditional Independence Network
9. Bayesian Belief Network
10. Knowledge Map
11. Markov Network
12. Structural Equation Model (SEM)
In simple terms:
Inference is the process of drawing conclusions or making predictions from data and a given model, while BNs are a graphical modeling tool used to represent and perform inference in probabilistic systems with complex dependencies. They offer a structured way to perform inference, making them valuable in many applications.
Calculate the posterior probability of Class = No with respect to Outlook = Sunny
ANS:
Here's a table with fields for "Outlook," "Temperature," "Humidity," "Windy," and "PlayTennis." It represents weather data and whether tennis was played ("Yes" or "No") under different weather conditions.

In this table:

• "Outlook" represents the weather conditions (Sunny, Overcast, Rainy).
• "Temperature" represents the temperature (Hot, Mild, Cool).
• "Humidity" represents the humidity level (High, Normal).
• "Windy" represents whether it is windy (No, Yes).
• "PlayTennis" represents whether tennis was played (No or Yes) under the given weather conditions.
To calculate the posterior probability of the event "PlayTennis = No" with respect to the "Outlook = Sunny" based on the
provided data, we can use Bayes' theorem. We want to find the probability that tennis is not played (No) given that the
outlook is sunny. Here's how we can calculate it:
Bayes' theorem can likewise be used to determine whether tennis is played or not when Outlook = Sunny and Temperature = Mild.
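The counting procedure described at the start of these notes can be written as a short function. Since the notes' 14-row data table is not reproduced here, the records below are a small hypothetical stand-in; only the counting logic itself matches the stated steps.

```python
# Posterior by counting: P(target | evidence) =
#   count(rows matching target AND evidence) / count(rows matching evidence).
def posterior_from_counts(records, evidence, target):
    matches_evidence = [r for r in records
                        if all(r[k] == v for k, v in evidence.items())]
    matches_both = [r for r in matches_evidence
                    if all(r[k] == v for k, v in target.items())]
    return len(matches_both) / len(matches_evidence)

# Hypothetical mini-dataset (illustrative only, not the notes' table).
records = [
    {"Outlook": "Sunny", "Temperature": "Mild", "PlayTennis": "No"},
    {"Outlook": "Sunny", "Temperature": "Mild", "PlayTennis": "Yes"},
    {"Outlook": "Sunny", "Temperature": "Hot",  "PlayTennis": "No"},
    {"Outlook": "Rainy", "Temperature": "Mild", "PlayTennis": "Yes"},
]

p = posterior_from_counts(
    records,
    evidence={"Outlook": "Sunny", "Temperature": "Mild"},
    target={"PlayTennis": "No"},
)
print(p)  # 0.5: one of the two Sunny/Mild rows has PlayTennis = No
```

With the real 14-row table, the same function would return the posterior the question asks for, since dividing the joint count by the evidence count is exactly Bayes' theorem applied to empirical frequencies.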
