LM39 - Naïve Bayes Models


KGiSL Institute of Technology

(Approved by AICTE, New Delhi; Affiliated to Anna University, Chennai)


Recognized by UGC, Accredited by NBA (IT)
365, KGiSL Campus, Thudiyalur Road, Saravanampatti, Coimbatore – 641035.

Department of Artificial Intelligence & Data Science


Name of the Faculty : Mr.MOHANRAJ.S

Subject Name & Code : AL3391 / ARTIFICIAL INTELLIGENCE

Branch & Department : B.Tech & AI&DS

Year & Semester : 2022 / III

Academic Year : 2022-23


UNIT I INTELLIGENT AGENTS 9
Introduction to AI – Agents and Environments – concept of rationality – nature of environments – structure of
agents. Problem solving agents – search algorithms – uninformed search strategies.

UNIT II PROBLEM SOLVING 9


Heuristic search strategies – heuristic functions. Local search and optimization problems – local search in
continuous space – search with non-deterministic actions – search in partially observable environments – online
search agents and unknown environments

UNIT III GAME PLAYING AND CSP 9


Game theory – optimal decisions in games – alpha-beta search – montecarlo tree search –stochastic games –
partially observable games. Constraint satisfaction problems – constraint propagation – backtracking search for
CSP – local search for CSP – structure of CSP.

UNIT IV LOGICAL REASONING 9


Knowledge-based agents – propositional logic – propositional theorem proving – propositional model checking –
agents based on propositional logic. First-order logic – syntax and semantics – knowledge representation and
engineering – inferences in first-order logic – forward chaining – backward chaining – resolution.

UNIT V PROBABILISTIC REASONING 9


Acting under uncertainty – Bayesian inference – naïve Bayes models. Probabilistic reasoning – Bayesian networks –
exact inference in BN – approximate inference in BN – causal networks.

AL3391/AI/II AI&DS/III SEM/KG-KiTE


Course Outcomes

At the end of this course, the students will be able to:


CO1: Explain intelligent agent frameworks
CO2: Apply problem solving techniques
CO3: Apply game playing and CSP techniques
CO4: Perform logical reasoning
CO5: Perform probabilistic reasoning under uncertainty



NAÏVE BAYES MODELS

 The Naïve Bayes algorithm takes its name from the two words Naïve and Bayes, which can be described as:

 Naïve: It is called naïve because it assumes that the occurrence of a certain feature is independent of the occurrence of the other features. For example, if a fruit is identified on the basis of color, shape, and taste, then a red, spherical, and sweet fruit is recognized as an apple. Each feature contributes to the identification individually, without depending on the others.

 Bayes: It is called Bayes because it relies on the principle of Bayes' theorem.
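The independence assumption can be sketched in a few lines of Python. The per-feature likelihoods below are invented purely for illustration; they are not taken from any real fruit dataset:

```python
# Sketch of the naive independence assumption: P(red, spherical, sweet | apple)
# is approximated as P(red|apple) * P(spherical|apple) * P(sweet|apple).
# All numbers here are hypothetical.

feature_likelihoods = {  # hypothetical P(feature | apple)
    "red": 0.8,
    "spherical": 0.9,
    "sweet": 0.7,
}

def joint_likelihood(features, likelihoods):
    """Multiply per-feature likelihoods, as naive Bayes does."""
    result = 1.0
    for f in features:
        result *= likelihoods[f]
    return result

print(joint_likelihood(["red", "spherical", "sweet"], feature_likelihoods))
# 0.8 * 0.9 * 0.7 = 0.504
```

A non-naive model would instead need the full joint distribution over all feature combinations, which grows exponentially with the number of features.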



NAÏVE BAYES MODELS

•The Naïve Bayes algorithm is a supervised learning algorithm based on Bayes' theorem and used for solving classification problems.

•It is mainly used for text classification, which involves high-dimensional training datasets.

•The Naïve Bayes classifier is one of the simplest and most effective classification algorithms; it helps build fast machine learning models that can make quick predictions.

•It is a probabilistic classifier: it predicts on the basis of the probability of each class given an object's features.

•Popular applications of the Naïve Bayes algorithm include spam filtering, sentiment analysis, and article classification.


NAÏVE BAYES MODELS

•Bayes' theorem (also known as Bayes' rule or Bayes' law) is used to determine the probability of a hypothesis given prior knowledge. It depends on conditional probability.

•The formula for Bayes' theorem is:

P(A|B) = P(B|A) * P(A) / P(B)

Where,
P(A|B) is the posterior probability: the probability of hypothesis A given the observed evidence B.
P(B|A) is the likelihood: the probability of the evidence B given that hypothesis A is true.
P(A) is the prior probability: the probability of the hypothesis before observing the evidence.
P(B) is the marginal probability: the probability of the evidence.
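As a sketch, the theorem is a one-line function; the example call uses the values from the weather example worked out later in these slides (P(Yes) = 10/14, P(Sunny|Yes) = 3/10, P(Sunny) = 5/14):

```python
def posterior(prior_a, likelihood_b_given_a, marginal_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood_b_given_a * prior_a / marginal_b

# Values from the weather example in these slides:
# P(Yes) = 10/14, P(Sunny|Yes) = 3/10, P(Sunny) = 5/14.
print(round(posterior(10 / 14, 3 / 10, 5 / 14), 2))  # 0.6
```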
NAÏVE BAYES MODELS

Working of the Naïve Bayes classifier (illustrated with the example below):

•Suppose we have a dataset of weather conditions with a corresponding target variable "Play". Using this dataset, we need to decide whether we should play on a particular day, according to the weather conditions.

•To solve this problem, we follow the steps below:

1. Convert the given dataset into frequency tables.

2. Generate a likelihood table by finding the probabilities of the given features.

3. Use Bayes' theorem to calculate the posterior probability of each class.

NAÏVE BAYES MODELS

Problem: If the weather is sunny, then the Player should play or not?
Solution: To solve this, first consider the below dataset:

     Outlook    Play

0    Rainy      Yes
1    Sunny      Yes
2    Overcast   Yes
3    Overcast   Yes
4    Sunny      No
5    Rainy      Yes
6    Sunny      Yes
7    Overcast   Yes
8    Rainy      No
9    Sunny      No
10   Sunny      Yes
11   Rainy      No
12   Overcast   Yes
13   Overcast   Yes
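Step 1 (the frequency table) can be sketched directly from this dataset; a minimal Python version using `collections.Counter`:

```python
from collections import Counter

# The 14-row weather dataset from the slide, as (Outlook, Play) pairs.
data = [
    ("Rainy", "Yes"), ("Sunny", "Yes"), ("Overcast", "Yes"),
    ("Overcast", "Yes"), ("Sunny", "No"), ("Rainy", "Yes"),
    ("Sunny", "Yes"), ("Overcast", "Yes"), ("Rainy", "No"),
    ("Sunny", "No"), ("Sunny", "Yes"), ("Rainy", "No"),
    ("Overcast", "Yes"), ("Overcast", "Yes"),
]

# Step 1: frequency table of (outlook, play) pairs.
freq = Counter(data)
print(freq[("Sunny", "Yes")])    # 3
print(freq[("Sunny", "No")])     # 2
print(freq[("Overcast", "No")])  # 0 (Counter returns 0 for unseen pairs)
```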
NAÏVE BAYES MODELS
Frequency table for the weather conditions:

Weather     Yes   No
Overcast     5     0
Rainy        2     2
Sunny        3     2
Total       10     4

Likelihood table for the weather conditions:

Weather     No    Yes
Overcast     0     5            5/14 = 0.35
Rainy        2     2            4/14 = 0.29
Sunny        2     3            5/14 = 0.35
All      4/14 = 0.29   10/14 = 0.71
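Step 2 (the likelihood table) can be recomputed from these frequencies; a small sketch, where conditionals P(outlook | class) divide by the class count and priors P(class) divide by the dataset size:

```python
# (outlook, play) -> frequency, taken from the slide's frequency table.
counts = {
    ("Overcast", "Yes"): 5, ("Overcast", "No"): 0,
    ("Rainy", "Yes"): 2,    ("Rainy", "No"): 2,
    ("Sunny", "Yes"): 3,    ("Sunny", "No"): 2,
}
n_yes = sum(v for (o, c), v in counts.items() if c == "Yes")  # 10
n_no = sum(v for (o, c), v in counts.items() if c == "No")    # 4
n = n_yes + n_no                                              # 14

p_yes = n_yes / n                                                   # 10/14 ~ 0.71
p_sunny = (counts[("Sunny", "Yes")] + counts[("Sunny", "No")]) / n  # 5/14 ~ 0.35
p_sunny_given_yes = counts[("Sunny", "Yes")] / n_yes                # 3/10 = 0.3

print(round(p_yes, 2), round(p_sunny, 2), round(p_sunny_given_yes, 2))
```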



NAÏVE BAYES MODELS

Applying Bayes' theorem:

P(Yes|Sunny) = P(Sunny|Yes) * P(Yes) / P(Sunny)

P(Sunny|Yes) = 3/10 = 0.3

P(Sunny) = 5/14 = 0.35

P(Yes) = 10/14 = 0.71

So P(Yes|Sunny) = 0.3 * 0.71 / 0.35 = 0.60



NAÏVE BAYES MODELS

Applying Bayes' theorem (continued):

P(No|Sunny) = P(Sunny|No) * P(No) / P(Sunny)

P(Sunny|No) = 2/4 = 0.5

P(No) = 4/14 = 0.29

P(Sunny) = 5/14 = 0.35

So P(No|Sunny) = 0.5 * 0.29 / 0.35 = 0.41 (0.40 with unrounded fractions)

P(Yes|Sunny) = 0.60 and P(No|Sunny) = 0.41, so P(Yes|Sunny) > P(No|Sunny).

Hence, on a sunny day, the player can play the game.
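The whole comparison can be reproduced with the exact fractions from the tables (which give 0.60 and 0.40; the slide's 0.41 comes from rounding the intermediate values):

```python
# Reproducing the slide's comparison with exact fractions from the tables.
p_yes, p_no = 10 / 14, 4 / 14          # priors P(Yes), P(No)
p_sunny = 5 / 14                       # evidence P(Sunny)
p_sunny_yes, p_sunny_no = 3 / 10, 2 / 4  # likelihoods P(Sunny|Yes), P(Sunny|No)

post_yes = p_sunny_yes * p_yes / p_sunny  # P(Yes|Sunny) = 0.60
post_no = p_sunny_no * p_no / p_sunny     # P(No|Sunny)  = 0.40
prediction = "Yes" if post_yes > post_no else "No"
print(round(post_yes, 2), round(post_no, 2), prediction)
```

Note that the evidence term P(Sunny) is the same for both classes, so the prediction would be unchanged if it were dropped; naive Bayes classifiers often skip it and compare only the numerators.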
NAÏVE BAYES MODELS

Advantages of Naïve Bayes Classifier:

•Naïve Bayes is one of the fastest and simplest ML algorithms for predicting the class of a data point.

•It can be used for binary as well as multi-class classification.

•It performs well in multi-class prediction compared with many other algorithms.

•It is one of the most popular choices for text classification problems.

Disadvantages of Naïve Bayes Classifier:

•Naïve Bayes assumes that all features are independent, so it cannot learn relationships between features.
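This disadvantage can be demonstrated with the weather numbers themselves: if two perfectly correlated copies of the Sunny feature were present, naive Bayes would multiply the same likelihood twice, which in this hypothetical sketch flips the prediction:

```python
# Double-counting correlated evidence, using the weather example's numbers.
p_yes, p_no = 10 / 14, 4 / 14            # priors
p_sunny_yes, p_sunny_no = 3 / 10, 2 / 4  # likelihoods P(Sunny|class)

# One copy of the feature: the unnormalized score for "Yes" wins.
score_yes_1 = p_sunny_yes * p_yes
score_no_1 = p_sunny_no * p_no

# Two perfectly correlated copies: the likelihood is squared,
# and the score for "No" now wins -- the prediction flips.
score_yes_2 = p_sunny_yes ** 2 * p_yes
score_no_2 = p_sunny_no ** 2 * p_no

print(score_yes_1 > score_no_1)  # True
print(score_yes_2 > score_no_2)  # False
```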


NAÏVE BAYES MODELS

Applications of Naïve Bayes Classifier:

•It is used for credit scoring.

•It is used in medical data classification.

•It can be used for real-time prediction, because the Naïve Bayes classifier is an eager learner: it builds its model entirely at training time, so prediction is fast.

•It is used in text classification, such as spam filtering and sentiment analysis.
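As an illustration of the spam-filtering application, here is a toy multinomial Naïve Bayes filter with add-one (Laplace) smoothing; the tiny training corpus is invented purely for this sketch:

```python
from collections import Counter
from math import log

# A toy spam filter sketched as a multinomial naive Bayes classifier.
# The tiny "corpus" below is invented purely for illustration.
train = [
    ("win money now", "spam"),
    ("free money offer", "spam"),
    ("meeting schedule today", "ham"),
    ("project meeting notes", "ham"),
]

word_counts = {"spam": Counter(), "ham": Counter()}
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for c in word_counts.values() for w in c}

def classify(text):
    """Pick the class with the highest log-posterior (add-one smoothing)."""
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        total = sum(word_counts[label].values())
        score = log(class_counts[label] / sum(class_counts.values()))
        for w in text.split():
            # add-one smoothing avoids zero probabilities for unseen words
            score += log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("free money"))     # spam
print(classify("meeting today"))  # ham
```

Log-probabilities are summed instead of multiplying raw probabilities so that long documents do not underflow to zero.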

