What Is Naive Bayes?
For example, a fruit may be considered an apple if it is red, round, and about 3 inches in diameter.
The Naive Bayes classifier assumes that the presence of a feature in a class is unrelated to the presence of any other feature. Even if these features depend on each other or on the existence of the other features, each property independently contributes to the probability that a particular fruit is an apple, an orange, or a banana, and that is why it is known as "naive."
Bayes' theorem provides a way of calculating the posterior probability P(c|x) from P(c), P(x), and P(x|c):

    P(c|x) = P(x|c) * P(c) / P(x)

In this equation,
P(c|x) is the posterior probability of class (c, target) given predictor (x, attributes).
P(c) is the prior probability of class.
P(x|c) is the likelihood which is the probability of predictor given class.
P(x) is the prior probability of predictor.
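To make the four terms concrete, here is a minimal numeric sketch of the equation in Python. The probability values are made up purely for illustration; they are not taken from any data set in this tutorial.

```python
# Bayes' theorem: P(c|x) = P(x|c) * P(c) / P(x)
# All numbers below are illustrative assumptions.
p_c = 0.5          # P(c): prior probability of the class
p_x_given_c = 0.9  # P(x|c): likelihood of the predictor given the class
p_x = 0.6          # P(x): prior probability of the predictor

# P(c|x): posterior probability of the class given the predictor
p_c_given_x = p_x_given_c * p_c / p_x
print(p_c_given_x)  # 0.75
```

Observing the predictor raised the probability of the class from 0.5 (the prior) to 0.75 (the posterior), because the predictor is more likely under that class than overall.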
Let's understand it using an example. Below is a training data set of weather conditions and the corresponding target variable 'Play' (indicating whether a game was played). We need to classify whether players will play or not based on the weather conditions. Follow the steps below:
Step 1: Convert the data set into a frequency table
Step 2: Create a likelihood table by finding the probabilities, for example, the probability of Overcast is 0.29 and the probability of playing is 0.64.
Step 3: Now use the Naive Bayes equation to calculate the posterior probability for each class. The class with the highest posterior probability is the outcome of the prediction.
For this reason, P(H) is called the prior probability, while P(H|E) is called the posterior probability. The factor that relates the two, P(E|H)/P(E), is called the likelihood ratio. Using these terms, Bayes' theorem can be rephrased as:
"The posterior probability equals the prior probability times the likelihood ratio."
A little confused? Don't worry. Let's continue our Naive Bayes tutorial and understand this concept with a simple example.
First, we will create a frequency table using each attribute of the dataset.
Outlook = Rain
Humidity = High
Wind = Weak
Play = ?
So, with this data, we have to predict whether "we can play on that day or not."
Our model predicts that there is a 55% chance there will be a game tomorrow.
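A prediction like this can be sketched by applying the naive independence assumption: multiply the class prior by the per-feature likelihoods for each class, then normalize. The priors and likelihoods below are assumed values for illustration (the full frequency tables are not reproduced here), so the resulting posterior need not match the 55% quoted above; with real data, these numbers would be read off the likelihood tables from Step 2.

```python
# Query: Outlook=Rain, Humidity=High, Wind=Weak.
# Priors and conditional probabilities are illustrative assumptions.
priors = {"Yes": 9/14, "No": 5/14}
likelihoods = {
    "Yes": {"Outlook=Rain": 3/9, "Humidity=High": 3/9, "Wind=Weak": 6/9},
    "No":  {"Outlook=Rain": 2/5, "Humidity=High": 4/5, "Wind=Weak": 2/5},
}

# Naive independence: multiply the prior by each feature's likelihood.
scores = {}
for c in priors:
    score = priors[c]
    for p in likelihoods[c].values():
        score *= p
    scores[c] = score

# Normalize so posteriors sum to 1; P(x) cancels out of the comparison.
total = sum(scores.values())
posterior = {c: s / total for c, s in scores.items()}
print(max(posterior, key=posterior.get))  # "Yes" for these numbers
```

Note that the denominator P(x) is the same for every class, so for picking the most probable class the unnormalized scores are enough; normalization only matters when we want the actual probability, as in the 55% figure above.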