Module 3- Bayesian Classifier (1)
Classification Is to Derive the Maximum Posteriori
Classification is to derive the maximum posteriori, i.e., the maximal P(Ci|X). This can be derived from Bayes' theorem. Since P(X) is constant for all classes, only P(X|Ci)P(Ci) needs to be maximized.
P(C_i \mid X) = \frac{P(X \mid C_i)\, P(C_i)}{P(X)}

P(C_i \mid X) \propto P(X \mid C_i)\, P(C_i)
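As a small illustration of this rule, the Python sketch below picks the class that maximizes P(X|Ci)P(Ci); the class names and probability values are placeholders, not taken from the slides.

```python
# Maximum posteriori (MAP) classification: choose the class Ci that
# maximizes P(X|Ci) * P(Ci).  P(X) is the same for every class, so it
# is dropped.  All values below are illustrative placeholders.
priors = {"C1": 0.6, "C2": 0.4}            # P(Ci)
likelihoods = {"C1": 0.03, "C2": 0.08}     # P(X|Ci) for a given tuple X

scores = {c: likelihoods[c] * priors[c] for c in priors}
map_class = max(scores, key=scores.get)
print(map_class, scores)                   # C2 {'C1': 0.018, 'C2': 0.032}
```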
Naïve Bayes Classifier
A simplified assumption: attributes are conditionally
independent (i.e., no dependence relation between
attributes):
P(X \mid C_i) = \prod_{k=1}^{n} P(x_k \mid C_i) = P(x_1 \mid C_i) \times P(x_2 \mid C_i) \times \cdots \times P(x_n \mid C_i)
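A minimal Python sketch of this product under the independence assumption, using hypothetical per-attribute conditional probabilities (not values from the slides):

```python
import math

# Hypothetical per-attribute conditional probabilities P(x_k | Ci) for one
# class Ci; under conditional independence, P(X|Ci) is simply their product.
p_xk_given_ci = [0.2, 0.5, 0.7, 0.6]       # placeholder values

p_x_given_ci = math.prod(p_xk_given_ci)    # P(X|Ci) = product of P(x_k|Ci)
print(p_x_given_ci)                        # 0.042
```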
If an attribute is continuous-valued, P(x_k \mid C_i) is usually computed based on a Gaussian distribution with mean \mu and standard deviation \sigma:

g(x, \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}

and P(x_k \mid C_i) = g(x_k, \mu_{C_i}, \sigma_{C_i}).
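A minimal Python sketch of this Gaussian likelihood, assuming the mean and standard deviation are estimated from the training tuples of class Ci (the numbers here are placeholders):

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian density g(x, mu, sigma) used for continuous-valued attributes."""
    coeff = 1.0 / (math.sqrt(2 * math.pi) * sigma)
    return coeff * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# Placeholder example: mean and std. dev. of a continuous attribute within class Ci.
mu_ci, sigma_ci = 38.0, 12.0
x_k = 35.0
print(gaussian(x_k, mu_ci, sigma_ci))   # P(x_k | Ci) under the Gaussian assumption
```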
How to Predict a Class Label Using Naïve Bayesian Classification?
Training tuples:

age     income   student   credit_rating   buys_computer
<=30    high     no        fair            no
<=30    high     no        excellent       no
31…40   high     no        fair            yes
>40     medium   no        fair            yes
>40     low      yes       fair            yes
>40     low      yes       excellent       no
31…40   low      yes       excellent       yes
<=30    medium   no        fair            no
<=30    low      yes       fair            yes
>40     medium   yes       fair            yes
<=30    medium   yes       excellent       yes
31…40   medium   no        excellent       yes
31…40   high     yes       fair            yes
>40     medium   no        excellent       no

Class priors P(Ci):
P(buys_computer = "yes") = 9/14 = 0.643
P(buys_computer = "no")  = 5/14 = 0.357
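The Python sketch below computes these priors and a naïve Bayes prediction directly from the table above; the unseen tuple x is an illustrative choice, not one taken from the slides.

```python
from collections import Counter

# Training tuples from the buys_computer table above:
# (age, income, student, credit_rating, buys_computer)
data = [
    ("<=30",  "high",   "no",  "fair",      "no"),
    ("<=30",  "high",   "no",  "excellent", "no"),
    ("31…40", "high",   "no",  "fair",      "yes"),
    (">40",   "medium", "no",  "fair",      "yes"),
    (">40",   "low",    "yes", "fair",      "yes"),
    (">40",   "low",    "yes", "excellent", "no"),
    ("31…40", "low",    "yes", "excellent", "yes"),
    ("<=30",  "medium", "no",  "fair",      "no"),
    ("<=30",  "low",    "yes", "fair",      "yes"),
    (">40",   "medium", "yes", "fair",      "yes"),
    ("<=30",  "medium", "yes", "excellent", "yes"),
    ("31…40", "medium", "no",  "excellent", "yes"),
    ("31…40", "high",   "yes", "fair",      "yes"),
    (">40",   "medium", "no",  "excellent", "no"),
]

class_counts = Counter(row[-1] for row in data)       # yes = 9, no = 5
priors = {c: class_counts[c] / len(data) for c in class_counts}   # P(Ci)

def likelihood(x, c):
    """P(X|Ci) under the naive (conditional independence) assumption."""
    rows_c = [row for row in data if row[-1] == c]
    p = 1.0
    for k, value in enumerate(x):
        p *= sum(1 for row in rows_c if row[k] == value) / len(rows_c)  # P(x_k|Ci)
    return p

# Hypothetical unseen tuple (illustrative choice, not from the slides):
x = ("<=30", "medium", "yes", "fair")
scores = {c: likelihood(x, c) * priors[c] for c in priors}
print(priors)                      # P(yes) = 0.643, P(no) = 0.357
print(max(scores, key=scores.get)) # class with the larger P(X|Ci)P(Ci)
```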
Thank you….