Part Five - Extra PDF
Supervised Learning
Classification & Prediction
Decision Tree
▪ An internal node is a test on an attribute.
▪ A branch represents an outcome of the test, e.g.,
Color=red.
▪ A leaf node represents a class label or class label
distribution.
▪ At each node, one attribute is chosen to split the training
examples into subsets that are as class-pure as possible.
▪ A new case is classified by following the matching path
from the root to a leaf node.
Weather Data: Play or not Play?
Outlook   Temperature  Humidity  Windy  Play?
sunny     hot          high      false  No
sunny     hot          high      true   No
overcast  hot          high      false  Yes
rain      mild         high      false  Yes
rain      cool         normal    false  Yes
rain      cool         normal    true   No
overcast  cool         normal    true   Yes
sunny     mild         high      false  No
sunny     cool         normal    false  Yes
rain      mild         normal    false  Yes
sunny     mild         normal    true   Yes
overcast  mild         high      true   Yes
overcast  hot          normal    false  Yes
rain      mild         high      true   No
Example Tree for “Play?”
Outlook
├─ sunny    → Humidity
│            ├─ high   → No
│            └─ normal → Yes
├─ overcast → Yes
└─ rain     → Windy
             ├─ true  → No
             └─ false → Yes
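As a minimal sketch (an assumed representation, not from the slides), the same tree can be written as nested Python dictionaries, and a new case classified by following the matching path from the root to a leaf:

```python
# Sketch: an internal node is {attribute: {branch_value: subtree}};
# a leaf is just a class label string.
tree = {
    "Outlook": {
        "sunny":    {"Humidity": {"high": "No", "normal": "Yes"}},
        "overcast": "Yes",
        "rain":     {"Windy": {"true": "No", "false": "Yes"}},
    }
}

def classify(node, case):
    """Classify a case by following the matching path to a leaf node."""
    while isinstance(node, dict):
        attribute, branches = next(iter(node.items()))  # the test at this node
        node = branches[case[attribute]]                # follow the matching branch
    return node

case = {"Outlook": "sunny", "Humidity": "normal", "Windy": "false"}
print(classify(tree, case))  # -> Yes
```

Note that only the attributes on the matching path are ever tested: the case above never consults "Windy".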
Building a Decision Tree [Q93]
Choosing the Splitting Attribute
Which attribute to select?
A criterion for attribute selection
▪ Information gain: choose the attribute whose split gives the
largest reduction in entropy, where for class proportions
p1, …, pn, entropy(p1, …, pn) = −p1 log(p1) − … − pn log(pn)
(logs base 2, measured in bits).
Claude Shannon, “Father of information theory”
Born: 30 April 1916; Died: 23 February 2001
Claude Shannon, who has died aged 84, perhaps
more than anyone laid the groundwork for today’s
digital revolution. His exposition of information
theory, stating that all information could be
represented mathematically as a succession of
noughts and ones, facilitated the digital
manipulation of data without which today’s
information society would be unthinkable.
Shannon’s master’s thesis, obtained in 1940 at MIT,
demonstrated that problem solving could be
achieved by manipulating the symbols 0 and 1 in a
process that could be carried out automatically with
electrical circuitry. That dissertation has been
hailed as one of the most significant master’s
theses of the 20th century. Eight years later,
Shannon published another landmark paper, A
Mathematical Theory of Communication, generally
taken as his most important scientific contribution.
Shannon applied the same radical approach to cryptography research, in which he later
became a consultant to the US government.
Many of Shannon’s pioneering insights were developed before they could be applied in
practical form. He was truly a remarkable man, yet unknown to most of the world.
Example: attribute “Outlook”, 1
(The weather data above, partitioned on “Outlook”: sunny → 2 Yes, 3 No;
overcast → 4 Yes, 0 No; rain → 3 Yes, 2 No. Overall: 9 Yes, 5 No.)
Example: attribute “Outlook”, 2
▪ “Outlook” = “Sunny”:
  info([2,3]) = entropy(2/5, 3/5) = −2/5 log(2/5) − 3/5 log(3/5) = 0.971 bits
▪ “Outlook” = “Overcast”:
  info([4,0]) = entropy(1, 0) = −1 log(1) − 0 log(0) = 0 bits
  (Note: log(0) is not defined, but we evaluate 0 × log(0) as zero.)
▪ “Outlook” = “Rainy”:
  info([3,2]) = entropy(3/5, 2/5) = −3/5 log(3/5) − 2/5 log(2/5) = 0.971 bits
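These values can be checked in a few lines of Python (a sketch; the entropy helper is an assumption, not a name from the slides):

```python
from math import log2

def entropy(*probs):
    """Entropy in bits; terms with p = 0 are dropped, i.e. 0*log(0) is taken as 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy(2/5, 3/5))  # "Sunny":    ~0.971 bits
print(entropy(1, 0))      # "Overcast":  0.0 bits
print(entropy(3/5, 2/5))  # "Rainy":    ~0.971 bits
```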
Example: attribute “Humidity”
▪ “Humidity” = “High”:
  info([3,4]) = entropy(3/7, 4/7) = −3/7 log(3/7) − 4/7 log(4/7) = 0.985 bits
▪ “Humidity” = “Normal”:
  info([6,1]) = entropy(6/7, 1/7) = −6/7 log(6/7) − 1/7 log(1/7) = 0.592 bits
▪ Information gain:
  info([9,5]) − info([3,4], [6,1]) = 0.940 − 0.788 = 0.152 bits
Computing the information gain
▪ Information gain:
  (information before split) − (information after split)
▪ Example:
  gain(“Outlook”) = info([9,5]) − info([2,3], [4,0], [3,2]) = 0.940 − 0.693 = 0.247 bits
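As a sketch of the whole computation (the info/gain helpers are assumptions, not names from the slides): the information after a split is the size-weighted average of the subset entropies.

```python
from math import log2

def info(counts):
    """Entropy in bits of a class distribution given as counts; 0*log(0) -> 0."""
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

def gain(before, subsets):
    """(information before split) - (size-weighted information after split)."""
    n = sum(before)
    after = sum(sum(s) / n * info(s) for s in subsets)
    return info(before) - after

# "Outlook" splits [9 Yes, 5 No] into sunny [2,3], overcast [4,0], rain [3,2]:
print(gain([9, 5], [[2, 3], [4, 0], [3, 2]]))  # ~0.247 bits
print(gain([9, 5], [[3, 4], [6, 1]]))          # "Humidity": ~0.152 bits
```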
The final decision tree
(The tree shown earlier: split on “Outlook”; under “sunny” split on
“Humidity”, under “rain” split on “Windy”; “overcast” is a pure “Yes” leaf.)
Highly-branching attributes
Weather Data with ID code
ID  Outlook   Temperature  Humidity  Windy  Play?
A   sunny     hot          high      false  No
B   sunny     hot          high      true   No
C   overcast  hot          high      false  Yes
D   rain      mild         high      false  Yes
E   rain      cool         normal    false  Yes
F   rain      cool         normal    true   No
G   overcast  cool         normal    true   Yes
H   sunny     mild         high      false  No
I   sunny     cool         normal    false  Yes
J   rain      mild         normal    false  Yes
K   sunny     mild         normal    true   Yes
L   overcast  mild         high      true   Yes
M   overcast  hot          normal    false  Yes
N   rain      mild         high      true   No
Gain ratio
GainRatio(S, A) = Gain(S, A) / IntrinsicInfo(S, A)
Computing the gain ratio
▪ Example: intrinsic information for ID code:
  info([1,1,…,1]) = 14 × (−1/14 × log(1/14)) = 3.807 bits
▪ Example: gain ratio for ID code:
  gain_ratio(“ID_code”) = 0.940 bits / 3.807 bits = 0.247
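A sketch of the same arithmetic (the helper name is an assumption): the intrinsic information of a split is the entropy of its subset-size distribution, and the ID code splits the 14 examples into 14 one-example subsets.

```python
from math import log2

def split_info(sizes):
    """Intrinsic information: entropy of the subset-size distribution, in bits."""
    n = sum(sizes)
    return -sum(s / n * log2(s / n) for s in sizes if s > 0)

intrinsic = split_info([1] * 14)  # 14 * (1/14 * log2(14)) = log2(14) = 3.807 bits
print(intrinsic)
print(0.940 / intrinsic)          # gain ratio for ID_code: ~0.247
```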
Gain ratios for weather data
Attribute    Info   Gain                   Split info              Gain ratio
Outlook      0.693  0.940 − 0.693 = 0.247  info([5,4,5]) = 1.577   0.247/1.577 = 0.156
Temperature  0.911  0.940 − 0.911 = 0.029  info([4,6,4]) = 1.557   0.029/1.557 = 0.019
Humidity     0.788  0.940 − 0.788 = 0.152  info([7,7]) = 1.000     0.152/1.000 = 0.152
Windy        0.892  0.940 − 0.892 = 0.048  info([8,6]) = 0.985     0.048/0.985 = 0.049
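The whole table can be reproduced with a short script (a self-contained sketch; the per-value class counts are read off the weather table above):

```python
from math import log2

def info(counts):
    """Entropy in bits of a distribution given as counts; 0*log(0) -> 0."""
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

# (Yes, No) counts for each attribute value, taken from the weather data:
splits = {
    "Outlook":     [[2, 3], [4, 0], [3, 2]],  # sunny, overcast, rain
    "Temperature": [[2, 2], [4, 2], [3, 1]],  # hot, mild, cool
    "Humidity":    [[3, 4], [6, 1]],          # high, normal
    "Windy":       [[3, 3], [6, 2]],          # true, false
}

before = info([9, 5])  # 0.940 bits before any split
for name, subsets in splits.items():
    n = sum(sum(s) for s in subsets)                    # 14 examples
    after = sum(sum(s) / n * info(s) for s in subsets)  # weighted info after split
    g = before - after                                  # information gain
    si = info([sum(s) for s in subsets])                # split info (subset sizes)
    print(f"{name:12s} info={after:.3f} gain={g:.3f} "
          f"split_info={si:.3f} gain_ratio={g / si:.3f}")
```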
Summary
Discussion