Decision Tree
Decision trees in machine learning
Iterative Dichotomiser 3 (ID3)

ID3 uses the information gain measure to select among the candidate attributes at each step while growing the tree.

Entropy example: S is a collection of 14 examples of a Boolean concept, with 9 positive and 5 negative examples, written [9+, 5-]. The entropy of S relative to this Boolean classification is:

Entropy([9+, 5-]) = -(9/14) log2(9/14) - (5/14) log2(5/14) = 0.940

The ID3 algorithm

ID3(Examples, Target_attribute, Attributes). Examples are the training examples. Target_attribute is the attribute whose value is to be predicted by the tree. Attributes is a list of other attributes that may be tested by the learned decision tree. Returns a decision tree that correctly classifies the given Examples.
- Create a Root node for the tree.
- If all Examples are positive, return the single-node tree Root with label = +.
- If all Examples are negative, return the single-node tree Root with label = -.
- If Attributes is empty, return the single-node tree Root with label = the most common value of Target_attribute in Examples.
- Otherwise:
  - A <- the attribute from Attributes that "best" classifies Examples.
  - The decision attribute for Root <- A.
  - For each possible value vi of A:
    - Add a new tree branch below Root, corresponding to the test A = vi.
    - Let Examples_vi be the subset of Examples that have value vi for A.
    - If Examples_vi is empty, then below this new branch add a leaf node with label = the most common value of Target_attribute in Examples.
    - Else, below this new branch add the subtree ID3(Examples_vi, Target_attribute, Attributes - {A}).
- Return Root.

Which attribute is the best classifier?

S contains [29+, 35-], so E([29+, 35-]) = 0.99. Splitting on attribute A produces the subsets [21+, 5-] and [8+, 30-]; splitting on attribute B produces [18+, 33-] and [11+, 2-].

E([21+, 5-]) = 0.71   E([8+, 30-]) = 0.74
E([18+, 33-]) = 0.94  E([11+, 2-]) = 0.62

Gain(S, A) = Entropy(S) - (26/64) Entropy([21+, 5-]) - (38/64) Entropy([8+, 30-]) = 0.27
Gain(S, B) = Entropy(S) - (51/64) Entropy([18+, 33-]) - (13/64) Entropy([11+, 2-]) = 0.12

A provides greater information gain than B, so A is a better classifier than B.
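Before moving to the PlayTennis example, here is a compact Python rendering of the ID3 pseudocode above. This is a minimal sketch, not code from the original slides; the function and variable names are my own. Because it branches only on attribute values actually observed in the data, the empty-branch case from the pseudocode never arises here.

# Minimal ID3 sketch. Examples are dicts mapping attribute names to values;
# `target` names the class attribute to be predicted.
from collections import Counter
from math import log2

def entropy(examples, target):
    # Entropy (in bits) of the target-attribute labels in `examples`.
    counts = Counter(e[target] for e in examples)
    n = len(examples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def information_gain(examples, attr, target):
    # Entropy reduction obtained by splitting `examples` on `attr`.
    remainder = 0.0
    for value in {e[attr] for e in examples}:
        subset = [e for e in examples if e[attr] == value]
        remainder += len(subset) / len(examples) * entropy(subset, target)
    return entropy(examples, target) - remainder

def id3(examples, target, attributes):
    labels = [e[target] for e in examples]
    # All examples share one label: return a single leaf with that label.
    if len(set(labels)) == 1:
        return labels[0]
    # No attributes left to test: return the most common label.
    if not attributes:
        return Counter(labels).most_common(1)[0][0]
    # Choose the attribute that "best" classifies the examples (highest gain).
    best = max(attributes, key=lambda a: information_gain(examples, a, target))
    tree = {best: {}}
    for value in {e[best] for e in examples}:
        subset = [e for e in examples if e[best] == value]
        tree[best][value] = id3(subset, target, [a for a in attributes if a != best])
    return tree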
PlayTennis: the training data

The target concept is PlayTennis, learned from 14 days of observations. The concept to be learned turns out to be: (Outlook = Sunny AND Humidity = Normal) OR (Outlook = Overcast) OR (Outlook = Rain AND Wind = Weak).

Day  Outlook   Temperature  Humidity  Wind    PlayTennis
D1   Sunny     Hot          High      Weak    No
D2   Sunny     Hot          High      Strong  No
D3   Overcast  Hot          High      Weak    Yes
D4   Rain      Mild         High      Weak    Yes
D5   Rain      Cool         Normal    Weak    Yes
D6   Rain      Cool         Normal    Strong  No
D7   Overcast  Cool         Normal    Strong  Yes
D8   Sunny     Mild         High      Weak    No
D9   Sunny     Cool         Normal    Weak    Yes
D10  Rain      Mild         Normal    Weak    Yes
D11  Sunny     Mild         Normal    Strong  Yes
D12  Overcast  Mild         High      Strong  Yes
D13  Overcast  Hot          Normal    Weak    Yes
D14  Rain      Mild         High      Strong  No

There are 14 cases, 9 of them positive.

Step 1: Calculate the entropy over all cases.
Npos = 9, Nneg = 5, Ntotal = 14
E(S) = -(9/14) log2(9/14) - (5/14) log2(5/14) = 0.940

(Figure: a second illustration of the entropy calculation, splitting a parent population on Balance < 50K vs. Balance >= 50K.)

Step 2: Loop over all attributes and calculate the gain of each.

Attribute = Outlook. Loop over the values of Outlook:
- Outlook = Sunny: Npos = 2, Nneg = 3, Ntotal = 5
  E(Sunny) = -(2/5) log2(2/5) - (3/5) log2(3/5) = 0.971
- Outlook = Overcast: Npos = 4, Nneg = 0, Ntotal = 4
  E(Overcast) = -(4/4) log2(4/4) - (0/4) log2(0/4) = 0.00
- Outlook = Rain: Npos = 3, Nneg = 2, Ntotal = 5
  E(Rain) = -(3/5) log2(3/5) - (2/5) log2(2/5) = 0.971

Calculate the information gain for attribute Outlook:
Gain(S, Outlook) = E(S) - (5/14)*0.971 - (4/14)*0.00 - (5/14)*0.971 = 0.246

Attribute = Temperature (repeat the process, looping over {Hot, Mild, Cool}): Gain(S, Temperature) = 0.029
Attribute = Humidity (repeat the process, looping over {High, Normal}): Gain(S, Humidity) = 0.151
Attribute = Wind (repeat the process, looping over {Weak, Strong}): Gain(S, Wind) = 0.048

Find the attribute with the greatest information gain:
Gain(S, Outlook) = 0.246, Gain(S, Temperature) = 0.029, Gain(S, Humidity) = 0.151, Gain(S, Wind) = 0.048
Therefore Outlook is the root node of the tree.

Complicated tree vs. simple tree (figure): the same data can be fit either by a complicated tree that tests Temperature, Humidity and Windy across many branches, or by the simple tree that tests Outlook at the root; ID3's greedy, gain-based choice produces the simple tree.
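The hand calculation above can be reproduced with the entropy(), information_gain() and id3() helpers from the earlier sketch. The tuple encoding of the table below is my own, not part of the slides; the printed values match the hand-computed ones up to rounding.

# Reuses entropy(), information_gain() and id3() from the sketch earlier in the notes.
COLUMNS = ["Outlook", "Temperature", "Humidity", "Wind", "PlayTennis"]
ROWS = [
    ("Sunny", "Hot", "High", "Weak", "No"),           # D1
    ("Sunny", "Hot", "High", "Strong", "No"),          # D2
    ("Overcast", "Hot", "High", "Weak", "Yes"),        # D3
    ("Rain", "Mild", "High", "Weak", "Yes"),           # D4
    ("Rain", "Cool", "Normal", "Weak", "Yes"),         # D5
    ("Rain", "Cool", "Normal", "Strong", "No"),        # D6
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),   # D7
    ("Sunny", "Mild", "High", "Weak", "No"),           # D8
    ("Sunny", "Cool", "Normal", "Weak", "Yes"),        # D9
    ("Rain", "Mild", "Normal", "Weak", "Yes"),         # D10
    ("Sunny", "Mild", "Normal", "Strong", "Yes"),      # D11
    ("Overcast", "Mild", "High", "Strong", "Yes"),     # D12
    ("Overcast", "Hot", "Normal", "Weak", "Yes"),      # D13
    ("Rain", "Mild", "High", "Strong", "No"),          # D14
]
examples = [dict(zip(COLUMNS, row)) for row in ROWS]

print(round(entropy(examples, "PlayTennis"), 3))        # 0.940
for attr in COLUMNS[:-1]:
    print(attr, round(information_gain(examples, attr, "PlayTennis"), 3))
# Outlook 0.247, Temperature 0.029, Humidity 0.152, Wind 0.048
# (matches the hand calculation up to rounding), so Outlook becomes the root.

print(id3(examples, "PlayTennis", COLUMNS[:-1]))
# {'Outlook': {'Sunny': {'Humidity': {'High': 'No', 'Normal': 'Yes'}},
#              'Overcast': 'Yes',
#              'Rain': {'Wind': {'Weak': 'Yes', 'Strong': 'No'}}}}
# (branch order may vary from run to run)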
ID3 on the Sunny branch: which attribute should be tested here?

Splitting on Outlook partitions the data into:
- Sunny: {D1, D2, D8, D9, D11} = [2+, 3-]
- Overcast: {D3, D7, D12, D13} = [4+, 0-]
- Rain: {D4, D5, D6, D10, D14} = [3+, 2-]

For the Sunny branch, S_sunny = {D1, D2, D8, D9, D11} with entropy 0.970:
Gain(S_sunny, Humidity) = 0.970 - (3/5)*0.0 - (2/5)*0.0 = 0.970
Gain(S_sunny, Temperature) = 0.970 - (2/5)*0.0 - (2/5)*1.0 - (1/5)*0.0 = 0.570
Gain(S_sunny, Wind) = 0.970 - (2/5)*1.0 - (3/5)*0.918 = 0.019
So Humidity will be selected for the Sunny branch.

ID3 - Result

Outlook is the root. The Sunny branch tests Humidity: High -> No {D1, D2, D8}, Normal -> Yes {D9, D11}. The Overcast branch is a Yes leaf {D3, D7, D12, D13}. The Rain branch tests Wind: Strong -> No {D6, D14}, Weak -> Yes {D4, D5, D10}.

Inductive Bias of ID3
- Shorter trees are preferred over longer trees.
- Trees that place high-information-gain attributes close to the root are preferred over those that do not.

Inductive Bias in ID3 - Occam's Razor

Occam's Razor: prefer the simplest hypothesis that fits the data. The answer that requires the fewest assumptions is generally the correct one.

Why prefer short hypotheses?
Argument in favor:
- There are fewer short hypotheses than long hypotheses.
- A short hypothesis that fits the data is unlikely to be a coincidence.
- A long hypothesis that fits the data might be a coincidence.
Argument opposed:
- There are many ways to define small sets of hypotheses.
- What is so special about small sets based on the size of the hypothesis?

Ockham's Razor, illustrated: why did the tree fall down? Two explanations:
1. The wind knocked down the tree.
2. Two meteorites: one hit the tree and knocked it down, then hit the other meteorite, obliterating the evidence of its existence.

Decision Tree Advantages
- Inexpensive to construct.
- Extremely fast at classifying unknown records: O(d), where d is the depth of the tree.
- The presence of redundant attributes does not adversely affect accuracy: one of two redundant attributes will simply not be used for splitting once the other is chosen.
- Nonparametric approach: does not require any prior assumptions about probability distributions, means, variances, etc.
- Easy to interpret for small-sized trees.
- Robust to noise (especially when methods to avoid overfitting are employed).
- Can easily handle redundant or irrelevant attributes (unless the attributes are interacting).
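To illustrate the "extremely fast to classify" and "easy to interpret" points, the tree learned above can be written out as plain nested conditionals. This is a sketch of my own; the function name play_tennis is not from the slides.

def play_tennis(outlook, humidity, wind):
    # Classifies a day with the tree ID3 learned above.
    # Cost is O(d): at most two attribute tests per prediction.
    if outlook == "Overcast":
        return "Yes"
    if outlook == "Sunny":
        return "Yes" if humidity == "Normal" else "No"
    if outlook == "Rain":
        return "Yes" if wind == "Weak" else "No"
    raise ValueError(f"unknown Outlook value: {outlook}")

print(play_tennis("Sunny", "High", "Weak"))   # No  (matches D1)
print(play_tennis("Rain", "High", "Strong"))  # No  (matches D14)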