Data Mining: Ensemble Techniques (Introduction to Data Mining, 2nd Edition) by Tan, Steinbach, Karpatne, Kumar
Ensemble Techniques
Ensemble Methods
General Approach
[Diagram. Step 1: from the original training data D, create multiple data sets D1, D2, ..., Dt-1, Dt. Step 2: build one classifier per data set (C1, C2, ..., Ct-1, Ct). Step 3: combine the classifiers into a single ensemble classifier C*.]
Bagging

Sampling with replacement: each round draws n = 10 records uniformly with replacement from the original data, so some records appear multiple times and others not at all.
Original Data 1 2 3 4 5 6 7 8 9 10
Bagging (Round 1) 7 8 10 8 2 5 10 10 5 9
Bagging (Round 2) 1 4 9 1 2 3 2 7 3 2
Bagging (Round 3) 1 8 5 10 5 5 9 6 3 7
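Bootstrap sampling like the rounds above can be sketched in Python (a minimal illustration; the seed is arbitrary, so the draws will not match the table):

```python
import random

random.seed(1)  # arbitrary seed, for repeatability only

def bootstrap_round(records):
    """One bagging round: draw len(records) items with replacement."""
    return [random.choice(records) for _ in records]

original = list(range(1, 11))  # record ids 1..10
rounds = [bootstrap_round(original) for _ in range(3)]
```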
Bagging Example
Original Data:
x 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1
y 1 1 1 -1 -1 -1 -1 1 1 1
[Diagram: the base classifier is a one-level decision stump that tests x against a split point; the True (left) branch predicts y_left and the False (right) branch predicts y_right.]
2/14/18 Introduction to Data Mining, 2nd Edition 8
Bagging Example
● Assume test set is the same as the original data
● Use majority vote to determine the class of the ensemble classifier
Round x=0.1 x=0.2 x=0.3 x=0.4 x=0.5 x=0.6 x=0.7 x=0.8 x=0.9 x=1.0
1 1 1 1 -1 -1 -1 -1 -1 -1 -1
2 1 1 1 1 1 1 1 1 1 1
3 1 1 1 -1 -1 -1 -1 -1 -1 -1
4 1 1 1 -1 -1 -1 -1 -1 -1 -1
5 1 1 1 -1 -1 -1 -1 -1 -1 -1
6 -1 -1 -1 -1 -1 -1 -1 1 1 1
7 -1 -1 -1 -1 -1 -1 -1 1 1 1
8 -1 -1 -1 -1 -1 -1 -1 1 1 1
9 -1 -1 -1 -1 -1 -1 -1 1 1 1
10 1 1 1 1 1 1 1 1 1 1
Sum 2 2 2 -6 -6 -6 -6 2 2 2
Predicted Class (Sign) 1 1 1 -1 -1 -1 -1 1 1 1
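The majority vote in the table can be reproduced in a few lines of Python (predictions transcribed from the rows above):

```python
# Per-round predictions at x = 0.1 ... 1.0, transcribed from the table above
preds = [
    [ 1,  1,  1, -1, -1, -1, -1, -1, -1, -1],  # round 1
    [ 1,  1,  1,  1,  1,  1,  1,  1,  1,  1],  # round 2
    [ 1,  1,  1, -1, -1, -1, -1, -1, -1, -1],  # round 3
    [ 1,  1,  1, -1, -1, -1, -1, -1, -1, -1],  # round 4
    [ 1,  1,  1, -1, -1, -1, -1, -1, -1, -1],  # round 5
    [-1, -1, -1, -1, -1, -1, -1,  1,  1,  1],  # round 6
    [-1, -1, -1, -1, -1, -1, -1,  1,  1,  1],  # round 7
    [-1, -1, -1, -1, -1, -1, -1,  1,  1,  1],  # round 8
    [-1, -1, -1, -1, -1, -1, -1,  1,  1,  1],  # round 9
    [ 1,  1,  1,  1,  1,  1,  1,  1,  1,  1],  # round 10
]

# Majority vote: add up the +1/-1 votes per column, then take the sign
sums = [sum(col) for col in zip(*preds)]
ensemble = [1 if s > 0 else -1 for s in sums]
```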
Boosting

Records that are wrongly classified in one round have their weights increased, so they are more likely to be sampled in later rounds; note how record 4 appears more and more often below.
Original Data 1 2 3 4 5 6 7 8 9 10
Boosting (Round 1) 7 3 2 8 7 9 4 10 6 3
Boosting (Round 2) 5 4 9 4 2 5 1 7 4 2
Boosting (Round 3) 4 4 8 10 4 5 4 6 3 4
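Weighted resampling like this can be sketched with `random.choices`, which samples with replacement proportionally to relative weights (the weight values here are hypothetical, chosen only to show that heavily weighted records dominate the draw):

```python
import random

random.seed(7)  # arbitrary seed

# Hypothetical weights after one round: the four misclassified records
# (ids 4-7 here) carry most of the weight, so they dominate the next sample
weights = [0.01, 0.01, 0.01, 0.228, 0.228, 0.228, 0.228, 0.01, 0.01, 0.01]
records = list(range(1, 11))

# random.choices draws with replacement, proportional to the given weights
sample = random.choices(records, weights=weights, k=10)
```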
AdaBoost
● Error rate of base classifier Ci (δ(·) = 1 if its argument is true, 0 otherwise):

    εi = (1/N) Σ_{j=1..N} wj δ(Ci(xj) ≠ yj)

● Importance of classifier Ci:

    αi = (1/2) ln((1 − εi) / εi)

● Weight update:

    wi(j+1) = (wi(j) / Zj) · exp(−αj)   if Cj(xi) = yi
    wi(j+1) = (wi(j) / Zj) · exp(+αj)   if Cj(xi) ≠ yi

  where Zj is the normalization factor that makes the updated weights sum to 1

● If any intermediate round produces an error rate higher than 50%, the weights are reverted to 1/n and the resampling procedure is repeated

● Classification:

    C*(x) = argmax_y Σ_{j=1..T} αj δ(Cj(x) = y)
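The weight-update rule can be written as a small Python function (a minimal sketch; `correct` marks which records the round's classifier got right):

```python
import math

def adaboost_update(w, correct, alpha):
    """One AdaBoost weight update: shrink weights of correctly classified
    records by exp(-alpha), grow misclassified ones by exp(alpha), then
    divide by the normalization factor Z so the weights sum to 1."""
    unnorm = [wi * math.exp(-alpha if ok else alpha)
              for wi, ok in zip(w, correct)]
    z = sum(unnorm)  # Z_j
    return [wi / z for wi in unnorm]
```

A useful property of this update: with the optimal α = ½ ln((1 − ε)/ε), the updated weights of the misclassified records always sum to exactly ½, so the next round's classifier must attend to them.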
AdaBoost Algorithm
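The algorithm can be sketched end-to-end in Python. This is a minimal version: the base classifiers are one-node decision stumps, and instead of resampling, each stump is trained directly on the weighted records (a common simplification), so its rounds will not match the resampling-based tables that follow.

```python
import math

def train_stump(X, y, w):
    """Find the decision stump (split point, left-branch class) that
    minimizes the weighted error under the current weights w."""
    best = None
    for t in sorted(set(X)):
        for left in (1, -1):
            pred = [left if x <= t else -left for x in X]
            err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, t, left)
    return best

def adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n                 # start with uniform weights
    model = []                        # list of (alpha, split, left_class)
    for _ in range(rounds):
        eps, t, left = train_stump(X, y, w)
        eps = max(eps, 1e-10)         # guard against a perfect stump
        alpha = 0.5 * math.log((1 - eps) / eps)
        model.append((alpha, t, left))
        # reweight: up-weight mistakes, down-weight correct records
        pred = [left if x <= t else -left for x in X]
        w = [wi * math.exp(-alpha if p == yi else alpha)
             for wi, p, yi in zip(w, pred, y)]
        z = sum(w)                    # normalization factor Z
        w = [wi / z for wi in w]
    return model

def predict(model, x):
    """Weighted vote of the stumps: sign of sum_j alpha_j * C_j(x)."""
    s = sum(a * (left if x <= t else -left) for a, t, left in model)
    return 1 if s > 0 else -1
```

On the ten-record example data, three rounds of this sketch already classify every training record correctly, the same outcome the following slides reach with resampling.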
Original Data:
x 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1
y 1 1 1 -1 -1 -1 -1 1 1 1
[Diagram: the base classifier is a one-level decision stump that tests x against a split point; the True (left) branch predicts y_left and the False (right) branch predicts y_right.]
AdaBoost Example
● Summary:
Round Split Point Left Class Right Class alpha
1 0.75 -1 1 1.738
2 0.05 1 1 2.7784
3 0.3 1 -1 4.1195
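The alpha column follows the importance formula from the AdaBoost slide: each round's α is computed from that round's weighted error ε (the ε values themselves are not shown here). A quick sketch of the relationship in both directions:

```python
import math

def importance(eps):
    """AdaBoost importance of a classifier with weighted error eps."""
    return 0.5 * math.log((1 - eps) / eps)

def error_from_alpha(alpha):
    """Invert the formula: the weighted error implied by a given alpha."""
    return 1.0 / (1.0 + math.exp(2 * alpha))
```

For example, the round-1 value α ≈ 1.738 implies ε ≈ 0.03, i.e. the round-1 stump misclassified only about 3% of the total weight of its training sample.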
AdaBoost Example
● Weights of each record at the start of each boosting round:
Round x=0.1 x=0.2 x=0.3 x=0.4 x=0.5 x=0.6 x=0.7 x=0.8 x=0.9 x=1.0
1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
2 0.311 0.311 0.311 0.01 0.01 0.01 0.01 0.01 0.01 0.01
3 0.029 0.029 0.029 0.228 0.228 0.228 0.228 0.009 0.009 0.009
● Classification
Round x=0.1 x=0.2 x=0.3 x=0.4 x=0.5 x=0.6 x=0.7 x=0.8 x=0.9 x=1.0
1 -1 -1 -1 -1 -1 -1 -1 1 1 1
2 1 1 1 1 1 1 1 1 1 1
3 1 1 1 -1 -1 -1 -1 -1 -1 -1
Sum 5.16 5.16 5.16 -3.08 -3.08 -3.08 -3.08 0.397 0.397 0.397
Predicted Class (Sign) 1 1 1 -1 -1 -1 -1 1 1 1
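The Sum row above is the α-weighted vote of the three rounds; it can be reproduced from the summary table's alpha values and the per-round predictions:

```python
# Alpha values from the summary table and per-round predictions from the
# classification table, at x = 0.1 ... 1.0
alphas = [1.738, 2.7784, 4.1195]
preds = [
    [-1, -1, -1, -1, -1, -1, -1,  1,  1,  1],  # round 1
    [ 1,  1,  1,  1,  1,  1,  1,  1,  1,  1],  # round 2
    [ 1,  1,  1, -1, -1, -1, -1, -1, -1, -1],  # round 3
]

# Weighted vote: sum alpha_j * C_j(x) per column, then take the sign
sums = [sum(a * p for a, p in zip(alphas, col)) for col in zip(*preds)]
predicted = [1 if s > 0 else -1 for s in sums]
```

Note how the weak positive margin at x = 0.8 ... 1.0 (about 0.397) arises because round 3, with the largest α, votes the wrong way there and is barely outvoted by rounds 1 and 2 combined.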