Machine Learning CS5011 Assignment #2: Dr. B. Ravindran
Assignment #2
Submission Date: 07 October 2016
Avinash Sharma (CS16D401)
Q1: SVM
You have been provided with training instances for an image classification problem DS2. You
have to train an SVM to classify the test images into one of the following four categories:
coast, forest, inside-city, mountain.
Use the training data to build classification models using the following kernels: Linear,
Polynomial, Gaussian and Sigmoid.
Come up with the kernel parameters for the various models. You can use a fraction of the
supplied data to do an n-fold cross validation to find the best model parameters.
Linear Kernel
C = 20

Polynomial Kernel
C = 2
γ = 1
Degree = 4

Gaussian Kernel
C = 10
γ = 0.5

Sigmoid Kernel
C = 20
γ = 0.1
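A minimal scikit-learn sketch of how such models could be trained and the kernel parameters selected by cross validation. The synthetic data stands in for DS2, and the parameter grids are illustrative, not the grids used in the reported experiments:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for DS2's image features (4 classes: coast, forest,
# inside-city, mountain); replace with the real training data.
X, y = make_classification(n_samples=400, n_features=48, n_informative=10,
                           n_classes=4, random_state=0)

# One grid per kernel; the C / gamma / degree values are illustrative.
param_grids = {
    "linear":  {"C": [1, 10, 20]},
    "poly":    {"C": [1, 2, 10], "gamma": [0.1, 0.5, 1], "degree": [2, 3, 4]},
    "rbf":     {"C": [1, 10, 20], "gamma": [0.1, 0.5, 1]},  # Gaussian kernel
    "sigmoid": {"C": [1, 10, 20], "gamma": [0.1, 0.5, 1]},
}

for kernel, grid in param_grids.items():
    search = GridSearchCV(SVC(kernel=kernel), grid, cv=5)  # 5-fold CV
    search.fit(X, y)
    print(kernel, search.best_params_, round(search.best_score_, 3))
```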
Q2: Neural Networks
Implement the original back-propagation algorithm. Use DS2 for training your neural network.
Report per-class precision, recall and F-measure on the test data used in Question 1. Now
consider the alternate error function R, which adds an L2 regularization term weighted by λ
to the usual error E. Derive the gradient descent update rule for this definition of R. Now
train your neural network with this new error function. Report per-class precision, recall
and F-measure on the same test data. What will happen when you vary the value of λ?
Vary the value of λ from 10^-2 to 10^2 in multiples of 10, repeat the experiment and report
the results. Can you figure out the effect of λ on the results? Look at the weights learnt using
the new error function. What do you infer from them?
The gradient descent weight update rule for the L2-regularized error R = E + (λ/2N) Σ w² is given in Eq. (1) and (2), where η is the learning rate, N is the number of training examples, w_{km} are the hidden-to-output weights and w_{mj} the input-to-hidden weights:

\[
w_{km} \leftarrow w_{km}\left(1 - \frac{\eta\lambda}{N}\right) - \eta\,\frac{\partial E}{\partial w_{km}} \tag{1}
\]

\[
w_{mj} \leftarrow w_{mj}\left(1 - \frac{\eta\lambda}{N}\right) - \eta\,\frac{\partial E}{\partial w_{mj}} \tag{2}
\]
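A minimal NumPy sketch of this update for a single weight matrix; the names W, dE_dW, eta, lam and N are illustrative, not from the original implementation:

```python
import numpy as np

def l2_update(W, dE_dW, eta, lam, N):
    """One gradient step on the L2-regularized error R = E + (lam/(2*N)) * sum(W**2).

    The factor (1 - eta*lam/N) decays every weight toward zero before the
    usual back-propagated gradient of E is applied.
    """
    return W * (1.0 - eta * lam / N) - eta * dE_dW

# Example: a 3x2 weight matrix and a dummy gradient of E.
W = np.ones((3, 2))
dE_dW = np.full((3, 2), 0.5)
W = l2_update(W, dE_dW, eta=0.1, lam=1.0, N=100)
```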
On increasing λ, the accuracy, precision, recall and F-measure decrease. But at λ = 0.1 and
λ = 0.01 the performance in terms of these indices is better than without regularization,
due to the decrease in the variance of the fit. Further, as λ increases, the learned weights
shrink, since the regularization term essentially applies a penalty on the weights of the network.
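As a self-contained sanity check of this shrinkage effect, here is a sketch using scikit-learn's MLPClassifier rather than the hand-rolled back-propagation above; its alpha parameter is an L2 penalty analogous to λ, and the synthetic data stands in for DS2:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# alpha plays the role of lambda: sweep it in multiples of 10.
for alpha in [1e-2, 1e-1, 1e0, 1e1, 1e2]:
    clf = MLPClassifier(hidden_layer_sizes=(16,), alpha=alpha,
                        max_iter=2000, random_state=0).fit(X, y)
    # Total norm of the learned weight matrices: larger alpha should
    # drive it down, mirroring the effect of increasing lambda.
    w_norm = sum(np.linalg.norm(W) for W in clf.coefs_)
    print(f"alpha={alpha:g}  weight norm={w_norm:.2f}")
```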
Q3: Decision Tree
You need to use Weka for this question. We will use the Mushroom dataset from the UCI
machine learning repository (https://archive.ics.uci.edu/ml/datasets/Mushroom). This is a
2-class problem with 8124 instances. Use the last 1124 instances as test data and the rest as
training data.
Run the J48 Decision Tree algorithm from Weka. Report precision, recall and F1-measure.
What is the effect of MinNumObj on the performance? What happens when you do
reducedErrorPruning?
What are the important features in deciding whether a mushroom is edible or not?
Turn in the Decision Tree learnt by the model (the decision tree with the best performance).
Unpruned Tree:
Accuracy = 1
Precision = 1
Recall = 1
F1-measure = 1
By default the value of MinNumObj is 2. On increasing its value, the performance of the tree
remains the same up to MinNumObj = 24. After that, the performance indices reduce slightly:
Accuracy = 0.994
Precision = 0.985
Recall = 1
F1-measure = 0.993
On using reduced error pruning, the performance of the tree is not affected by changes in the
confidence interval, but at 5 folds it drops to the values reported above. The important features
are: odor, stalk-shape, spore-print-color, gill-size, gill-spacing and population.
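For reference, a minimal scikit-learn sketch of the same experiment; this is not the Weka run reported above, and DecisionTreeClassifier's min_samples_leaf is used as a rough analogue of J48's MinNumObj. The file name agaricus-lepiota.data is the UCI mushroom data file, with the class ('e'/'p') in the first column:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import precision_recall_fscore_support

# Load the UCI Mushroom data (22 categorical features, class in column 0).
cols = ["class"] + [f"f{i}" for i in range(22)]
df = pd.read_csv("agaricus-lepiota.data", header=None, names=cols)
X = pd.get_dummies(df.drop(columns="class"))  # one-hot encode categoricals
y = df["class"]

# Last 1124 instances as test data, the rest as training data.
X_tr, X_te, y_tr, y_te = X[:-1124], X[-1124:], y[:-1124], y[-1124:]

# min_samples_leaf plays a role similar to J48's MinNumObj.
for leaf in [2, 24, 32]:
    tree = DecisionTreeClassifier(min_samples_leaf=leaf, random_state=0)
    tree.fit(X_tr, y_tr)
    p, r, f, _ = precision_recall_fscore_support(
        y_te, tree.predict(X_te), average="binary", pos_label="e")
    print(f"min_samples_leaf={leaf}: P={p:.3f} R={r:.3f} F1={f:.3f}")
```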