
Course Code : CAT 302 GHXW/MW – 22 / 1632

Fifth Semester B. Tech. ( Computer Science and Engineering /


Artificial Intelligence and Machine Learning ) Examination

MACHINE LEARNING

Time : 3 Hours ] [ Max. Marks : 60

Instructions to Candidates :—
(1) All questions are compulsory.
(2) All questions carry marks as indicated.
(3) Explain your answer with neat sketches, wherever applicable.

1. (a) Write an algorithm that considers only positive examples to find a hypothesis
w.r.t. the training data. 4(CO1)
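The classic algorithm of this kind is Find-S, which keeps the most specific hypothesis consistent with the positive examples seen so far. A minimal sketch follows; the attribute names and toy data are illustrative assumptions, not taken from the question paper.

```python
# A minimal sketch of the Find-S algorithm: start from the first positive
# example and generalise any attribute that disagrees with a later positive
# example to "?".  Negative examples are ignored entirely.

def find_s(examples):
    """examples: list of (attribute_tuple, label) pairs with 'Yes'/'No' labels."""
    hypothesis = None
    for attrs, label in examples:
        if label != "Yes":                    # Find-S skips negative examples
            continue
        if hypothesis is None:
            hypothesis = list(attrs)          # most specific: the first positive
        else:
            for i, value in enumerate(attrs):
                if hypothesis[i] != value:    # generalise mismatching attributes
                    hypothesis[i] = "?"
    return hypothesis

# Illustrative toy data (assumed, not from the paper):
data = [
    (("Sunny", "Warm", "Normal"), "Yes"),
    (("Sunny", "Warm", "High"),   "Yes"),
    (("Rainy", "Cold", "High"),   "No"),
]
print(find_s(data))  # ['Sunny', 'Warm', '?']
```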

(b) For the following confusion matrix :

Find :
(i) Accuracy.

(ii) F-Measure.

(iii) Kappa value. 6(CO1)
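The confusion matrix itself is not reproduced in this copy of the paper, so the counts below are an illustrative assumption; the formulas, however, are the standard ones asked for in (i)-(iii).

```python
# Accuracy, F-measure and Cohen's kappa from a 2x2 confusion matrix.
# The counts are assumed for illustration (the paper's matrix is missing here).

tp, fn = 45, 5    # actual positive row
fp, tn = 10, 40   # actual negative row
n = tp + fn + fp + tn

accuracy = (tp + tn) / n
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f_measure = 2 * precision * recall / (precision + recall)

# Cohen's kappa: observed agreement corrected for chance agreement.
p_o = accuracy
p_e = ((tp + fn) / n) * ((tp + fp) / n) + ((fp + tn) / n) * ((fn + tn) / n)
kappa = (p_o - p_e) / (1 - p_e)

print(round(accuracy, 3), round(f_measure, 3), round(kappa, 3))
# → 0.85 0.857 0.7
```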



2. (a) Find out the root attribute by using ID3 for constructing decision tree for
the following dataset :

      ChoiceBy   Range       MonthEnd   Whether to Buy?

X1    Male       100-500     Yes        Yes
X2    Female     500-1000    Yes        No
X3    Male       100-500     No         Yes
X4    Male       500-1000    Yes        No
X5    Female     100-500     No         Yes
X6    Female     100-500     Yes        Yes
X7    Female     1000-1500   No         Yes
X8    Male       1000-1500   No         No
X9    Male       500-1000    No         No
X10   Female     1000-1500   Yes        Yes

6(CO2)
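ID3 picks as root the attribute with the highest information gain, i.e. the largest reduction in label entropy. A sketch over the dataset above, with attributes ordered (ChoiceBy, Range, MonthEnd):

```python
from collections import Counter
from math import log2

# Information gain for each attribute of the dataset in 2(a).
# Each row is (ChoiceBy, Range, MonthEnd, Whether to Buy?).

data = [
    ("Male",   "100-500",   "Yes", "Yes"),
    ("Female", "500-1000",  "Yes", "No"),
    ("Male",   "100-500",   "No",  "Yes"),
    ("Male",   "500-1000",  "Yes", "No"),
    ("Female", "100-500",   "No",  "Yes"),
    ("Female", "100-500",   "Yes", "Yes"),
    ("Female", "1000-1500", "No",  "Yes"),
    ("Male",   "1000-1500", "No",  "No"),
    ("Male",   "500-1000",  "No",  "No"),
    ("Female", "1000-1500", "Yes", "Yes"),
]
attributes = ["ChoiceBy", "Range", "MonthEnd"]

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum(c / total * log2(c / total) for c in counts.values())

def info_gain(rows, idx):
    base = entropy([r[-1] for r in rows])
    remainder = 0.0
    for v in {r[idx] for r in rows}:
        subset = [r[-1] for r in rows if r[idx] == v]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

gains = {a: info_gain(data, i) for i, a in enumerate(attributes)}
root = max(gains, key=gains.get)
print(root)  # Range has the highest gain (≈ 0.695), so it becomes the root
```

MonthEnd splits the labels 3 Yes / 2 No on both sides, so its gain is exactly 0; Range produces two pure partitions, which is why it dominates.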
(b) Apply KNN algorithm on following dataset to find the value of "C" when
A = 4 and B = 7 for the value of K = 3 :

A   B   C

1   6   Yes
4   5   No
6   4   No
2   6   Yes
8   4   Yes
                                                          4(CO2)
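A sketch of the 3-NN classification for the query (A = 4, B = 7). The question does not name a distance metric, so Euclidean distance is assumed here.

```python
from collections import Counter
from math import dist

# 3-nearest-neighbour vote for the query point (A=4, B=7); Euclidean
# distance is an assumption, since the question does not specify a metric.

train = [((1, 6), "Yes"), ((4, 5), "No"), ((6, 4), "No"),
         ((2, 6), "Yes"), ((8, 4), "Yes")]
query, k = (4, 7), 3

neighbours = sorted(train, key=lambda p: dist(p[0], query))[:k]
label = Counter(lbl for _, lbl in neighbours).most_common(1)[0][0]
print(label)  # the three nearest points vote 2-1, giving C = "Yes"
```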

3. Consider the two-dimensional patterns (2, 1), (3, 5), (4, 3), (5, 6), (6, 7), (7, 8).
Compute the principal component using the PCA algorithm. 10(CO2)
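The PCA steps (centre the data, form the covariance matrix, take the top eigenvector) can be sketched as follows:

```python
import numpy as np

# PCA on the six 2-D patterns from question 3: centre the data, compute the
# sample covariance matrix, and read off the eigenvector with the largest
# eigenvalue as the principal component.

X = np.array([(2, 1), (3, 5), (4, 3), (5, 6), (6, 7), (7, 8)], dtype=float)
X_centred = X - X.mean(axis=0)          # mean is (4.5, 5)

cov = np.cov(X_centred, rowvar=False)   # 2x2 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns eigenvalues ascending

pc1 = eigvecs[:, -1]                    # principal component = top eigenvector
print(pc1)
```

For this data the covariance matrix is [[3.5, 4.4], [4.4, 6.8]], whose larger eigenvalue is about 9.85; the corresponding unit eigenvector (up to sign) is the principal component.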



4. (a) Suppose the weights of randomly selected American female college students
are normally distributed with unknown mean µ and standard deviation σ.
A random sample of 10 American female college students yielded the following
weights (in pounds) :
115 122 130 127 149 160 152 138 149 180
Based on the definitions given above, identify the likelihood function and
the maximum likelihood estimator of µ, the mean weight of all American
female college students. Using the given sample, find a maximum likelihood
estimate of µ as well. 6(CO3)
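For i.i.d. normal data the likelihood is L(µ) = Π N(x_i; µ, σ²), and setting its log-derivative to zero gives the sample mean as the maximum likelihood estimator of µ. The numerical estimate for this sample:

```python
# MLE of the mean of a normal distribution: the sample mean.
# Maximising sum of -(x_i - mu)^2 / (2 sigma^2) over mu gives mu_hat = x-bar.

weights = [115, 122, 130, 127, 149, 160, 152, 138, 149, 180]
mu_hat = sum(weights) / len(weights)
print(mu_hat)  # → 142.2
```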
(b) Explain the significance of the kernel trick in SVM. Write any TWO non-linear
kernel functions that are used in SVM. 4(CO2)
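Two standard non-linear SVM kernels, sketched directly as functions; the gamma, degree and coef0 values are illustrative defaults, not prescribed by the question.

```python
import numpy as np

# RBF (Gaussian) and polynomial kernels: both compute an inner product in a
# higher-dimensional feature space without mapping the points explicitly —
# this is the kernel trick.

def rbf_kernel(x, y, gamma=0.5):
    return np.exp(-gamma * np.sum((x - y) ** 2))

def polynomial_kernel(x, y, degree=2, coef0=1.0):
    return (np.dot(x, y) + coef0) ** degree

x, y = np.array([1.0, 2.0]), np.array([2.0, 0.0])
print(rbf_kernel(x, y), polynomial_kernel(x, y))
```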

5. (a) Given the likelihoods below, derive a decision rule based on the LRT
(assume equal priors) :
p(x | w1) = N(4, 1) and p(x | w2) = N(10, 1). 4(CO3)
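With equal priors and equal unit variances, the likelihood ratio p(x|w1)/p(x|w2) > 1 reduces to comparing x with the midpoint of the two means, (4 + 10)/2 = 7. A sketch of the resulting rule:

```python
from math import exp

# LRT with equal priors: decide w1 when p(x|w1)/p(x|w2) > 1.  For two
# unit-variance Gaussians N(4,1) and N(10,1) this reduces to: decide w1
# when x < 7, decide w2 when x > 7.

def gaussian_pdf(x, mean, var=1.0):
    # normalising constants are equal and cancel in the ratio
    return exp(-(x - mean) ** 2 / (2 * var))

def decide(x):
    return "w1" if gaussian_pdf(x, 4) > gaussian_pdf(x, 10) else "w2"

print(decide(5), decide(9))  # → w1 w2
```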
(b) What is the significance of PAC learning ? Prove that conjunctions of Boolean
literals are PAC learnable. 6(CO3)
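The heart of the proof is the sample-complexity bound for a finite, consistent hypothesis class, m ≥ (1/ε)(ln|H| + ln(1/δ)). For conjunctions over n Boolean variables |H| = 3^n (each variable appears positive, negated, or not at all), so m is polynomial in n, 1/ε and 1/δ. A sketch of the bound:

```python
from math import ceil, log

# PAC sample-complexity bound for a finite, consistent learner:
#   m >= (1/epsilon) * (ln|H| + ln(1/delta)).
# For conjunctions of n Boolean literals, |H| = 3^n, so ln|H| = n * ln 3
# grows only linearly in n — the crux of PAC learnability here.

def pac_sample_bound(n, epsilon, delta):
    hypothesis_count = 3 ** n
    return ceil((log(hypothesis_count) + log(1 / delta)) / epsilon)

print(pac_sample_bound(n=10, epsilon=0.1, delta=0.05))  # → 140
```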

6. (a) Consider the following transition matrix (rows give the current
purchase, columns the next purchase) :

                         Coke   Pepsi
        P  =   Coke   (   0.6     0.4   )
               Pepsi  (   0.3     0.8   )
Apply the Markov chain model to the above matrix and find :
(i) The probability that the person will buy Coke at the third purchase
from now, given that he is currently a Coke purchaser.
(ii) The probability that the person will buy Pepsi at the fourth purchase
from now, given that he is currently a Coke purchaser.
(iii) The probability that the person will buy Pepsi at the fourth purchase
from now, given that he is currently a Pepsi purchaser.
6(CO4)
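The n-step transition probabilities come from powers of P. Note that the Pepsi row as printed sums to 1.1, so the sketch below assumes (0.3, 0.7) to make each row a valid probability distribution; that assumption, not the paper, fixes the second row.

```python
import numpy as np

# n-step transition probabilities of a Markov chain are entries of P^n.
# State 0 = Coke, state 1 = Pepsi.  The Pepsi row is assumed to be
# (0.3, 0.7), since the printed (0.3, 0.8) does not sum to 1.

P = np.array([[0.6, 0.4],
              [0.3, 0.7]])

P3 = np.linalg.matrix_power(P, 3)   # third purchase from now
P4 = np.linalg.matrix_power(P, 4)   # fourth purchase from now

print(P3[0, 0])  # P(Coke at step 3  | currently Coke)
print(P4[0, 1])  # P(Pepsi at step 4 | currently Coke)
print(P4[1, 1])  # P(Pepsi at step 4 | currently Pepsi)
```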



(b) Cluster the following eight points (with (x, y) representing locations) into
three clusters :
A1(2, 10), A2(2, 5), A3(8, 4), A4(5, 8), A5(7, 5), A6(6, 4), A7(1, 2), A8(4, 9)
Initial cluster centers are : A1(2, 10), A4(5, 8) and A7(1, 2).
The distance function between two points a = (x1, y1) and b = (x2, y2)
is defined as —
P(a, b) = |x2 – x1| + |y2 – y1|
Use the K-Means algorithm to find the three cluster centers after the first
iteration. 4(CO4)
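One K-means iteration (assign each point to its nearest centre, then move each centre to its cluster mean) can be sketched with the Manhattan distance defined in the question:

```python
# One K-means iteration on the eight points from 6(b), using the Manhattan
# distance P(a, b) = |x2 - x1| + |y2 - y1| defined in the question.

points = [(2, 10), (2, 5), (8, 4), (5, 8), (7, 5), (6, 4), (1, 2), (4, 9)]
centres = [(2, 10), (5, 8), (1, 2)]   # initial centres: A1, A4, A7

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

# Assignment step: each point joins its nearest centre.
clusters = [[] for _ in centres]
for p in points:
    nearest = min(range(len(centres)), key=lambda i: manhattan(p, centres[i]))
    clusters[nearest].append(p)

# Update step: each centre moves to the mean of its cluster.
new_centres = [
    (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
    for c in clusters
]
print(new_centres)  # → [(2.0, 10.0), (6.0, 6.0), (1.5, 3.5)]
```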

