Assignment- Week 5
TYPE OF QUESTION: MCQ/MSQ
Number of questions: 10; Total marks: 10 X 2 = 20
_____________________________________________________________________________
QUESTION 1:
Suppose a fully-connected neural network has a single hidden layer with 30 nodes. The input is
represented by a 3D feature vector and we have a binary classification problem. Calculate the
number of parameters of the network. Consider there are NO bias nodes in the network.
a. 100
b. 120
c. 140
d. 125
Correct Answer: b
Detailed Solution:
With no bias nodes, the parameters are the weights alone: 3*30 (input to hidden) + 30*1 (hidden to output) = 90 + 30 = 120.
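As a quick sanity check, a minimal Python sketch (the helper count_params is illustrative, not part of the assignment):

    def count_params(layer_sizes):
        # Weights only, no biases: product of adjacent layer widths.
        return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

    print(count_params([3, 30, 1]))  # 3*30 + 30*1 = 120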
______________________________________________________________________________
QUESTION 2:
For a binary classification setting, if the probability of belonging to class = +1 is 0.22, what is the
probability of belonging to class = -1?
a. 0
b. 0.22
c. 0.78
d. -0.22
Correct Answer: c
Detailed Solution:
In the binary classification setting we keep a single output node which denotes the probability
p of belonging to class = +1. Since the two classes are mutually exclusive and exhaustive, the
probability of belonging to class = -1 is (1 - p) = 1 - 0.22 = 0.78.
______________________________________________________________________________
QUESTION 3:
Input to SoftMax activation function is [2,4,6]. What will be the output?
a. [0.11, 0.78, 0.11]
b. [0.016, 0.120, 0.864]
c. [0.045, 0.910, 0.045]
d. [0.21, 0.58, 0.21]
Correct Answer: b
Detailed Solution:
SoftMax(x_i) = exp(x_i) / sum_j exp(x_j). With inputs [2, 4, 6]: e^2 ≈ 7.389, e^4 ≈ 54.598,
e^6 ≈ 403.429, and their sum ≈ 465.416, giving outputs ≈ [0.016, 0.117, 0.867], i.e. option b.
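For reference, a minimal NumPy sketch of the same computation:

    import numpy as np

    def softmax(x):
        e = np.exp(x - np.max(x))  # subtract the max for numerical stability
        return e / e.sum()

    print(softmax(np.array([2.0, 4.0, 6.0])))  # ~[0.016 0.117 0.867]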
______________________________________________________________________________
QUESTION 4:
A 3-input neuron has weights 1, 0.5, 2. The transfer function is linear, with the constant of
proportionality being equal to 2. The inputs are 2, 20, 4 respectively. The output will be:
a. 40
b. 20
c. 80
d. 10
Correct Answer: a
Detailed Solution:
To find the output, multiply each weight by its respective input, sum the results, and then
scale the sum by the constant of proportionality of the linear transfer function.
Thus, output = 2*(1*2 + 0.5*20 + 2*4) = 2*20 = 40
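A one-line NumPy check with the values from the question:

    import numpy as np

    w = np.array([1.0, 0.5, 2.0])   # weights
    x = np.array([2.0, 20.0, 4.0])  # inputs
    k = 2.0                         # constant of proportionality
    print(k * np.dot(w, x))         # 40.0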
______________________________________________________________________
QUESTION 5:
Which one of the following activation functions is NOT analytically differentiable for all real
values of the given input?
a. Sigmoid
b. Tanh
c. ReLU
d. None of the above
Correct Answer: c
Detailed Solution:
ReLU(x) = max(0, x) is not differentiable at x = 0, where the left derivative is 0 and the right
derivative is 1. Sigmoid and tanh are smooth for all real inputs.
______________________________________________________________________________
QUESTION 6:
Detailed Solution:
In the above figure, when either i1 or i2 is 1, the output is 0. When both i1 and i2 are 0, the
output is 1, and when both are 1, the output is 0. This is NOR logic.
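The figure itself is not reproduced here, but a single neuron with a step activation can
implement NOR; a sketch assuming weights w1 = w2 = -1 and bias 0.5 (one standard choice, not
necessarily the figure's exact values):

    def nor_neuron(i1, i2):
        # step(0.5 - i1 - i2): fires only when both inputs are 0
        return 1 if (0.5 - i1 - i2) > 0 else 0

    for i1 in (0, 1):
        for i2 in (0, 1):
            print(i1, i2, nor_neuron(i1, i2))  # output 1 only for (0, 0)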
______________________________________________________________________________
QUESTION 7:
A simple MLP model has 10 neurons in the input layer, 100 neurons in the hidden layer, and
1 neuron in the output layer. What are the sizes of the weight matrices between the hidden and
output layers, and between the input and hidden layers?
a. [10x1] , [100 X 2]
b. [100x1] , [ 10 X 1]
c. [100 X 10], [10 x 1]
d. [10 x 1] , [100 X 10]
Correct Answer: d
Detailed Solution:
The size of the weight matrix between any layer 1 and layer 2 is given by [nodes in layer 1 X
nodes in layer 2].
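A minimal NumPy sketch of this convention (zero weights used purely to show shapes):

    import numpy as np

    x = np.zeros((1, 10))             # one sample with 10 input features
    W_in_hid = np.zeros((10, 100))    # [nodes in input X nodes in hidden]
    W_hid_out = np.zeros((100, 1))    # [nodes in hidden X nodes in output]
    h = x @ W_in_hid                  # hidden activations, shape (1, 100)
    y = h @ W_hid_out                 # output, shape (1, 1)
    print(h.shape, y.shape)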
______________________________________________________________________________
QUESTION 8:
Consider a fully connected neural network with an input layer, one hidden layer, and an output
layer having 40, 2, and 1 nodes respectively. What is the total number of learnable parameters
(no biases)?
a. 2
b. 82
c. 80
d. 40
Correct Answer: b
Detailed Solution:
The learnable parameters are the weights and biases; here there are no bias nodes. In a fully
connected network each node in one layer is connected to every node in the next layer, so the
number of weights = 40*2 + 2*1 = 80 + 2 = 82.
______________________________________________________________________________
QUESTION 9:
You want to build a 10-class neural network classifier: given a cat image, you want to classify
which of the 10 cat breeds it belongs to. Which among the 4 options would be an appropriate
loss function to use for this task?
Correct Answer: a
Detailed Solution:
Out of the given options, Cross Entropy Loss is well suited for classification problems, which is
the end task given in the question.
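For illustration, the cross-entropy loss for a single sample with a 10-class softmax output (the
example values below are hypothetical, not from the question):

    import numpy as np

    def cross_entropy(probs, label):
        # probs: predicted class probabilities; label: true class index
        return -np.log(probs[label])

    probs = np.full(10, 0.1)        # a uniform prediction over 10 breeds
    print(cross_entropy(probs, 3))  # ln(10) ≈ 2.303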
______________________________________________________________________________
QUESTION 10:
Correct Answer: b
Detailed Solution:
______________________________________________________________________________
************END************