
Deep Learning

Assignment- Week 5
TYPE OF QUESTION: MCQ/MSQ
Number of questions: 10 Total mark: 10 X 2 = 20

_____________________________________________________________________________

QUESTION 1:
Suppose a fully-connected neural network has a single hidden layer with 30 nodes. The input is
represented by a 3D feature vector and we have a binary classification problem. Calculate the
number of parameters of the network. Consider there are NO bias nodes in the network.

a. 100
b. 120
c. 140
d. 125

Correct Answer: b

Detailed Solution:

Number of parameters = (3 * 30) + (30 * 1) = 120
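This count can be verified with a short script; the following is a minimal Python sketch (the helper name count_params is illustrative, not part of the assignment):

```python
def count_params(layer_sizes):
    # Each consecutive pair of layers contributes fan_in * fan_out weights
    # (no bias nodes, so only weights are counted).
    return sum(n_in * n_out for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# 3-D input, one hidden layer of 30 nodes, one output node for binary classification.
print(count_params([3, 30, 1]))  # 120
```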

--------------------------------------------------------------------------------------------------------------------

QUESTION 2:
For a binary classification setting, if the probability of belonging to class = +1 is 0.22, what is the
probability of belonging to class = -1?

a. 0
b. 0.22
c. 0.78
d. -0.22

Correct Answer: c

Detailed Solution:

In the binary classification setting we keep a single output node, which denotes the probability
p of belonging to class = +1. Since the two classes are mutually exclusive, the probability of
belonging to class = -1 is 1 - p = 1 - 0.22 = 0.78.
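As a quick sanity check, a single sigmoid output node yields both probabilities at once; a small sketch follows (the logit value is made up so that p ≈ 0.22):

```python
import math

logit = -1.266                           # hypothetical pre-activation chosen so that p ≈ 0.22
p_pos = 1.0 / (1.0 + math.exp(-logit))   # sigmoid output = P(class = +1)
p_neg = 1.0 - p_pos                      # classes are mutually exclusive
print(round(p_pos, 2), round(p_neg, 2))  # 0.22 0.78
```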
______________________________________________________________________________

QUESTION 3:
Input to SoftMax activation function is [2,4,6]. What will be the output?

a. [0.11,0.78,0.11]
b. [0.016,0.120, 0.864]
c. [0.045,0.910,0.045]
d. [0.21, 0.58,0.21]

Correct Answer: b

Detailed Solution:

Apply the definition of softmax: softmax(z)_i = exp(z_i) / Σ_j exp(z_j). For the inputs [2, 4, 6]
this gives approximately [0.016, 0.117, 0.867], which corresponds to option (b).
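A quick numerical check of that computation (NumPy sketch; subtracting the maximum is only for numerical stability and does not change the result):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))  # subtract max for numerical stability
    return e / e.sum()

print(softmax(np.array([2.0, 4.0, 6.0])))  # ≈ [0.0159 0.1173 0.8668] -> option (b)
```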

______________________________________________________________________________

QUESTION 4:
A 3-input neuron has weights 1, 0.5, 2. The transfer function is linear, with the constant of
proportionality being equal to 2. The inputs are 2, 20, 4 respectively. The output will be:
a. 40
b. 20
c. 80
d. 10
Correct Answer: a

Detailed Solution:

To find the output, multiply each weight by its respective input, add the results, and then apply
the linear transfer function, i.e. scale the sum by the constant of proportionality 2.
Thus, output = 2 * (1*2 + 0.5*20 + 2*4) = 2 * 20 = 40
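The same computation as a short NumPy check (values taken directly from the question):

```python
import numpy as np

weights = np.array([1.0, 0.5, 2.0])
inputs = np.array([2.0, 20.0, 4.0])
k = 2.0  # constant of proportionality of the linear transfer function

print(k * np.dot(weights, inputs))  # 40.0
```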
______________________________________________________________________

QUESTION 5:
Which one of the following activation functions is NOT analytically differentiable for all real
values of the given input?

a. Sigmoid
b. Tanh
c. ReLU
d. None of the above
Correct Answer: c

Detailed Solution:

ReLU(x) is not differentiable at x = 0, where x is the input to the ReLU activation: the derivative is 0 for x < 0 and 1 for x > 0, and the two one-sided limits disagree at x = 0.
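A small numerical illustration of this (not part of the original solution): the one-sided slope estimates of ReLU at x = 0 disagree, so no single derivative exists there.

```python
def relu(x):
    return max(0.0, x)

h = 1e-6
left = (relu(0.0) - relu(-h)) / h   # slope approaching 0 from the left  -> 0.0
right = (relu(h) - relu(0.0)) / h   # slope approaching 0 from the right -> 1.0
print(left, right)                  # 0.0 1.0: the two limits differ at x = 0
```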

______________________________________________________________________________

QUESTION 6:

Which function does the following perceptron realize?


a. NAND
b. NOR
c. AND
d. OR
Correct Answer: b

Detailed Solution:

In the above figure, when either i1 or i2 is 1, the output is 0; when both i1 and i2 are 0, the
output is 1; and when both i1 and i2 are 1, the output is 0. This is NOR logic.
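The perceptron figure is not reproduced here, so the weights below are assumed values that realize the same truth table (w1 = w2 = -1 with a bias of 0.5 and a step activation); they are illustrative only.

```python
def perceptron(i1, i2, w1=-1.0, w2=-1.0, bias=0.5):
    s = w1 * i1 + w2 * i2 + bias
    return 1 if s > 0 else 0  # step activation

for i1 in (0, 1):
    for i2 in (0, 1):
        print(i1, i2, perceptron(i1, i2))  # output is 1 only for (0, 0), i.e. NOR
```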

______________________________________________________________________________

QUESTION 7:

Consider a simple MLP model with 10 neurons in the input layer, 100 neurons in the hidden layer,
and 1 neuron in the output layer. What are the sizes of the weight matrices between the hidden and
output layers, and between the input and hidden layers?
a. [10x1] , [100 X 2]
b. [100x1] , [ 10 X 1]
c. [100 X 10], [10 x 1]
d. [10 x 1] , [100 X 10]
Correct Answer: d

Detailed Solution:

The size of the weight matrix between any layer 1 and layer 2 is given by [nodes in layer 1 X nodes
in layer 2].
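To illustrate that rule with made-up layer sizes (deliberately not the ones in the question): a 3-node layer feeding a 5-node layer has a 3 x 5 weight matrix, as the sketch below shows; note that many frameworks store the transposed shape internally.

```python
import numpy as np

layer1_nodes, layer2_nodes = 3, 5            # hypothetical sizes, for illustration only
W = np.zeros((layer1_nodes, layer2_nodes))   # [nodes in layer 1 x nodes in layer 2]

x = np.ones(layer1_nodes)                    # one activation per node of layer 1
print((x @ W).shape)                         # (5,) -> one pre-activation per node of layer 2
```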

______________________________________________________________________________

QUESTION 8:
Consider a fully connected neural network with an input layer, one hidden layer, and an output
layer, with 40, 2, and 1 nodes respectively in each layer. What is the total number of learnable
parameters (no biases)?

a. 2
b. 82
c. 80
d. 40

Correct Answer: b

Detailed Solution:

The learnable parameters of a network are its weights and biases; since there are no bias nodes
here, only the weights are counted. In a fully connected network every node in one layer connects
to every node in the next layer, so the total is (40 * 2) + (2 * 1) = 82.

______________________________________________________________________________

QUESTION 9:
You want to build a 10-class neural network classifier: given a cat image, you want to classify
which of the 10 cat breeds it belongs to. Which among the 4 options would be an appropriate
loss function to use for this task?

a. Cross Entropy Loss
b. MSE Loss
c. SSIM Loss
d. None of the above

Correct Answer: a

Detailed Solution:

Out of the given options, Cross-Entropy Loss is the one well suited to classification problems,
which is the end task given in the question.
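For reference, a minimal PyTorch-style sketch of using cross-entropy loss for a 10-class classifier (batch size and tensor names are illustrative):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()     # combines log-softmax and negative log-likelihood

logits = torch.randn(8, 10)           # batch of 8 images, one raw score per cat breed
targets = torch.randint(0, 10, (8,))  # ground-truth breed index for each image

print(criterion(logits, targets).item())
```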
______________________________________________________________________________

QUESTION 10:

Suppose you have a fully-connected neural network with 5 hidden layers, each with 10 hidden
units. The input is 20-dimensional and the output is a scalar. What is the total number of
trainable parameters in your network? There is no bias.

a. (20+1)*10 + (10+1)*10*4 + (10+1)*1
b. (20)*10 + (10)*10*4 + (10)*1
c. (20)*10 + (10)*10*5 + (10)*1
d. (20+1)*10 + (10+1)*10*5 + (10+1)*1

Correct Answer: b

Detailed Solution:

With no bias terms, the parameter count is the sum of the weight-matrix sizes: (20*10) from the
input to the first hidden layer, (10*10)*4 for the four remaining hidden-to-hidden connections, and
(10*1) from the last hidden layer to the scalar output, giving 200 + 400 + 10 = 610. This is
exactly option (b).

______________________________________________________________________________

************END*******
