Deep Learning - IIT Ropar - Unit 14 - Week 11

NPTEL (https://swayam.gov.in/explorer?ncCode=NPTEL) » Deep Learning - IIT Ropar (course)
Week 11 : Assignment 11

The due date for submitting this assignment has passed.
Due on 2024-10-09, 23:59 IST.
Assignment submitted on 2024-09-28, 01:58 IST.
1) For which of the following problems are RNNs suitable? (1 point)

Generating a description from a given image
Forecasting the weather for the next N days based on historical weather data
Converting a speech waveform into text
Identifying all objects in a given image

Partially Correct.
Score: 0.67
Accepted Answers:
Generating a description from a given image
Forecasting the weather for the next N days based on historical weather data
Converting a speech waveform into text
2) What is the basic concept of a Recurrent Neural Network? (1 point)

Use a loop between inputs and outputs in order to achieve better predictions
Use recurrent features from the dataset to find the best answers
Use loops between the most important features to predict the next output
Use previous inputs to find the next output according to the training set

Yes, the answer is correct.
Score: 1
Accepted Answers:
Use previous inputs to find the next output according to the training set
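A minimal sketch of that idea (weight names W, U, V and shapes are illustrative, not from the course): the state carries information from previous inputs forward, and each output is computed from that state.

```python
import numpy as np

# Minimal vanilla-RNN step, a sketch of "use previous inputs to find the
# next output": the new state s_t mixes the current input x_t with the
# previous state s_prev, and the output is read off the state.
def rnn_step(x_t, s_prev, W, U, V):
    s_t = np.tanh(W @ x_t + U @ s_prev)  # state carries the past forward
    y_t = V @ s_t                        # output (logits) at this time step
    return s_t, y_t
```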
3) Select the true statements about BPTT. (1 point)

The gradients of the loss with respect to the parameters are added across time steps
The gradients of the loss with respect to the parameters are subtracted across time steps
The gradient may vanish or explode, in general, if the number of time steps is too large
The gradient may vanish or explode if the number of time steps is too small

Yes, the answer is correct.
Score: 1
Accepted Answers:
The gradients of the loss with respect to the parameters are added across time steps
The gradient may vanish or explode, in general, if the number of time steps is too large
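A toy illustration of the first accepted statement (a linear RNN with a quadratic loss; all names and values are illustrative): because the same weight matrix W is reused at every step, BPTT accumulates its gradient as a sum over time steps.

```python
import numpy as np

# Toy BPTT sketch: for a linear RNN s_t = W s_{t-1} + x_t with loss
# L = sum_t 0.5 * ||s_t||^2, the gradient dL/dW is a SUM of per-time-step
# terms -- gradients are added, not subtracted, across time steps.
rng = np.random.default_rng(0)
T, d = 5, 3
W = rng.normal(size=(d, d)) * 0.5
xs = rng.normal(size=(T, d))

# forward pass, storing all states
states = [np.zeros(d)]
for t in range(T):
    states.append(W @ states[-1] + xs[t])

# backward pass: accumulate dL/dW across time steps
dW = np.zeros_like(W)
ds = np.zeros(d)                       # gradient flowing back from the future
for t in range(T, 0, -1):
    ds = ds + states[t]                # dL/ds_t from the loss at step t
    dW += np.outer(ds, states[t - 1])  # added across steps, one term each
    ds = W.T @ ds                      # propagate to s_{t-1}
```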
4) What is the main advantage of using GRUs over traditional RNNs? (1 point)

They are simpler to implement
They solve the vanishing gradient problem
They require less computational power
They can handle non-sequential data

Yes, the answer is correct.
Score: 1
Accepted Answers:
They solve the vanishing gradient problem
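A minimal GRU cell sketch (standard gate equations; weight names are illustrative, and the sign convention on the update gate varies between references): the gated, near-additive state update is what lets gradients survive across many steps.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Minimal GRU cell. The update gate z interpolates between the old state
# and the candidate, giving a near-additive path for gradients -- this is
# how GRUs mitigate the vanishing gradient problem of plain RNNs.
def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    z = sigmoid(Wz @ x + Uz @ h_prev)              # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate state
    return (1 - z) * h_prev + z * h_tilde          # z near 0 copies old state
```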
5) We construct an RNN for sentiment classification of text, where a text can have positive or negative sentiment. Suppose the dimension of the one-hot encoded words is R^(100×1) and the dimension of the state vector s_i is R^(50×1). What is the total number of parameters in the network? (Do not include biases.) (1 point)

7500

No, the answer is incorrect.
Score: 0
Accepted Answers:
(Type: Range) 7599.5, 7601.5
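The accepted value 7600 corresponds to three weight matrices; the two-unit softmax output layer is an assumption about the intended architecture, but it is the one that matches the accepted range:

```python
# Parameter count for Q5 (a sketch of the intended arithmetic; the 2-unit
# output layer for positive/negative sentiment is an assumption).
input_dim, state_dim, output_dim = 100, 50, 2
W = state_dim * input_dim   # input-to-state weights: 50 x 100 = 5000
U = state_dim * state_dim   # state-to-state weights: 50 x 50  = 2500
V = output_dim * state_dim  # state-to-output weights: 2 x 50  = 100
print(W + U + V)            # 7600, inside the accepted range 7599.5-7601.5
```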
6) Arrange the following operations in the order they are performed by an LSTM at time step t: [Selectively read, Selectively write, Selectively forget] (1 point)

Selectively read, Selectively write, Selectively forget
Selectively write, Selectively read, Selectively forget
Selectively read, Selectively forget, Selectively write
Selectively forget, Selectively write, Selectively read

No, the answer is incorrect.
Score: 0
Accepted Answers:
Selectively read, Selectively forget, Selectively write
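One way to line the standard LSTM equations up with the course's read/forget/write vocabulary (weight names are illustrative, and this mapping is an interpretation of the whiteboard analogy, not the official solution):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# LSTM step annotated with the read -> forget -> write order from Q6.
# s is the cell state, h the output state; weight names are illustrative.
def lstm_step(x, h_prev, s_prev, Wi, Ui, Wf, Uf, Wo, Uo, Wc, Uc):
    s_tilde = np.tanh(Wc @ x + Uc @ h_prev)  # candidate information
    i = sigmoid(Wi @ x + Ui @ h_prev)        # input gate
    f = sigmoid(Wf @ x + Uf @ h_prev)        # forget gate
    o = sigmoid(Wo @ x + Uo @ h_prev)        # output gate
    read = i * s_tilde                       # 1) selectively read new input
    s = f * s_prev + read                    # 2) selectively forget old state
    h = o * np.tanh(s)                       # 3) selectively write the output
    return h, s
```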
7) What are the problems in the RNN architecture? (1 point)

Morphing of information stored at each time step.
Exploding and vanishing gradient problem.
Errors caused at time step t_n can't be related to faraway previous time steps.
All of the above

Yes, the answer is correct.
Score: 1
Accepted Answers:
All of the above
8) We are given an RNN where the maximum eigenvalue λ of the weight matrix is 0.9, and the activation function used in the RNN is the logistic/sigmoid. What can we say about ∇ = ‖ ∂s_20 / ∂s_1 ‖? (1 point)

Value of ∇ is close to 0.
Value of ∇ is very high.
Value of ∇ is 3.5.
Insufficient information to say anything.

Yes, the answer is correct.
Score: 1
Accepted Answers:
Value of ∇ is close to 0.
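A numeric sketch of the standard bound behind this answer (following the usual vanishing-gradient argument, e.g. Pascanu et al., 2013): each Jacobian factor in the chain from s_1 to s_20 has norm at most λ · max|σ′|, and the sigmoid's derivative never exceeds 1/4.

```python
# Each factor satisfies ||ds_t / ds_{t-1}|| <= lambda_max * max|sigma'|,
# so the product over 19 steps is bounded by (0.9 * 0.25)^19.
lam = 0.9     # largest eigenvalue of the recurrent weight matrix
gamma = 0.25  # maximum derivative of the logistic sigmoid
steps = 19    # number of Jacobian factors between s_1 and s_20
bound = (lam * gamma) ** steps
print(f"||ds_20/ds_1|| <= {bound:.2e}")  # ~4.9e-13, i.e. close to 0
```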
9) What is the objective (loss) function in the RNN? (1 point)

Cross entropy
Sum of cross-entropy
Squared error
Accuracy

No, the answer is incorrect.
Score: 0
Accepted Answers:
Sum of cross-entropy
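A minimal sketch of the accepted objective (shapes and names illustrative): the per-time-step cross-entropies are summed over the sequence.

```python
import numpy as np

# Sequence loss from Q9: the total objective is the SUM over time steps of
# the cross-entropy between the true distribution y_t (one-hot rows) and
# the predicted distribution yhat_t (softmax rows).
def sequence_cross_entropy(y_true, y_pred):
    """y_true, y_pred: arrays of shape (T, V); rows of y_pred sum to 1."""
    per_step = -np.sum(y_true * np.log(y_pred), axis=1)  # CE at each step t
    return per_step.sum()                                # summed across steps
```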
10) Which of the following techniques can be used to address the exploding gradient problem in RNNs? (1 point)

Gradient clipping
Dropout
L1 regularization
L2 regularization

Yes, the answer is correct.
Score: 1
Accepted Answers:
Gradient clipping
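A minimal sketch of gradient clipping by global norm (the threshold value and names are illustrative): when the combined gradient norm exceeds a threshold, the gradient is rescaled so its norm equals the threshold, which bounds the size of any single update.

```python
import numpy as np

# Gradient clipping by global norm, the accepted fix for exploding
# gradients in Q10: rescale the whole gradient if its norm is too large.
def clip_by_global_norm(grads, threshold=5.0):
    norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if norm > threshold:
        grads = [g * (threshold / norm) for g in grads]
    return grads
```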
