AD3501 Deep Learning Course Plan
Definition:
Deep learning is a branch of machine learning that uses artificial neural networks with many layers to learn hierarchical representations of data directly from examples. Rather than relying on hand-engineered features, a deep network learns increasingly abstract features layer by layer, which has made it the dominant approach for tasks such as image recognition, speech processing, and natural language processing.
OBJECTIVES:
To understand the need for and principles of deep neural networks.
To understand CNN and RNN architectures of deep neural networks.
To comprehend advanced deep learning models.
To learn the evaluation metrics for deep learning models.
OUTCOMES:
Upon completion of this course, the students will be able to
CO 1: Explain the basics of deep neural networks.
CO 2: Apply Convolutional Neural Networks for image processing.
CO 3: Apply Recurrent Neural Networks and their variants for text analysis.
CO 4: Apply model evaluation for various applications.
CO 5: Apply autoencoders and generative models for suitable applications.
SYLLABUS
AD3501 – Deep Learning
UNIT I DEEP NETWORKS BASICS
Linear Algebra: Scalars – Vectors – Matrices and tensors; Probability Distributions – Gradient-based Optimization – Machine Learning Basics: Capacity – Overfitting and underfitting – Hyperparameters and validation sets – Estimators – Bias and variance – Stochastic gradient descent – Challenges motivating deep learning; Deep Networks: Deep feedforward networks; Regularization – Optimization.
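The gradient-based optimization, stochastic gradient descent, and deep feedforward network topics above can be tied together in one small example. The following is an illustrative sketch only, assuming NumPy; the network size, learning rate, batch size, and synthetic data are assumptions made here, not values prescribed by the syllabus or textbooks.

# A minimal illustrative sketch (not prescribed course material): mini-batch
# stochastic gradient descent on a one-hidden-layer feedforward network,
# using only NumPy. Sizes, learning rate, and data are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = sin(x) plus noise.
X = rng.uniform(-3, 3, size=(256, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

hidden_dim, lr = 16, 0.05
W1 = rng.normal(scale=0.5, size=(1, hidden_dim))
b1 = np.zeros(hidden_dim)
W2 = rng.normal(scale=0.5, size=(hidden_dim, 1))
b2 = np.zeros(1)

for epoch in range(200):
    order = rng.permutation(len(X))          # shuffle once per epoch
    for start in range(0, len(X), 32):       # mini-batches of 32
        idx = order[start:start + 32]
        xb, yb = X[idx], y[idx]

        # Forward pass: affine -> tanh -> affine (mean squared error loss).
        h = np.tanh(xb @ W1 + b1)
        err = (h @ W2 + b2) - yb

        # Backward pass: gradients via the chain rule.
        gW2 = h.T @ err / len(xb)
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)   # derivative of tanh
        gW1 = xb.T @ dh / len(xb)
        gb1 = dh.mean(axis=0)

        # Stochastic gradient descent update.
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE:", float(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2)))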
UNIT II CONVOLUTIONAL NEURAL NETWORKS
Convolution Operation – Sparse Interactions – Parameter Sharing – Equivariance – Pooling – Convolution Variants: Strided, Tiled, Transposed and dilated convolutions; CNN Learning: Nonlinearity Functions – Loss Functions – Regularization – Optimizers – Gradient Computation.
UNIT III RECURRENT NEURAL NETWORKS
Unfolding Graphs – RNN Design Patterns: Acceptor – Encoder – Transducer; Gradient Computation – Sequence Modeling Conditioned on Contexts – Bidirectional RNN – Sequence to Sequence RNN – Deep Recurrent Networks – Recursive Neural Networks – Long Term Dependencies; Leaky Units: Skip connections and dropouts; Gated Architecture: LSTM.
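Unit III closes with the gated architecture (LSTM). The following is an illustrative sketch only, assuming NumPy, of a single LSTM cell unrolled over a short sequence; the stacked gate layout, the dimensions, and names such as lstm_step are assumptions made for the sketch, not the notation of the listed textbooks.

# A minimal illustrative sketch of one LSTM cell (the gated architecture named
# above) unrolled over a short sequence, using only NumPy. The weight layout
# (four gate blocks stacked in W, U, b) and all sizes are arbitrary choices.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    # x: input (d,); h_prev, c_prev: previous hidden/cell state (n,).
    # W: (4n, d), U: (4n, n), b: (4n,) hold the input, forget, output and
    # candidate parameters stacked row-wise.
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0 * n:1 * n])        # input gate
    f = sigmoid(z[1 * n:2 * n])        # forget gate
    o = sigmoid(z[2 * n:3 * n])        # output gate
    g = np.tanh(z[3 * n:4 * n])        # candidate cell update
    c = f * c_prev + i * g             # gated cell-state update
    h = o * np.tanh(c)                 # new hidden state
    return h, c

rng = np.random.default_rng(0)
d, n, T = 3, 5, 4                      # input size, state size, sequence length
W = rng.normal(size=(4 * n, d))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)

h, c = np.zeros(n), np.zeros(n)
for t in range(T):                     # unfold the recurrence over time
    h, c = lstm_step(rng.normal(size=d), h, c, W, U, b)
print("hidden state after", T, "steps:", h)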
45 PERIODS
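The convolution operation, sparse interactions, and strided variants listed under Unit II above can likewise be shown concretely. The following is an illustrative sketch only, assuming NumPy, of a single-channel 2-D convolution (computed, as most libraries do, as cross-correlation); the function name conv2d, the kernel, and the array sizes are assumptions made for the sketch.

# A minimal illustrative sketch of the 2-D convolution operation (implemented,
# as deep learning libraries usually do, as cross-correlation) with a stride
# option; single channel only. The name conv2d and all array sizes are
# arbitrary choices, not an API from the listed textbooks.
import numpy as np

def conv2d(image, kernel, stride=1):
    H, W = image.shape
    kH, kW = kernel.shape
    out_h = (H - kH) // stride + 1     # "valid" output size
    out_w = (W - kW) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i * stride:i * stride + kH,
                          j * stride:j * stride + kW]
            out[i, j] = np.sum(patch * kernel)   # shared weights, local (sparse) interactions
    return out

image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[1.0, -1.0],
                   [1.0, -1.0]])       # a tiny vertical-edge filter
print(conv2d(image, kernel))           # 5 x 5 feature map
print(conv2d(image, kernel, stride=2)) # strided variant: 3 x 3 feature map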
TEXT BOOKS:
1. Ian Goodfellow, Yoshua Bengio, Aaron Courville, "Deep Learning", MIT Press, 2016.
2. Andrew Glassner, "Deep Learning: A Visual Approach", No Starch Press, 2021.
REFERENCES:
1. Salman Khan, Hossein Rahmani, Syed Afaq Ali Shah, Mohammed Bennamoun, "A Guide to Convolutional Neural Networks for Computer Vision", Synthesis Lectures on Computer Vision, Morgan & Claypool Publishers, 2018.
2. Yoav Goldberg, "Neural Network Methods for Natural Language Processing", Synthesis Lectures on Human Language Technologies, Morgan & Claypool Publishers, 2017.
3. Francois Chollet, "Deep Learning with Python", Manning Publications Co., 2018.
4. Charu C. Aggarwal, "Neural Networks and Deep Learning: A Textbook", Springer International Publishing, 2018.
Online Resources
OS1: https://www.youtube.com/watch?v=VyWAvY2CF9c
OS2: https://www.youtube.com/watch?v=i_LwzRVP7bg
OS3: https://www.youtube.com/watch?v=JhQqquVeCE0
S. No | Topic to be Covered | Proposed Date | Actual Date | Cognitive Level | Use of Teaching Tool | Teaching Methodology | Reference Material
Unit I – DEEP NETWORKS BASICS
1 | Linear Algebra: Scalars – Vectors – Matrices and tensors | | | Understand | Chalk and Board | Lecture | T1
2 | Probability Distributions | | | Understand | Chalk and Board | Lecture | T1
3 | Gradient-based Optimization – Machine Learning Basics | | | Remember | Chalk and Board | Lecture | T1, T2
4 | Capacity – Overfitting and underfitting | | | Understand | Chalk and Board | Lecture | T1
5 | Hyperparameters and validation sets | | | Understand | Chalk and Board | Lecture | T1
6 | Estimators – Bias and variance – Stochastic gradient descent | | | Understand | Visual Aids/PPT | Demonstration | Online Source
7 | Challenges motivating deep learning | | | Understand | Visual Aids | Demonstration | Online Source
8 | Deep Networks: Deep feedforward networks | | | Remember | Chalk and Board | Lecture | T1
9 | Regularization – Optimization | | | Understand | Chalk and Board | Lecture | T1
Unit II – CONVOLUTIONAL NEURAL NETWORKS
11 | Convolution Operation – Sparse Interactions | | | Understand | Chalk and Board | Lecture | T1
12 | Parameter Sharing – Equivariance | | | Understand | Visual Aids/PPT | Demonstration | Online Source
13 | Pooling – Convolution Variants: Strided | | | Understand | Chalk and Board | Lecture | T1
14 | Tiled – Transposed and dilated convolutions | | | Understand | Chalk and Board | Lecture | T1
15 | CNN Learning: Nonlinearity Functions | | | Understand | Chalk and Board | Lecture | T1
16 | Loss Functions | | | Understand | Visual Aids | Demonstration | Online Source
17 | Regularization – Optimizers | | | Understand | Chalk and Board | Lecture | T1
18 | Gradient Computation | | | Understand | Chalk and Board | Lecture | T1, T2
Unit III – RECURRENT NEURAL NETWORKS
19 | Unfolding Graphs – RNN Design Patterns | | | Understand | Visual Aids | Demonstration | Online Source
20 | Acceptor – Encoder – Transducer | | | Understand | | Flipped Class | Online Source
21 | Gradient Computation – Sequence Modeling Conditioned on Contexts | | | Apply | Visual Aids/PPT | Demonstration | Online Source
22 | Bidirectional RNN – Sequence to Sequence RNN | | | Understand | Chalk and Board | Lecture | T1
23 | Deep Recurrent Networks | | | Understand | Chalk and Board | Lecture | T1
24 | Recursive Neural Networks | | | Understand | Chalk and Board | Lecture | T1, R2
25 | Long Term Dependencies; Leaky Units | | | Understand | | Fish Bowl |
26 | Skip connections and dropouts | | | Apply | In lab | Demonstration |
27 | Gated Architecture: LSTM | | | Apply | In lab | Demonstration |