COMP3308/3608 Artificial Intelligence
Week 9 Tutorial Exercises
Multilayer Neural Networks 2. Deep Learning
is 1 - a^2.
b) Write the weight change rules for this function, using the result from a). See slide 26 from the
lecture on backpropagation:
Δw_ji = ?   (input-to-hidden weights)
Δw_kj = ?   (hidden-to-output weights)
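For cross-checking your answer to b), the following Python sketch runs one backpropagation update for a small tanh network. The 2-3-1 architecture, learning rate, and training example are made-up illustrations, not part of the exercise.

import numpy as np

# One forward pass and one backpropagation update with tanh units,
# using the result from a): d tanh(net)/d net = 1 - a^2.
rng = np.random.default_rng(0)
eta = 0.1                               # learning rate (assumed value)
x = np.array([0.5, -0.2])               # one training input (assumed)
t = np.array([0.8])                     # its target output (assumed)

W1 = rng.normal(size=(3, 2))            # input -> hidden weights w_ji
b1 = np.zeros(3)
W2 = rng.normal(size=(1, 3))            # hidden -> output weights w_kj
b2 = np.zeros(1)

# Forward pass
a1 = np.tanh(W1 @ x + b1)               # hidden activations
a2 = np.tanh(W2 @ a1 + b2)              # output activations

# Backward pass: both deltas use the tanh derivative 1 - a^2
delta_out = (t - a2) * (1 - a2**2)            # output-layer delta
delta_hid = (W2.T @ delta_out) * (1 - a1**2)  # hidden-layer delta

# Weight changes: Delta w = eta * delta * (input to that layer)
W2 += eta * np.outer(delta_out, a1)
b2 += eta * delta_out
W1 += eta * np.outer(delta_hid, x)
b1 += eta * delta_hid

Note that both deltas reuse the stored activation value a through the derivative 1 - a^2, so no extra quantities from the forward pass are needed.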
Exercise 2.
a) How many weights, including biases, does a 2-layer feed-forward neural network with 10 input
units, 5 hidden units and 3 output units contain? Show your work. (A counting sketch follows this exercise.)
c) Is the backpropagation algorithm guaranteed to achieve 100% correct classification for any linearly
separable set of training examples, given a sufficiently small learning rate? Explain briefly.
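To check your count in a), here is a small Python helper; the function name and the layer-size list format are my own choices, not from the tutorial.

def count_parameters(layer_sizes):
    """Count weights and biases in a fully-connected feed-forward net.

    layer_sizes, e.g. [10, 5, 3], lists the units per layer from input
    to output. Each layer after the input has one weight per incoming
    connection plus one bias per unit.
    """
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out   # weights + biases
    return total

print(count_parameters([10, 5, 3]))     # 10*5+5 + 5*3+3 = 73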
Exercise 3. Cybenko’s theorem (slide 39) states that any continuous function can be approximated to
arbitrary accuracy by a backpropagation network with 1 hidden layer, given enough hidden units. Why,
then, do we use networks with more than 1 hidden layer?
Which classifier was more accurate? Which one was faster to train?
Try different starting positions and different learning rates. What is the influence of the learning rate? What
happens if the learning rate is too big or too small?
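The same behaviour can be reproduced outside Matlab. Below is a minimal Python sketch of gradient descent on a one-dimensional quadratic; the loss function, starting point, and rates are assumed purely for illustration.

# Gradient descent on f(w) = w^2 (gradient 2w) from w = 4.0.
# A tiny rate converges slowly; a moderate one quickly; for this
# function any rate above 1.0 makes the steps overshoot and diverge.
def descend(eta, w=4.0, steps=20):
    for _ in range(steps):
        w = w - eta * 2 * w     # w <- w - eta * f'(w)
    return w

for eta in (0.01, 0.1, 1.1):
    print(f"eta={eta}: w after 20 steps = {descend(eta):.4g}")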
In Matlab, select “Momentum Backpropagation” from Demos -> Toolboxes -> Neural Networks (upper
left window).
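For reference, here is a hedged sketch of the momentum update the demo animates, in the same toy setting as the sketch above; the momentum coefficient 0.9 and the quadratic loss are illustrative assumptions.

# Momentum update (sketch):
#   v <- alpha * v - eta * gradient;  w <- w + v
# The velocity v accumulates past gradients, damping oscillations and
# speeding up progress along shallow directions of the error surface.
def descend_momentum(eta=0.1, alpha=0.9, w=4.0, steps=20):
    v = 0.0
    for _ in range(steps):
        grad = 2 * w            # gradient of f(w) = w^2
        v = alpha * v - eta * grad
        w = w + v
    return w

print(descend_momentum())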
Exercise 7. Generalization
In Matlab, type nnd11gn or select “generalization” from Demos -> Toolboxes -> Neural Networks (upper
left window).
Try a simple function (i.e. choose a small difficulty index) with too many hidden neurons. What is the
reason for the poor generalization?
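The effect can also be reproduced in code. Assuming scikit-learn is available, the sketch below fits a small and an oversized hidden layer to a few noisy samples of a simple function; the data set, network sizes, and noise level are all illustrative. With so few points and no regularisation, the larger network will typically fit the noise and score worse on the test data.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))              # few noisy training points
y = np.sin(X).ravel() + rng.normal(0, 0.1, 30)    # simple target + noise

X_test = np.linspace(-3, 3, 200).reshape(-1, 1)
y_test = np.sin(X_test).ravel()

for n_hidden in (3, 100):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation='tanh',
                       max_iter=20000, random_state=0)
    net.fit(X, y)
    err = np.mean((net.predict(X_test) - y_test) ** 2)
    print(f"{n_hidden} hidden units: test MSE = {err:.4f}")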
Follow the convolution example on this website to see how the convolved features are computed:
http://deeplearning.stanford.edu/tutorial/supervised/FeatureExtractionUsingConvolution/
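As a companion to that page, here is a minimal NumPy sketch of a “valid” 2-D convolution; the toy image and filter are arbitrary. Note that true convolution flips the kernel first, while many CNN implementations skip the flip and compute cross-correlation instead.

import numpy as np

def convolve2d_valid(image, kernel):
    """'Valid' 2-D convolution: slide the flipped kernel over the image
    and take the dot product at each position."""
    k = np.flipud(np.fliplr(kernel))     # flip for true convolution
    H, W = image.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * k)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)   # toy 5x5 "image"
kernel = np.array([[1., 0.], [0., -1.]])           # toy 2x2 filter
print(convolve2d_valid(image, kernel))             # 4x4 feature map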
Follow the backpropagation walkthrough example prepared by Josh Stretton and available from Canvas; it
is an extended version of the example from the first lecture on slides 33-37. The goal is to understand the
forward and backward passes of the backpropagation algorithm and how the weights are updated.
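If you want to check a hand computation against code while following the walkthrough, the sketch below runs one forward and one backward pass through a 2-2-1 sigmoid network and prints every intermediate quantity. All numbers in it are made up for illustration and are not the walkthrough’s values; biases are omitted for brevity.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta = 0.5
x  = np.array([0.1, 0.9]); t = 0.9      # one training example (assumed)
W1 = np.array([[0.2, -0.1],
               [0.4,  0.3]])            # input -> hidden weights
W2 = np.array([0.5, -0.4])              # hidden -> output weights

# Forward pass
net1 = W1 @ x;  a1 = sigmoid(net1)      # hidden nets and activations
net2 = W2 @ a1; a2 = sigmoid(net2)      # output net and activation
print("hidden activations:", a1, " output:", a2)

# Backward pass (squared error; sigmoid derivative is a(1-a))
delta2 = (t - a2) * a2 * (1 - a2)           # output delta
delta1 = (W2 * delta2) * a1 * (1 - a1)      # hidden deltas
print("output delta:", delta2, " hidden deltas:", delta1)

# Weight updates
W2 = W2 + eta * delta2 * a1
W1 = W1 + eta * np.outer(delta1, x)
print("updated W2:", W2)
print("updated W1:", W1)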