Backpropagation Algorithm
Key Points
● One of the most important developments in neural networks.
● Provides a systematic procedure for updating the weights in a Backpropagation Network (BPN).
● Uses the gradient descent method for weight updates (see the update rule below).
● The error is propagated backward into the hidden layers.
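For reference, the gradient descent update referred to above is conventionally written as follows (standard notation, assumed here rather than taken from these notes: η is the learning rate, E the error function, and w_ij an individual weight):

\[
  w_{ij} \leftarrow w_{ij} - \eta \, \frac{\partial E}{\partial w_{ij}}
\]

That is, each weight is nudged in the direction that reduces the error, with η controlling the step size.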
Stages of Backpropagation
1. Initialization
Initial weights are assigned to all the neurons.
2. Forward Propagation
Inputs from a training set are passed through the neural network, and an output is computed.
3. Error Function
An error function is defined, which captures the difference between the correct output and the model's output, based on the current weights.
4. Backpropagation
The objective is to update the weights so as to minimize the error function. This is done using the gradient of the error with respect to each weight (a sketch of all four stages follows below).
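A minimal sketch of the four stages, assuming a tiny 2-2-1 network with sigmoid activations and a squared-error function; the network size, activation choice, sample values, and variable names are illustrative assumptions, not part of the original notes:

# Minimal sketch of the four stages for a 2-2-1 network with sigmoid units
# and a squared-error loss. All sizes and sample values are assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Stage 1: Initialization - assign initial weights (and biases) to all neurons.
W1 = rng.normal(scale=0.5, size=(2, 2))   # input -> hidden weights
b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1))   # hidden -> output weights
b2 = np.zeros(1)

x = np.array([0.05, 0.10])                # one training input (assumed values)
t = np.array([0.01])                      # its target output (assumed value)
eta = 0.5                                 # learning rate

# Stage 2: Forward propagation - pass the input through the network.
h = sigmoid(x @ W1 + b1)                  # hidden-layer activations
y = sigmoid(h @ W2 + b2)                  # network output

# Stage 3: Error function - squared error between target and output.
E = 0.5 * np.sum((t - y) ** 2)

# Stage 4: Backpropagation - gradient of E w.r.t. each weight, then
# a gradient-descent update of every weight and bias.
delta_out = (y - t) * y * (1 - y)             # error term at the output layer
delta_hid = (delta_out @ W2.T) * h * (1 - h)  # error propagated into the hidden layer

W2 -= eta * np.outer(h, delta_out)
b2 -= eta * delta_out
W1 -= eta * np.outer(x, delta_hid)
b1 -= eta * delta_hid

Repeating the forward pass, error computation, and backward pass over the training examples is what drives the error down.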
Advantages of Backpropagation
● Fast, simple, and easy to implement.
● No parameters to tune apart from the number of inputs.
● Flexible: it doesn't require prior knowledge of the function.
● Works well as a standard method.
● Doesn't require any special knowledge of the features of the function to be learned.
Disadvantages of Backpropagation
● Performance depends on the quality of input data
● Sensitive to noisy data
Numerical Example:
The worked example runs through the algorithm on concrete numbers in two phases:
● Forward pass: compute each neuron's output from the inputs and the current weights.
● Backward pass: propagate the error back and update each weight, where η (eta) is the learning rate.
Calculate the remaining weights in the same way, and keep repeating the forward and backward passes until the error is small enough.
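The sketch below illustrates, under assumed values, the kind of calculation the example outlines: a single sigmoid neuron trained on one (input, target) pair, with the forward pass, the error, the backward pass, and the η-scaled weight update repeated for a few steps. All numbers (input, target, initial weight and bias, and η = 0.5) are made up for illustration:

# Single sigmoid neuron trained on one (input, target) pair.
# Every numeric value here is an assumption chosen for illustration.
import math

x, t = 1.0, 0.0        # training input and target output (assumed)
w, b = 0.6, 0.0        # initial weight and bias (assumed)
eta = 0.5              # learning rate (the "eta" mentioned in the notes)

for step in range(3):
    # Forward pass: compute the neuron's output for the current weights.
    y = 1.0 / (1.0 + math.exp(-(w * x + b)))

    # Error: squared difference between target and output.
    E = 0.5 * (t - y) ** 2

    # Backward pass: gradient of E with respect to w and b, then the
    # update rule w <- w - eta * dE/dw (and likewise for the bias).
    dE_dw = (y - t) * y * (1 - y) * x
    dE_db = (y - t) * y * (1 - y)
    w -= eta * dE_dw
    b -= eta * dE_db

    print(f"step {step}: output {y:.4f}, error {E:.4f}, updated w {w:.4f}")

In a multi-layer network the same update is applied to every weight in every layer, and the forward and backward passes are repeated until the error falls below a chosen threshold.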