
1. Forward Propagation

Definition: Forward propagation is the process of passing inputs through the network to obtain
an output. This involves calculating the weighted sum of inputs, applying activation functions at
each layer, and finally producing a prediction or output from the network.
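In symbols (notation chosen here for illustration, not taken from the lecture), a single layer computes z = W·x + b followed by a = f(z), where x is the input vector, W the weight matrix, b the bias vector, and f the activation function; the activations a then become the input to the next layer.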

Steps in Forward Propagation:

1. Input Layer to Hidden Layer:
○ Take the input data and multiply it by the weights of the connections to the hidden layer neurons.
○ Add the biases associated with the hidden layer neurons.
○ Apply an activation function (e.g., ReLU, sigmoid) to the result to introduce non-linearity, making the network capable of learning complex patterns.
2. Hidden Layers to Output Layer:
○ The output from each hidden layer neuron is passed forward to the output layer, following the same process of weighted summation and activation.
○ The final result is the network’s output or prediction (a short code sketch of these steps follows this list).
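A minimal NumPy sketch of these two steps for a network with one hidden layer and sigmoid activations. The helper names (sigmoid, forward), the layer sizes, and the random initialization are illustrative assumptions, not part of the lecture:

import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes values into (0, 1) to add non-linearity
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    # Input layer -> hidden layer: weighted sum plus bias, then activation
    z1 = W1 @ x + b1
    a1 = sigmoid(z1)
    # Hidden layer -> output layer: same pattern of weighted sum and activation
    z2 = W2 @ a1 + b2
    y_hat = sigmoid(z2)
    return z1, a1, z2, y_hat

# Example with arbitrary sizes: 3 inputs, 4 hidden neurons, 1 output
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)
_, _, _, y_hat = forward(x, W1, b1, W2, b2)
print("prediction:", y_hat)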

Purpose of Forward Propagation:

● In forward propagation, the network is essentially making a prediction based on the current weights and biases. The prediction is then compared with the actual target to calculate the error (loss), which is used in backpropagation to adjust the weights and biases.
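For example, with Mean Squared Error (a loss chosen here purely for illustration) and continuing the sketch above, that comparison could look like:

y_true = np.array([1.0])                # the actual target (made-up value)
loss = np.mean((y_hat - y_true) ** 2)   # Mean Squared Error between prediction and target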

2. Backpropagation

Definition: Backpropagation (short for “backward propagation of errors”) is the process of adjusting the network's weights and biases in order to minimize the error between the predicted output and the actual target. This is done by propagating the error backward through the network, from the output layer to the input layer.

Steps in Backpropagation:

1. Calculate the Error (Loss):
○ After forward propagation, calculate the loss or error (e.g., using Mean Squared Error for regression, Cross-Entropy Loss for classification) based on the difference between the network's prediction and the actual target.
2. Compute Gradients:
○ Use calculus (specifically, partial derivatives and the chain rule) to calculate the gradient of the loss with respect to each weight and bias in the network. The gradient shows how much each parameter affects the error.
3. Update Weights and Biases:
○ Using a process called gradient descent, adjust each weight and bias in the network by moving in the opposite direction of its gradient to reduce the error. The step size of these updates is controlled by a parameter called the learning rate.
4. Propagation Through Layers:
○ Backpropagation proceeds layer by layer, calculating gradients for each layer starting from the output layer and moving backward through each hidden layer to the input layer. This is why it’s called “backpropagation.” (A short code sketch of these steps follows this list.)
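A minimal sketch of one backpropagation step for the two-layer network from the forward-pass example above, reusing its forward, sigmoid, x, W1, b1, W2, b2. The function name backprop_step and the choice of Mean Squared Error are illustrative assumptions, not from the lecture:

def backprop_step(x, y_true, W1, b1, W2, b2, lr=0.1):
    # Step 1: forward pass, then measure the error with Mean Squared Error
    z1, a1, z2, y_hat = forward(x, W1, b1, W2, b2)
    loss = np.mean((y_hat - y_true) ** 2)

    # Steps 2 and 4: gradients via the chain rule, moving backward from the
    # output layer to the hidden layer
    dL_dyhat = 2 * (y_hat - y_true) / y_hat.size   # derivative of the MSE loss
    delta2 = dL_dyhat * y_hat * (1 - y_hat)        # through the output sigmoid
    grad_W2, grad_b2 = np.outer(delta2, a1), delta2
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)       # error pushed back to the hidden layer
    grad_W1, grad_b1 = np.outer(delta1, x), delta1

    # Step 3: gradient descent update; lr is the learning rate (step size)
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    return loss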

Purpose of Backpropagation:

● Backpropagation enables the network to learn from its mistakes by updating weights and
biases, which reduces the overall error. This is the process through which the network
“learns” from the data and improves its accuracy over multiple training iterations
(epochs).
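For instance, repeating that single update over many epochs (continuing the same illustrative sketch, with y_true from the loss example above) is what gradually drives the error down:

for epoch in range(200):
    loss = backprop_step(x, y_true, W1, b1, W2, b2, lr=0.1)
print("loss after training:", loss)   # the loss typically shrinks as the parameters are updated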

Why Forward and Backpropagation Matter

● Forward propagation is how the network makes predictions based on current parameters.
● Backpropagation is how the network learns from errors in its predictions to improve those predictions over time.
