SUPERVISED LEARNING IN NEURAL NETWORKS
PERCEPTRON NETWORK

[Figure: a perceptron — inputs x1, ..., xn with weights w1, ..., wn, plus a bias weight w0 on a fixed input x0 = 1, feed a summation unit whose thresholded result is the output o.]

$$
o = f(x_1, \ldots, x_n) =
\begin{cases}
1 & \text{if } \sum_{i=0}^{n} w_i x_i > 0 \\
-1 & \text{otherwise}
\end{cases}
\qquad (x_0 = 1)
$$
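A minimal sketch of this output function in Python (the function and variable names are illustrative, not from the source; the bias weight w0 is applied to a fixed input x0 = 1):

```python
def perceptron_output(w, x):
    """Bipolar perceptron output: 1 if the weighted net input exceeds 0, else -1.

    w: weights [w0, w1, ..., wn], where w0 is the bias weight (fixed input x0 = 1)
    x: inputs [x1, ..., xn]
    """
    net = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if net > 0 else -1
```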
The perceptron learning rule adjusts each weight as:

$$w_i \leftarrow w_i + \Delta w_i, \qquad \Delta w_i = \eta\,(t - o)\,x_i$$

where
$t = c(\mathbf{x})$ is the target value,
$o$ is the perceptron output,
$\eta$ is a small constant (e.g., 0.1) called the learning rate.
Error: The error is the amount by which the network's output differs from the target value, i.e., Error = t − o. For example, if we required the network to output 0 and it outputs 1, then Error = −1.
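As a sketch, one application of the rule above could look like this in Python (it reuses perceptron_output from the earlier sketch; eta plays the role of the learning rate η):

```python
def perceptron_update(w, x, t, eta=0.1):
    """One step of the perceptron learning rule: w_i <- w_i + eta * (t - o) * x_i."""
    o = perceptron_output(w, x)          # current perceptron output
    error = t - o                        # target minus output, e.g. 0 - 1 = -1
    w[0] += eta * error                  # bias weight update (its input x0 = 1)
    for i, xi in enumerate(x, start=1):
        w[i] += eta * error * xi
    return w, error
```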
Use a set of sample patterns for which the desired output (given the inputs presented) is known.
[Figure: a single-layer network — input signals (external stimuli) enter the input layer, adjustable weights connect it to the output layer, which produces the output values.]
The Adaline network uses the Delta Learning Rule, also called the Widrow-Hoff Learning Rule or the Least Mean Square (LMS) Rule. The delta rule for adjusting the weights is given as (for i = 1 to n):

$$\Delta w_i = \eta\,(t - y_{in})\,x_i$$

where $y_{in} = b + \sum_i w_i x_i$ is the net input to the output unit.
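A sketch of this update in Python; unlike the perceptron rule, the error is measured against the raw net input y_in rather than a thresholded output (the names and the separate bias term b are illustrative assumptions):

```python
def adaline_update(w, b, x, t, eta=0.1):
    """Delta (LMS / Widrow-Hoff) rule: w_i <- w_i + eta * (t - y_in) * x_i."""
    y_in = b + sum(wi * xi for wi, xi in zip(w, x))   # net input to the output unit
    error = t - y_in
    b += eta * error                                  # bias update (its input is 1)
    w = [wi + eta * error * xi for wi, xi in zip(w, x)]
    return w, b, error
```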
Training
• Feed in known inputs in random sequence
• Simulate the network
• Compute the error between the target output and the actual output (error function)
• Adjust the weights (learning function)
• Repeat until total error < ε (see the loop sketched below)
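The loop below sketches this training procedure for an Adaline, reusing adaline_update from above (eps, max_epochs, and the (x, t) sample format are assumptions for illustration):

```python
import random

def train(w, b, samples, eta=0.1, eps=0.01, max_epochs=1000):
    """Train until the total squared error of one epoch falls below eps."""
    for _ in range(max_epochs):
        random.shuffle(samples)          # feed known inputs in random sequence
        total_error = 0.0
        for x, t in samples:             # simulate, compute error, adjust weights
            w, b, err = adaline_update(w, b, x, t, eta)
            total_error += err ** 2      # accumulate squared error
        if total_error < eps:            # repeat until total error < epsilon
            break
    return w, b
```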
Thinking
• Simulate the network
• The network will respond to any input
• Does not guarantee a correct solution, even for trained inputs
• The initial weights are taken to be w1 = w2 = b = 0.1, and the learning rate η is also 0.1.
• For the first input sample, x1 = 1, x2 = 1, t = 1, we calculate the net input as
  $y_{in} = b + x_1 w_1 + x_2 w_2 = 0.1 + (1)(0.1) + (1)(0.1) = 0.3$
  (this step is reproduced in code below).
• Summing up the squared errors obtained for each input sample during one epoch gives the total mean square error of the epoch.
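Reproducing the first update step in code (values taken from the example above; the update follows the delta-rule sketch):

```python
w, b, eta = [0.1, 0.1], 0.1, 0.1        # initial weights, bias, learning rate
x, t = [1, 1], 1                        # first input sample and its target
y_in = b + sum(wi * xi for wi, xi in zip(w, x))      # 0.1 + 0.1 + 0.1 = 0.3
error = t - y_in                                     # 1 - 0.3 = 0.7
w = [wi + eta * error * xi for wi, xi in zip(w, x)]  # [0.17, 0.17]
b = b + eta * error                                  # 0.17
```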
[Figure: a multilayer feedforward network — inputs I0-I3 are fully connected to hidden units h0-h2, which are fully connected to outputs o0 and o1.]
MULTILAYER FEEDFORWARD NETWORK:
ACTIVATION AND TRAINING
For feedforward networks:
• A continuous activation function can be differentiated, allowing gradient descent.
• Backpropagation is an example of a gradient-descent technique.
• Uses a sigmoid (binary or bipolar) activation function.
Gradient descent will find a local, not necessarily global, error minimum; in practice it often works well (and can be invoked multiple times with different initial weights).
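A minimal gradient-descent/backpropagation sketch for one hidden layer with a binary sigmoid activation (the single-output layout, learning rate, and omission of biases are simplifying assumptions, not the source's exact formulation):

```python
import math

def sigmoid(z):
    """Binary sigmoid; its derivative at output s is s * (1 - s)."""
    return 1.0 / (1.0 + math.exp(-z))

def backprop_step(x, t, W1, W2, eta=0.5):
    """One gradient-descent step on squared error.
    W1: one weight row per hidden unit; W2: weights of the single output unit."""
    # Forward pass
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    o = sigmoid(sum(w * hi for w, hi in zip(W2, h)))
    # Backward pass: propagate the error using the sigmoid derivative
    delta_o = (t - o) * o * (1 - o)
    delta_h = [hi * (1 - hi) * W2[j] * delta_o for j, hi in enumerate(h)]
    # Gradient-descent weight updates
    W2 = [w + eta * delta_o * hi for w, hi in zip(W2, h)]
    W1 = [[w + eta * delta_h[j] * xi for w, xi in zip(row, x)]
          for j, row in enumerate(W1)]
    return W1, W2, o
```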
Applications:
• Image processing
• Signature verification
• Bioinformatics
Supervised neural network models include:
• Perceptron
• Adaline
• Madaline
• Backpropagation Network
• Radial Basis Function Network
Apart from those mentioned above, there are several other supervised neural networks, such as tree neural networks, wavelet neural networks, functional link neural networks, and so on.