The document discusses artificial neural networks in Matlab. It covers the architecture of single neurons and networks with multiple layers. It provides examples of implementing AND, OR, NAND, and NOR logic gates using perceptrons in Matlab and shows the initial and final weights and biases. It also introduces backpropagation and linear filters for time series data that can be implemented in Matlab neural networks.


Artificial Neural Network in Matlab
Hany Ferdinando
Architecture (single neuron)

W is the weight matrix, dimension 1×R
p is the input vector, dimension R×1
b is the bias (a scalar)

a = f(Wp + b), where f is the transfer function
Neural Network in Matlab 2
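The single-neuron computation a = f(Wp + b) can be sketched in a few lines of NumPy; the weights, input, and bias below are illustrative assumptions, with hardlim as the transfer function.

```python
import numpy as np

# Sketch of a single neuron: a = f(Wp + b) with the hardlim transfer
# function. All numeric values are illustrative assumptions.
W = np.array([[1.0, -2.0, 0.5]])     # 1 x R weight matrix (here R = 3)
p = np.array([[2.0], [1.0], [4.0]])  # R x 1 input vector
b = -1.0                             # scalar bias

n = W @ p + b                        # net input (1 x 1 matrix)
a = (n >= 0).astype(int)             # hardlim: 1 if n >= 0, else 0
print(n, a)                          # net input 1.0, output 1
```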
Transfer Function

(figure: plots of the available transfer functions)


Architecture (layer of S neurons)

W is the weight matrix, dimension S×R
p is the input vector, dimension R×n
b is the bias vector
Multiple layers

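In a multiple-layer network the output of each layer feeds the next: a2 = f2(W2·a1 + b2) with a1 = f1(W1·p + b1). A minimal NumPy sketch of a two-layer forward pass, with all sizes and values chosen as illustrative assumptions:

```python
import numpy as np

# Two-layer forward pass: a1 = f1(W1 p + b1), a2 = f2(W2 a1 + b2).
# Layer 1 uses log-sigmoid, layer 2 is linear; values are assumptions.
logsig = lambda n: 1.0 / (1.0 + np.exp(-n))   # log-sigmoid transfer
purelin = lambda n: n                          # linear transfer

W1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]])   # layer 1 weights: S1 x R = 2 x 3
b1 = np.array([[0.0], [0.0]])      # layer 1 biases: S1 x 1
W2 = np.array([[2.0, 2.0]])        # layer 2 weights: S2 x S1 = 1 x 2
b2 = np.array([[-1.0]])            # layer 2 bias

p  = np.array([[0.0], [0.0], [5.0]])   # R x 1 input
a1 = logsig(W1 @ p + b1)               # layer 1 output (2 x 1)
a2 = purelin(W2 @ a1 + b2)             # network output (1 x 1)
print(a2)                              # [[1.]]
```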


Perceptrons in Matlab

Create a perceptron with net = newp(PR,S,TF,LF)

PR = R×2 matrix of min and max values for the R input elements
S = number of neurons (one output per neuron)
TF = transfer function, default = ‘hardlim’, other option = ‘hardlims’
LF = learning function, default = ‘learnp’, other option = ‘learnpn’

hardlim = hard-limit function
hardlims = symmetric hard-limit function

learnp: ΔW = (t − a)p^T = ep^T
learnpn: normalized learnp

W_new = W_old + ΔW, b_new = b_old + e, where e = t − a
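The learnp rule above is easy to replicate outside the toolbox. The NumPy sketch below applies W_new = W_old + e·p^T, b_new = b_old + e sample by sample to the AND-gate data used on the next slides; the loop structure and variable names are assumptions, not toolbox code. With zero initial weights it converges to W = [2 1], b = −3, the same final values the AND slide reports.

```python
import numpy as np

# Illustrative sketch of the perceptron rule (hardlim transfer,
# W_new = W_old + e*p^T, b_new = b_old + e). Not toolbox code.
P = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]])         # inputs, one column per sample
T = np.array([0, 0, 0, 1])           # AND-gate targets

W = np.zeros((1, 2))                 # zero initial weights, as in newp
b = 0.0                              # zero initial bias

hardlim = lambda n: (n >= 0).astype(int)   # hard-limit transfer

for epoch in range(20):              # mirrors net.trainParam.epochs = 20
    for q in range(P.shape[1]):
        p = P[:, q:q+1]              # current input column (R x 1)
        a = hardlim(W @ p + b)[0, 0] # neuron output
        e = T[q] - a                 # error e = t - a
        W = W + e * p.T              # learnp weight update
        b = b + e                    # bias update

print(W, b)                          # [[2. 1.]] -3.0
print(hardlim(W @ P + b))            # reproduces T: [[0 0 0 1]]
```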
Compute manually…

This is an exercise in running an artificial neural network. Starting with the next problem, we will also compute the weights and biases manually.


AND Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [0 0 0 1];

net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}

net.trainParam.epochs = 20;
net = train(net,P,T);

weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

(training plot: performance reaches the goal of 0 after 6 epochs)

weight_init = [0 0], bias_init = 0
weight_final = [2 1], bias_final = -3
OR Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [0 1 1 1];

net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}

net.trainParam.epochs = 20;
net = train(net,P,T);

weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

(training plot: performance reaches the goal of 0 after 4 epochs)

weight_init = [0 0], bias_init = 0
weight_final = [1 1], bias_final = -1
NAND Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [1 1 1 0];

net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}

net.trainParam.epochs = 20;
net = train(net,P,T);

weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

(training plot: performance reaches the goal of 0 after 6 epochs)

weight_init = [0 0], bias_init = 0
weight_final = [-2 -1], bias_final = 2
NOR Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [1 0 0 0];

net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}

net.trainParam.epochs = 20;
net = train(net,P,T);

weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

(training plot: performance reaches the goal of 0 after 4 epochs)

weight_init = [0 0], bias_init = 0
weight_final = [-1 -1], bias_final = 0
Backpropagation in Matlab

Create a backpropagation (feed-forward) network with

net = newff(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF)

PR = R×2 matrix of min and max values for the R input elements
Si = number of neurons in layer i
TFi = transfer function of layer i (any differentiable transfer function can be used)
BTF = backpropagation training function
BLF = backpropagation weight/bias learning function
PF = performance function

The basic weight update is gradient descent: x(k+1) = x(k) − α(k)·g(k), where g(k) is the current gradient and α(k) the learning rate.
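The update x(k+1) = x(k) − α(k)·g(k) can be demonstrated on a simple one-dimensional objective; the quadratic function, step size, and iteration count below are illustrative assumptions, not part of the toolbox.

```python
import numpy as np

# Minimal sketch of the gradient-descent update x(k+1) = x(k) - a*g(k)
# that underlies backpropagation training. Objective is an assumption.
def gradient_descent(grad, x0, alpha=0.1, steps=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - alpha * grad(x)   # step against the gradient
    return x

# f(x) = (x - 3)^2 has gradient g(x) = 2(x - 3) and minimum at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=[0.0])
print(x_min)   # approaches [3.]
```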


Linear Filter (with ANN) in Matlab

Create a linear filter with net = newlin(PR,S,ID,LR)

PR = R×2 matrix of min and max values for the R input elements
S = number of neurons (outputs)
ID = input delay vector (tapped delay line)
LR = learning rate

The only transfer function for a linear filter is the linear function (purelin).
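A linear neuron fed through a tapped delay line computes a(t) = Σᵢ wᵢ·p(t−i) + b, i.e. an FIR filter. The NumPy sketch below illustrates this; the weights and input signal are assumptions, not from the slides.

```python
import numpy as np

# Illustrative sketch: a linear neuron with input delays (as built by
# newlin with a delay vector) is an FIR filter plus a bias.
def linear_filter(p, w, b=0.0):
    # a(t) = sum_i w[i] * p(t - i) + b, taking p(t) = 0 for t < 0
    a = np.zeros(len(p))
    for t in range(len(p)):
        for i, wi in enumerate(w):
            if t - i >= 0:
                a[t] += wi * p[t - i]
        a[t] += b
    return a

p = np.array([1.0, 2.0, 3.0, 4.0])       # input time series (assumed)
print(linear_filter(p, w=[0.5, 0.25]))   # [0.5 1.25 2. 2.75]
```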
