
Chapter 7: Supervised Hebbian Learning

Brandon Morgan
1/13/2021

E7.2
In Problem E7.1, our output matched the input $p_2$ even though the input vectors were not orthogonal. Simple Hebb's rule does not guarantee that the output will match an input vector when the inputs are not orthogonal. We can fix this problem by using the Pseudoinverse Rule.
With simple Hebb's rule, $W = TP^T$; with the Pseudoinverse Rule we instead compute $W = TP^+$, where $P^+$ is the Moore-Penrose pseudoinverse. When $P$ has more rows than columns (and its columns are linearly independent), it is given by $P^+ = (P^T P)^{-1} P^T$.
Before, our $P$ matrix was given by:

$$P = \begin{bmatrix} -1 & 1 \\ -1 & 1 \\ 1 & -1 \\ 1 & 1 \end{bmatrix}$$

Now, our $P^+$ is calculated to be:

# Prototype input patterns from E7.1
p1 = matrix(c(-1, -1, 1, 1), ncol=1)
p2 = matrix(c(1, 1, -1, 1), ncol=1)

# Collect the prototypes as the columns of P
P = matrix(c(p1, p2), ncol=2)
P

## [,1] [,2]
## [1,] -1 1
## [2,] -1 1
## [3,] 1 -1
## [4,] 1 1

# Moore-Penrose pseudoinverse: (P'P)^{-1} P'
pseudo = solve(t(P) %*% P) %*% t(P)
pseudo

##            [,1]       [,2]       [,3] [,4]
## [1,] -0.1666667 -0.1666667  0.1666667  0.5
## [2,]  0.1666667  0.1666667 -0.1666667  0.5
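
As a quick sanity check (a sketch that assumes the MASS package is available; it is not used elsewhere in this solution), MASS::ginv() computes the Moore-Penrose pseudoinverse directly and should agree with the result above:

# Optional check against a library implementation of the pseudoinverse
library(MASS)
ginv(P)                    # should match `pseudo` up to rounding
all.equal(ginv(P), pseudo) # expected: TRUE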

Now, our weights are calculated by $W = TP^+$. Since this is an autoassociative memory, the targets are the prototype patterns themselves, i.e. $T = P$:

# Pseudoinverse rule with T = P
W = P %*% pseudo
W

##            [,1]       [,2]       [,3] [,4]
## [1,]  0.3333333  0.3333333 -0.3333333    0
## [2,]  0.3333333  0.3333333 -0.3333333    0
## [3,] -0.3333333 -0.3333333  0.3333333    0
## [4,]  0.0000000  0.0000000  0.0000000    1
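
Because $P^+P = I$ when the columns of $P$ are linearly independent, this weight matrix reproduces the stored prototypes exactly; a quick check (not part of the original solution):

# The pseudoinverse rule recalls the training patterns exactly
W %*% p1 # expected: (-1, -1, 1, 1) = p1
W %*% p2 # expected: ( 1,  1, -1, 1) = p2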

Now, we can test our new input pattern $p_t = [1, 1, 1, 1]^T$:

# New test pattern
pt = matrix(c(1, 1, 1, 1), ncol=1)
W %*% pt

## [,1]
## [1,] 0.3333333
## [2,] 0.3333333
## [3,] -0.3333333
## [4,] 1.0000000

The hardlims function assigns $-1$ to every entry with $(Wp_t)_i < 0$ and $+1$ to every entry with $(Wp_t)_i \ge 0$. Thus, our output vector is $a = [1, 1, -1, 1]^T = p_2$, the same result predicted in E7.1.
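
hardlims is not defined in the R session above; a minimal sketch of it (assuming the usual threshold at zero) makes the final output explicit:

# hardlims: -1 for negative entries, +1 for entries >= 0
hardlims = function(n) ifelse(n < 0, -1, 1)
a = hardlims(W %*% pt)
a # expected: (1, 1, -1, 1), i.e. p2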
The error can be measured by the Hamming distance (Ch. 3), the number of entries in which two vectors differ; for $\pm 1$ (hardlims) vectors this is the number of nonzero entries in their difference. In our case, $p_t$ has a Hamming distance of 1 from $p_2$ and of 2 from $p_1$; the network recalled the closer prototype, so it correctly classified the input.
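
For $\pm 1$ vectors this check is one line in R (the helper name hamming is hypothetical, introduced only for illustration):

# Hamming distance: number of positions in which two vectors differ
hamming = function(x, y) sum(x != y)
hamming(pt, p2) # 1
hamming(pt, p1) # 2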
