
Hopfield Network: Notes by Dr. B. Anuradha

The document discusses Hopfield networks, a type of recurrent neural network. It notes that Hopfield networks were formulated in 1982 by John Hopfield to emulate the associative characteristics of human memory. The key aspects are that Hopfield networks use feedback loops to stabilize into states that represent stored patterns, even when presented with incomplete or corrupted inputs. They gradually converge to the closest matching stored pattern through an iterative process analogous to a ball rolling into the nearest valley basin. This allows Hopfield networks to function as content-addressable memory.


Hopfield Network
Notes by Dr. B. Anuradha

12/08/21 1
• Neural networks were designed in analogy with the brain. The brain's memory, however, works by association: we can recognise a familiar face even in an unfamiliar environment within 100–200 ms, and the brain routinely associates one thing with another.
• To emulate the associative characteristics of human memory, we need a different type of network: a recurrent neural network.

• A recurrent neural network has feedback loops from its outputs to its inputs.
• The stability of recurrent networks intrigued several researchers in the 1960s and 1970s. However, none was able to predict which networks would be stable, and some researchers were pessimistic about finding a solution at all.
• The problem was solved only in 1982, when John Hopfield formulated the principle of storing information in a dynamically stable network.
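Hopfield's storage principle can be sketched in a few lines. This is a minimal illustration, assuming the standard Hebbian (outer-product) learning rule and bipolar (+1/−1) patterns; the two patterns below are made up for the example:

```python
import numpy as np

def train_hopfield(patterns):
    """Build the weight matrix W = sum_p x_p x_p^T (Hebbian rule).
    The symmetric weights with a zero diagonal are what make the
    network's dynamics settle into stable states."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x)      # Hebbian outer-product learning
    np.fill_diagonal(W, 0)       # no unit feeds back directly onto itself
    return W

# Two hypothetical bipolar patterns to store
patterns = np.array([[1, -1, 1, -1],
                     [1, 1, -1, -1]])
W = train_hopfield(patterns)
```

Each stored pattern becomes a stable state of the network because the weights reinforce exactly those configurations.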

o Imagine a ball rolling down a valley.
o The bottom of the valley represents a pattern stored in the Hopfield net.
o Wherever the ball is initially placed, it rolls towards the nearest local minimum – this represents the Hopfield net iteratively computing the next network state.
o The ball eventually stops rolling at the bottom of the valley – this represents the stable state of the Hopfield network.
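The valley picture corresponds to the Hopfield energy function E(s) = −½ sᵀWs, which never increases under the update rule. The sketch below illustrates this with a hypothetical single stored pattern (the specific pattern and the corrupted state are assumptions for the example):

```python
import numpy as np

# One stored bipolar pattern and its Hebbian weight matrix (hypothetical)
stored = np.array([1, -1, 1, -1])
W = np.outer(stored, stored).astype(float)
np.fill_diagonal(W, 0)

def energy(W, s):
    # E(s) = -1/2 * s^T W s : the "height" of the landscape at state s
    return -0.5 * s @ W @ s

def update_all(W, s):
    # One sweep of asynchronous updates; each flip can only lower
    # (or preserve) the energy -- the ball rolling downhill
    s = s.copy()
    for i in range(len(s)):
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

noisy = np.array([-1, -1, 1, -1])  # stored pattern with one bit flipped
print(energy(W, noisy))            # higher up the valley wall
print(energy(W, update_all(W, noisy)))  # lower: at the valley bottom
```

After the sweep the state sits at the stored pattern, a local minimum of E from which further updates cause no change.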
Basin of Attraction and Stable States

Given a partial or corrupted pattern, the Hopfield network will eventually stabilise at the closest matching stored pattern.
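This pattern-completion behaviour can be sketched as follows; the 6-unit pattern and the single-bit corruption are hypothetical choices for the example, not taken from the notes:

```python
import numpy as np

def recall(W, s, max_sweeps=100):
    """Repeatedly update every unit until the state stops changing."""
    s = s.copy()
    for _ in range(max_sweeps):
        prev = s.copy()
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
        if np.array_equal(s, prev):   # reached a stable state
            return s
    return s

# Store one hypothetical pattern with the Hebbian rule
stored = np.array([1, 1, 1, -1, -1, -1])
W = np.outer(stored, stored)
np.fill_diagonal(W, 0)

noisy = stored.copy()
noisy[0] = -noisy[0]            # corrupt one bit
completed = recall(W, noisy)    # settles back to the stored pattern
```

This is exactly the content-addressable-memory behaviour described above: the network is addressed by (part of) the content itself rather than by a storage location.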

[Figure: worked example of a Hopfield network converging over several update steps, from the initial state through steps 1–6; the values shown (−2, +3, +3, +3, −1, −3) are the net inputs computed at the updated units in each step.]
