

Cognitive modeling is an area of computer science that deals with simulating human problem-solving and mental processing in a computerized model. Such a model can be used to simulate or predict human behavior or performance on tasks similar to the ones modeled, and to improve human-computer interaction.

Cognitive modeling is used in numerous artificial intelligence (AI) applications, such as expert systems, natural language processing, neural networks, and in robotics and virtual reality applications. Cognitive models are also used to improve products in areas such as human factors engineering and computer game and user interface design.

1

• Cognitive science studies how people form ideas and what makes thoughts logical. It is often seen as the result of several different scientific fields working together: psychology (the study of the mind), neuroscience (the biological study of the brain), computer science (the creation of programs and computers), and linguistics (the study of language). It does not refer to the sum of all these disciplines, but to their intersection on specific problems.

2

• When classifying long-term memories, we end up with two major groups: one is Declarative or Explicit Memory, and the other is Non-Declarative or Implicit Memory. Here we discuss declarative memory in detail. This class is further subdivided into Semantic Memory and Episodic Memory.

3

Semantic memory
Semantic memory focuses on factual and conceptual knowledge about the world and the way it is expressed in terms of words. So, basically, it supports the ability to interact in terms of language. This includes knowledge about the language itself and conceptual information; general world knowledge also counts here.

4

Episodic memory
• Episodic memory focuses on the events of one's life that the person has experienced throughout its phases. These memories are stored in one's limbic system. Episodic memory involves events from one's own perspective and will surely not account for evident facts and figures. It also involves the two major components of an event: when the event occurred, and where.

5

• Examples of Semantic Memory

• While eating an apple, you recognize the apple as a fruit and, from your knowledge, can infer its importance.
• When listening to birds chirping near the window, you straight away identify the bird as a sparrow.
• Calculating the month's grocery budget through simple addition.
• Planning to eat your favourite cuisine at your favourite Chinese restaurant and to pay the bill on the spot for what you ate.
• Introducing yourself in terms of the known qualities a good person may possess.

6

• Examples of Episodic Memory

• The memory you made with your squad at a friend's wedding.
• The memory of what you ate for breakfast this morning.
• An unforgettable tragic memory of an accident you were in.
• The failed one-to-one session at a recent interview.

7

• How may episodic and semantic memory work in integration?

• Placing a particular episode in time can be explained through this integration. For example, some days you are not sure what day it is, but you know that it is your routine to have a check-up on Monday, so you conclude it must be Monday. Or you may fail to recognize one thing on its own, but integrating it with time and place helps you recall it.
• Though episodic memories are particularly about when and how events happened, they need not involve remembering the experience itself. For example, you know you were born on 15th September in London, but you do not remember the experience.

8

• 3Rs associated with memory-making

• Sometimes people fail to make memories: they are not able to recognize or recall things that took place even a few seconds earlier. It is mostly seen in the rare case of herpes encephalitis, a viral infection.

9

• A few memories are stored differently and get into you through the automated processes you go through. Memory is defined as learning that has persisted over time: it has been stored and can be recalled. To access your memory, you need to consider the 3Rs: recall, recognition and relearning.

10

• Recall is the retrieval of information a person has learnt earlier, like recalling that the mango is the king of all fruits. Recognition is identifying the related information and eliminating the odd one out: from a list of mango, orange, jasmine and banana, you will surely exclude jasmine. Relearning is reinforcing information you have already learnt, like learning mathematics formulae and then revising them. All three work through synaptic connections within our brains.

11

• Memory formation is broken down into three main stages:

• Sensory memory encoding. The immediate things we want to record are taken up as sensory input and then shuffled into short-term memory.
• Short-term memory formation. The incorporated memory stays there for only about 30 seconds without rehearsal, and the mind cannot hold more than about 7 items of information at a time.
• Long-term memory encoding. This is the durable storage compartment of the brain, where memories tend to stay for a long time. Short-term memory is consolidated into long-term memory through the principle of 'working memory', which involves achieving deeper cognition through auditory rehearsal and the manipulation of visual-spatial information (a toy sketch of this pipeline is given below).
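To make this pipeline concrete, here is a purely illustrative toy sketch in Python, not a validated cognitive model: the roughly 30-second span and the 7-item limit are taken from the bullet points above, while the class name, the rehearsal rule and all other details are assumptions made only for the example.

import time
from collections import deque

# Toy sketch of the three-stage pipeline described above. The 30-second span
# and 7-item capacity come from the slide; everything else is an assumption.
STM_CAPACITY = 7          # "can't remember beyond 7 bits of information"
STM_SPAN_SECONDS = 30     # short-term memory fades without rehearsal

class ToyMemory:
    def __init__(self):
        self.short_term = deque()     # items paired with the time they were encoded
        self.long_term = set()

    def sense(self, item):
        # Sensory input is encoded into short-term memory; the oldest item is
        # displaced once the capacity limit is exceeded.
        self.short_term.append((item, time.time()))
        if len(self.short_term) > STM_CAPACITY:
            self.short_term.popleft()

    def decay(self):
        # Items older than the span are forgotten unless already consolidated.
        now = time.time()
        self.short_term = deque(
            (i, t) for i, t in self.short_term if now - t < STM_SPAN_SECONDS
        )

    def rehearse(self, item):
        # Rehearsal (working memory) consolidates an item into the long-term store.
        if any(i == item for i, _ in self.short_term):
            self.long_term.add(item)

memory = ToyMemory()
for word in ["apple", "sparrow", "mango"]:
    memory.sense(word)
memory.decay()               # nothing is old enough to fade yet
memory.rehearse("mango")
print(memory.long_term)      # {'mango'}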

12

The dynamical systems approach to cognition is the theoretical framework within which this embodied view of cognition can be formalized. This chapter reviews the core concepts of the dynamical systems approach and illustrates them through a set of experimentally accessible examples. Particular attention is given to how cognition can be understood in terms that are compatible with principles of neural function, most prominently with the space-time continuity of neural processes. The chapter reviews efforts to form concepts based on the mathematical theory of dynamical systems into a rigorous scientific approach toward cognition that embraces the embodied and situated stance. The chapter explains how behavioral signatures of the neural field dynamics may provide evidence for the Dynamic Field Theory (DFT) account of cognition. The theoretical concept of stability, at the core of dynamical systems thinking, is the key to understanding autonomy.
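To make the notion of stability concrete, the following minimal sketch (an illustration of the general idea, not an example from the chapter) integrates a single activation variable under the kind of rate equation used in neural field models: whatever its initial condition, the state relaxes to the same attractor. The parameter values are arbitrary assumptions.

# Minimal illustration of stability: a single activation variable u relaxes to
# the attractor u* = h + s from any starting point. Parameter values are
# illustrative assumptions, not taken from the chapter.
tau = 0.1      # time constant
h = -5.0       # resting level
s = 3.0        # constant external input
dt = 0.01      # Euler integration step

for u0 in (-10.0, 0.0, 10.0):              # different initial conditions
    u = u0
    for _ in range(2000):
        u += (dt / tau) * (-u + h + s)     # tau * du/dt = -u + h + s
    print(f"start {u0:+6.1f} -> settles near {u:+.2f}")   # all end near -2.00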

13

• The term episodic memory refers to the ability to recall previously experienced events and to recognize things as having been encountered previously. Research on the neural basis of episodic memory has increasingly come to focus on three structures: the hippocampus, the perirhinal cortex, and the prefrontal cortex. This chapter reviews the Complementary Learning Systems (CLS) model and how it has been applied to understanding hippocampal and neocortical contributions to episodic memory. In addition to the biologically based models, there is a rich tradition of researchers building more abstract computational models of episodic memory. The chapter describes an abstract modeling framework, the Temporal Context Model (TCM), which has proved to be very useful in understanding how to selectively retrieve memories from a particular temporal context in free recall experiments. Episodic memory modeling has a long tradition of trying to build comprehensive models that can simultaneously account for multiple recall and recognition findings.
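As a loose illustration of the temporal-context idea behind such models (a toy sketch only, not the published TCM equations), the snippet below lets a context vector drift slowly while a list of items is studied, so that items studied close together in time are bound to similar context states; the dimensionality, drift rate and random item patterns are all assumptions made for the example.

import numpy as np

# Toy sketch of a slowly drifting temporal context. Items studied close
# together end up associated with similar context vectors, which is what a
# context cue can exploit at retrieval. All parameters are assumptions.
rng = np.random.default_rng(0)
dim, rho = 50, 0.9
beta = np.sqrt(1.0 - rho ** 2)

context = rng.normal(size=dim)
context /= np.linalg.norm(context)

study_contexts = []                       # context state at each study position
for position in range(10):
    item_input = rng.normal(size=dim)
    item_input /= np.linalg.norm(item_input)
    context = rho * context + beta * item_input    # slow drift of context
    context /= np.linalg.norm(context)
    study_contexts.append(context.copy())

similarity = lambda a, b: float(a @ b)
print("neighbouring items:", round(similarity(study_contexts[4], study_contexts[5]), 3))
print("distant items:     ", round(similarity(study_contexts[0], study_contexts[9]), 3))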

14

• Connectionist models, also known as Parallel Distributed Processing (PDP) models, are a class of computational models often used to model aspects of human perception, cognition, and behaviour, the learning processes underlying such behaviour, and the storage and retrieval of information from memory. The approach embodies a particular perspective in cognitive science, one that is based on the idea that our understanding of behaviour and of mental states should be informed and constrained by our knowledge of the neural processes that underpin cognition.
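As a minimal sketch of the connectionist style of computation (an illustration only, not any specific published PDP model), the snippet below runs a layer of simple units in parallel, each computing a weighted sum of a distributed input pattern passed through a nonlinearity; the layer sizes, random weights and sigmoid choice are assumptions made for the example.

import numpy as np

# Minimal connectionist-style layer: many simple units in parallel, each
# computing a weighted sum of its inputs passed through a nonlinearity.
# Sizes, weights and activation function are illustrative assumptions.
rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_inputs, n_units = 4, 3
weights = rng.normal(scale=0.5, size=(n_units, n_inputs))
biases = np.zeros(n_units)

pattern = np.array([1.0, 0.0, 1.0, 1.0])            # distributed input pattern
activations = sigmoid(weights @ pattern + biases)   # all units update in parallel
print("unit activations:", np.round(activations, 3))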

15

What is BFS?

BFS stands for Breadth First Search. It is also known as level order traversal. The
Queue data structure is used for the Breadth First Search traversal. When we use the
BFS algorithm for the traversal in a graph, we can consider any node as a root node.

16

Suppose we consider node 0 as the root node. Therefore, the traversal starts from node 0.

Once node 0 is removed from the Queue, it gets printed and marked as a visited node, and the adjacent nodes of node 0 are inserted into the Queue as shown below:

17

Now node 1 will be removed from the Queue; it gets printed and marked as a visited node.
Once node 1 gets removed from the Queue, all the adjacent nodes of node 1 will be added to the Queue. The adjacent nodes of node 1 are 0, 3, 2, 6, and 5, but we have to insert only unvisited nodes into the Queue. Since nodes 3, 2, 6, and 5 are unvisited, these nodes will be added to the Queue as shown below:

The next node in the Queue is 3. So, node 3 will be removed from the Queue; it gets printed and marked as visited as shown below:

18

Once node 3 gets removed from the Queue, all the adjacent nodes of node 3 except the visited nodes will be added to the Queue. The adjacent nodes of node 3 are 0, 1, 2, and 4. Since nodes 0 and 1 are already visited and node 2 is already present in the Queue, we need to insert only node 4 into the Queue.

19

Now, the next node in the Queue is 2. So, node 2 is removed from the Queue; it gets printed and marked as visited as shown below:

Once node 2 gets removed from the Queue, all the adjacent nodes of node 2 except the visited nodes will be added to the Queue. The adjacent nodes of node 2 are 1, 3, 5, 6, and 4. Since nodes 1 and 3 have already been visited, and 4, 5, and 6 have already been added to the Queue, we do not need to insert any node into the Queue.

20

The next element is 5. So, node 5 is removed from the Queue; it gets printed and marked as visited as shown below:

Once node 5 gets removed from the Queue, all the adjacent nodes of node 5 except the visited nodes will be added to the Queue. The adjacent nodes of node 5 are 1 and 2. Since both nodes have already been visited, there is no vertex to be inserted into the Queue.
The next node is 6. So, node 6 is removed from the Queue; it gets printed and marked as visited as shown below:

21

Once node 6 gets removed from the Queue, all the adjacent nodes of node 6 except the visited nodes will be added to the Queue. The adjacent nodes of node 6 are 1 and 4. Since node 1 has already been visited and node 4 has already been added to the Queue, there is no vertex to be inserted into the Queue. Finally, node 4 is removed from the Queue, printed, and marked as visited, which completes the traversal.
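The walkthrough above can be summarised in code. The original graph figure is not reproduced here, so the adjacency list below is reconstructed from the adjacencies stated in the walkthrough; the relative order of nodes on the same level simply depends on the order in which neighbours are enqueued, so the printed order may differ slightly from the figures. A minimal Python sketch:

from collections import deque

# Adjacency list reconstructed from the walkthrough above (the original
# figure is not shown); the neighbour ordering is an assumption.
graph = {
    0: [1, 3],
    1: [0, 3, 2, 6, 5],
    2: [1, 3, 5, 6, 4],
    3: [0, 1, 2, 4],
    4: [3, 2, 6],
    5: [1, 2],
    6: [1, 4],
}

def bfs(graph, root):
    # Breadth First Search (level order traversal) using a Queue.
    visited = {root}              # nodes already printed or waiting in the Queue
    queue = deque([root])
    order = []
    while queue:
        node = queue.popleft()    # remove the front of the Queue
        order.append(node)        # "print" the node
        for neighbour in graph[node]:
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(neighbour)   # enqueue unvisited neighbours
    return order

print(bfs(graph, 0))   # [0, 1, 3, 2, 6, 5, 4] with this neighbour ordering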

22

DFS stands for Depth First Search. In DFS traversal, the stack data structure is used, which works on the LIFO (Last In, First Out) principle. In DFS, traversal can be started from any node; that is, any node can be considered the root node unless a root node is specified in the problem.
In BFS, when an element is deleted from the Queue, all the adjacent nodes of the deleted node are added to the Queue. In contrast, in DFS, when an element is removed from the stack, only one adjacent node of the removed element is added to the stack at a time.

23

24

Consider node 0 as the root node.

First, we insert element 0 into the stack as shown below:

Node 0 has two adjacent nodes, i.e., 1 and 3. We can take only one adjacent node, either 1 or 3, for traversing. Suppose we consider node 1; therefore, 1 is inserted into the stack and gets printed as shown below:

25

Now we will look at the adjacent vertices of node 1. The unvisited adjacent vertices of node 1 are 3, 2, 5 and 6. We can consider any of these four vertices. Suppose we take node 3 and insert it into the stack as shown below:

26

Consider the unvisited adjacent vertices of node 3. The unvisited adjacent vertices of node 3 are 2 and 4. We can take either of the vertices, i.e., 2 or 4. Suppose we take vertex 2 and insert it into the stack as shown below:

27

The unvisited adjacent vertices of node 2 are 5 and 4. We can choose either of the vertices, i.e., 5 or 4. Suppose we take vertex 4 and insert it into the stack as shown below:

28

Now we will consider the unvisited adjacent vertices of node 4. The only unvisited adjacent vertex of node 4 is node 6. Therefore, element 6 is inserted into the stack as shown below:

After inserting element 6 into the stack, we look at the unvisited adjacent vertices of node 6. As there are no unvisited adjacent vertices of node 6, we cannot move beyond node 6. In this case, we perform backtracking.
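The same reconstructed graph can be traversed depth-first. Because the branch explored at each step is a free choice, the slides' particular order (0, 1, 3, 2, 4, 6, then backtracking) is only one possibility; the recursive sketch below simply follows neighbour-list order and backtracks (by returning from the recursion) whenever a node has no unvisited neighbours.

# Recursive Depth First Search over the same reconstructed adjacency list
# used in the BFS sketch above. Which unvisited neighbour is explored first
# is a free choice; here we follow neighbour-list order, so the result may
# differ from the order chosen in the slides.
graph = {
    0: [1, 3],
    1: [0, 3, 2, 6, 5],
    2: [1, 3, 5, 6, 4],
    3: [0, 1, 2, 4],
    4: [3, 2, 6],
    5: [1, 2],
    6: [1, 4],
}

def dfs(graph, node, visited=None, order=None):
    # Visit the node, then recurse into one unvisited neighbour at a time;
    # returning from the recursion is the backtracking step.
    if visited is None:
        visited, order = set(), []
    visited.add(node)
    order.append(node)                  # "print" the node
    for neighbour in graph[node]:
        if neighbour not in visited:    # go deeper along one branch
            dfs(graph, neighbour, visited, order)
    return order

print(dfs(graph, 0))   # [0, 1, 3, 2, 5, 6, 4] with this neighbour ordering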

29

30

In the water jug problem in Artificial Intelligence, we are provided with two jugs: one with the capacity to hold 3 gallons of water and the other with the capacity to hold 4 gallons of water. There is no other measuring equipment available, and the jugs do not have any kind of markings on them. The agent's task is to fill the 4-gallon jug with exactly 2 gallons of water using only these two jugs and no other material. Initially, both jugs are empty.

31

Production rules for solving the water jug problem

Here, x denotes the amount of water in the 4-gallon jug and y denotes the amount in the 3-gallon jug, so a state is written (x, y).

S.No.  Initial state   Condition              Final state        Description of action taken
1.     (x, y)          x < 4                  (4, y)             Fill the 4-gallon jug completely
2.     (x, y)          y < 3                  (x, 3)             Fill the 3-gallon jug completely
3.     (x, y)          x > 0                  (x - d, y)         Pour some water (d) out of the 4-gallon jug
4.     (x, y)          y > 0                  (x, y - d)         Pour some water (d) out of the 3-gallon jug
5.     (x, y)          x > 0                  (0, y)             Empty the 4-gallon jug
6.     (x, y)          y > 0                  (x, 0)             Empty the 3-gallon jug
7.     (x, y)          x + y >= 4 and y > 0   (4, y - (4 - x))   Pour water from the 3-gallon jug into the 4-gallon jug until it is full
8.     (x, y)          x + y >= 3 and x > 0   (x - (3 - y), 3)   Pour water from the 4-gallon jug into the 3-gallon jug until it is full
9.     (x, y)          x + y <= 4 and y > 0   (x + y, 0)         Pour all the water from the 3-gallon jug into the 4-gallon jug
10.    (x, y)          x + y <= 3 and x > 0   (0, x + y)         Pour all the water from the 4-gallon jug into the 3-gallon jug
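As a sketch of how these rules can be executed mechanically, the snippet below encodes each rule as a guarded state transition on (x, y) and breadth-first searches for a shortest sequence that leaves 2 gallons in the 4-gallon jug. Rules 3 and 4 (pouring out an arbitrary amount d) are omitted because they are not needed for a shortest solution, and the function and variable names are my own, not part of the slides. Note that more than one six-step solution exists, so the sequence printed may differ from the one tabulated on the next slide while still using only the rules above.

from collections import deque

# Each production rule is (label, guard, transition) on the state (x, y),
# where x is the 4-gallon jug and y the 3-gallon jug. Rules 3 and 4 are
# omitted; they pour out an arbitrary amount and are not needed here.
RULES = [
    ("rule 1: fill the 4-gallon jug",   lambda x, y: x < 4,                lambda x, y: (4, y)),
    ("rule 2: fill the 3-gallon jug",   lambda x, y: y < 3,                lambda x, y: (x, 3)),
    ("rule 5: empty the 4-gallon jug",  lambda x, y: x > 0,                lambda x, y: (0, y)),
    ("rule 6: empty the 3-gallon jug",  lambda x, y: y > 0,                lambda x, y: (x, 0)),
    ("rule 7: pour 3 -> 4 until full",  lambda x, y: x + y >= 4 and y > 0, lambda x, y: (4, y - (4 - x))),
    ("rule 8: pour 4 -> 3 until full",  lambda x, y: x + y >= 3 and x > 0, lambda x, y: (x - (3 - y), 3)),
    ("rule 9: pour all of 3 into 4",    lambda x, y: x + y <= 4 and y > 0, lambda x, y: (x + y, 0)),
    ("rule 10: pour all of 4 into 3",   lambda x, y: x + y <= 3 and x > 0, lambda x, y: (0, x + y)),
]

def solve(start=(0, 0), goal_x=2):
    # Breadth-first search over jug states for a shortest rule sequence.
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        (x, y), path = queue.popleft()
        if x == goal_x:
            return path
        for label, guard, transition in RULES:
            if guard(x, y):
                nxt = transition(x, y)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [(label, nxt)]))

for label, state in solve():
    print(f"{label:35s} -> {state}")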

32

The listed production rules contain all the actions that could be performed by the agent in transferring the contents of the jugs. But to solve the water jug problem in a minimum number of moves, the following rules should be applied in the given sequence:

Solution of the water jug problem according to the production rules

S.No.  4-gallon jug contents   3-gallon jug contents   Rule followed
1.     0 gallons               0 gallons               Initial state
2.     0 gallons               3 gallons               Rule no. 2
3.     3 gallons               0 gallons               Rule no. 9
4.     3 gallons               3 gallons               Rule no. 2
5.     4 gallons               2 gallons               Rule no. 7
6.     0 gallons               2 gallons               Rule no. 5
7.     2 gallons               0 gallons               Rule no. 9

33

34
