COA (21CS34)

CACHE MEMORIES

What is Cache Memory?


• Cache memory is a small, high-speed RAM buffer located between
the CPU and main memory.
• Cache memory holds a copy of the instructions (instruction cache)
or data (operand or data cache) currently being used by the CPU.
• The main purpose of a cache is to speed up memory access, and hence the
computer, while keeping the overall cost of the system low.
Types of Cache Mapping
1. Direct Mapping
2. Associative Mapping
3. Set Associative Mapping
DIRECT MAPPING
• Block-j of the main-memory maps onto block j modulo 128 of the cache
(assuming a cache of 128 blocks).
• Thus, when memory-blocks 0, 128, & 256 are loaded into the cache, each is stored in
cache-block 0. Similarly, memory-blocks 1, 129, 257 are stored in cache-block 1.
Contention may arise
1) when the cache is full, or
2) when more than one memory-block is mapped onto a given cache-block
position.
• The contention is resolved by allowing the new block to overwrite the currently
resident block.
• The memory-address alone determines the placement of a block in the cache.
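The placement rule above can be sketched as a small Python function. The cache size of 128 blocks follows the example in the slides; the specific block numbers are just illustrations.

```python
# Direct mapping: each main-memory block j maps to exactly one
# cache-block position, j mod 128 (assuming a 128-block cache,
# as in the example above).
CACHE_BLOCKS = 128

def direct_map(block_j):
    """Return the only cache-block position that block_j can occupy."""
    return block_j % CACHE_BLOCKS

# Memory-blocks 0, 128 and 256 all contend for cache-block 0:
print(direct_map(0), direct_map(128), direct_map(256))   # 0 0 0
# Memory-blocks 1, 129 and 257 all contend for cache-block 1:
print(direct_map(1), direct_map(129), direct_map(257))   # 1 1 1
```

Because the position is fully determined by the address, a direct-mapped lookup needs only one tag comparison, but blocks that share a position keep evicting each other.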
Associative mapping
• The memory-block can be placed into any cache-block position.
• 12 tag-bits identify a memory-block when it is resident in the cache.
• The tag-bits of an address received from the processor are compared with the
tag-bits of every block of the cache.
• This comparison is done to see if the desired block is present.
• It gives complete freedom in choosing the cache-location.
• A new block brought into a full cache must replace an existing block.
• The cache must determine whether a given block is present by searching all
the tags.
Set associative
• It is the combination of direct and associative mapping.
• The blocks of the cache are grouped into sets.
• The mapping allows a block of the main-memory to reside in any block of the
specified set.
• If the cache has 2 blocks per set, memory-blocks 0, 64, 128, ..., 4032 map
into cache set 0.
• A block can occupy either of the two block positions within its set.
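Combining the two previous sketches, the set-index calculation looks like this (assuming the same 128-block cache with 2 blocks per set, i.e. 64 sets, as in the example above):

```python
# Set-associative mapping: memory block j maps to set (j mod 64);
# within that set it may occupy either of the 2 block positions.
CACHE_BLOCKS = 128
BLOCKS_PER_SET = 2
NUM_SETS = CACHE_BLOCKS // BLOCKS_PER_SET   # 64 sets

def set_for_block(block_j):
    """Return the set that block_j maps into."""
    return block_j % NUM_SETS

# Memory-blocks 0, 64, 128, ..., 4032 all map into set 0:
print(set_for_block(0), set_for_block(64), set_for_block(4032))   # 0 0 0
```

The direct part (the set index) keeps the lookup cheap, and the associative part (choice of position within the set) reduces the contention seen in pure direct mapping.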
Replacement algorithms
• Replacement algorithms are used when there is no available space in the cache
in which to place new data.
Four of the most common cache replacement algorithms are described below: 
• Least Recently Used (LRU): The LRU algorithm selects for replacement the item
that has been least recently used by the CPU.
• First-In-First-Out (FIFO): The FIFO algorithm selects for replacement the item that
has been in the cache for the longest time.
• Least Frequently Used (LFU): The LFU algorithm selects for replacement the
item that has been least frequently used by the CPU.
• Random: The random algorithm selects for replacement the item randomly.
• In the direct mapping method, the position of each block is pre-determined, so
no replacement strategy is needed.
• In the associative & set-associative methods, the block position is not
pre-determined. If the cache is full and new blocks are brought into the cache,
the cache-controller must decide which of the old blocks to replace.
• Under the LRU policy, the block with the longest time without being referenced
is the one overwritten.
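The LRU policy described above can be sketched with Python's `OrderedDict`, which keeps blocks in the order they were last referenced (a minimal model; the class name and two-block capacity are illustrative, and real caches track recency in hardware):

```python
from collections import OrderedDict

class LRUCache:
    """Toy cache that evicts the least recently used block when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()        # block number -> data, oldest first

    def access(self, block):
        if block in self.blocks:
            self.blocks.move_to_end(block) # hit: mark block most recently used
            return "hit"
        if len(self.blocks) >= self.capacity:
            self.blocks.popitem(last=False)  # evict the least recently used block
        self.blocks[block] = None          # bring the new block in
        return "miss"

cache = LRUCache(2)
print(cache.access(0))   # miss (cold cache)
print(cache.access(1))   # miss
print(cache.access(0))   # hit, block 0 becomes most recently used
print(cache.access(2))   # miss, evicts block 1 (least recently used)
print(cache.access(1))   # miss again, block 1 was evicted
```

Note how the second reference to block 0 protects it from eviction: block 1, not block 0, is replaced when block 2 arrives.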
THANK YOU
