CACHE MAPPING PROCESS
TOPICS TO BE DISCUSSED
• CACHE MEMORY
• CACHE MAPPING PROCESS
• ASSOCIATIVE MAPPING
• DIRECT MAPPING
• SET ASSOCIATIVE MAPPING
• COMPARISON
CACHE MEMORY
Cache memory is a high-speed memory. To overcome the speed mismatch between the CPU and main memory, a cache memory, whose access time is close to the CPU processing speed, is placed between the CPU and main memory. The CPU first checks the cache: if the data is present, it is read from the cache; otherwise it is read from main memory and a copy is placed in the cache for later reads.
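The read flow described above can be summarized in a short sketch. This is an illustrative Python fragment, not part of the original slides; the names read_word, cache, and main_memory are assumptions, and the cache is modeled as a plain dictionary.

```python
def read_word(address, cache, main_memory):
    """Hypothetical sketch of the hit/miss flow: check the cache first,
    fall back to main memory on a miss, and keep a copy for later reads."""
    if address in cache:             # cache hit: served at cache speed
        return cache[address]
    word = main_memory[address]      # cache miss: read from main memory
    cache[address] = word            # place the word in the cache
    return word
```

The mapping schemes on the following slides differ only in how the cache decides where a given main-memory word may be placed.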
CACHE MAPPING PROCESS
Cache memory is much smaller than main memory, so words from main memory must be loaded into, and replaced in, the cache as the program runs. The scheme that decides how a main-memory word is associated with a cache location is called mapping. There are three ways of mapping:
• Associative mapping (or free mapping)
• Direct mapping (or fixed mapping)
• Set-associative mapping
ASSOCIATIVE MAPPING
• In this method, both the word and its address are stored in the cache. The address stored in the cache is also known as the address tag.
• The address sent by the CPU is compared with all the addresses stored in the cache.
• If an address matches, the corresponding word is fetched from the cache and sent to the CPU.
• If no match is found in the cache, the word is read from main memory, and the word together with its address is then copied into the cache.
BLOCK DIAGRAM (associative mapping): each cache entry holds a full address tag and its data word. The address sent by the CPU (01110) is compared with every stored tag; the matching entry is found and its word (0025) is returned to the CPU.
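As an illustration of associative mapping only (not from the original slides), here is a minimal Python sketch in which every cache entry stores the full address as its tag; the class name AssociativeCache and the FIFO replacement choice are assumptions.

```python
class AssociativeCache:
    """Minimal sketch of associative (free) mapping: every entry holds the
    full address as its tag, so a lookup compares against all stored tags."""

    def __init__(self, size):
        self.size = size
        self.entries = []                    # list of (address tag, word) pairs

    def lookup(self, address):
        for tag, word in self.entries:       # compare the address with every tag
            if tag == address:
                return word                  # hit: return the matched word
        return None                          # miss

    def insert(self, address, word):
        if len(self.entries) >= self.size:   # cache full: evict one entry
            self.entries.pop(0)              # FIFO replacement (an assumption)
        self.entries.append((address, word))
```

For example, after insert(0o01110, 0o0025), a lookup of octal address 01110 compares it against every stored tag and returns 0025, matching the diagram above.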
DIRECT MAPPING
• In this method, the address sent by the CPU is divided into two parts, called the tag field and the index field.
• The index field has as many bits as are needed to address a word in the cache. If a computer has a main memory of 2^m words and a cache of 2^n words, the m address bits are divided into an n-bit index field and an (m - n)-bit tag field.
• The cache stores each word together with its tag field. A word is stored at the cache location given by the index field of its address.
• When the CPU sends an address, the index part selects a location in the cache. If the tag stored at that location matches the tag field of the requested address, the word is fetched.
• If the tag does not match, the word is read from main memory and stored in the cache along with its new tag.
BLOCK DIAGRAM (direct mapping): the CPU address 01110 is split into tag 01 and index 110. The cache location selected by index 110 holds tag 01 and word 0025, so the tags match and 0025 is returned; main memory holds the full addresses and words.
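The tag/index split described above can be sketched as follows. This is an assumed illustration (the class name DirectMappedCache and the parameters m and n are not from the slides), not the slides' own code.

```python
class DirectMappedCache:
    """Sketch of direct (fixed) mapping: for a 2**m-word main memory and a
    2**n-word cache, the low n address bits form the index and the remaining
    m - n bits form the tag stored alongside the word."""

    def __init__(self, m, n):
        self.m = m                               # address width of main memory
        self.n = n                               # address width of the cache
        self.lines = [None] * (2 ** n)           # each line holds (tag, word) or None

    def split(self, address):
        index = address & ((1 << self.n) - 1)    # low n bits  -> index field
        tag = address >> self.n                  # high m - n bits -> tag field
        return tag, index

    def lookup(self, address):
        tag, index = self.split(address)
        line = self.lines[index]
        if line is not None and line[0] == tag:  # tag at that index matches -> hit
            return line[1]
        return None                              # miss

    def insert(self, address, word):
        tag, index = self.split(address)
        self.lines[index] = (tag, word)          # overwrite the old tag and word
```

With m = 15 and n = 9 (an assumed word-addressed configuration), the octal address 01110 splits into tag 01 and index 110 as in the diagram; inserting word 0025 at that address and looking it up again returns 0025.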
SET ASSOCIATIVE MAPPING
• The drawback of direct mapping is that two words with the same index in their addresses but with different tag values cannot reside in the cache at the same time.
• This problem is overcome in set-associative mapping, where more than one tag-word pair can be stored in the cache under a single index.
• If two tag-word pairs can be stored per index, it is called two-way set-associative mapping; if three can be stored, it is three-way set-associative mapping, and so on.
BLOCK DIAGRAM (set-associative mapping): each cache index selects a set holding two tag-word pairs, so two words that share an index but have different tags (e.g., tags 01 and 10) can reside in the cache at the same time.
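A minimal sketch of k-way set-associative mapping follows; it is illustrative only, and the class name SetAssociativeCache and the FIFO eviction within a set are assumptions.

```python
class SetAssociativeCache:
    """Sketch of k-way set-associative mapping: each index selects a set that
    can hold up to k (tag, word) pairs, so words that share an index but
    differ in tag can coexist in the cache."""

    def __init__(self, n, ways):
        self.n = n                                 # index width in bits
        self.ways = ways                           # tag-word pairs per set
        self.sets = [[] for _ in range(2 ** n)]    # one small list per index

    def split(self, address):
        index = address & ((1 << self.n) - 1)      # low n bits -> index
        tag = address >> self.n                    # remaining bits -> tag
        return tag, index

    def lookup(self, address):
        tag, index = self.split(address)
        for stored_tag, word in self.sets[index]:  # search only within the set
            if stored_tag == tag:
                return word                        # hit
        return None                                # miss

    def insert(self, address, word):
        tag, index = self.split(address)
        entries = self.sets[index]
        if len(entries) >= self.ways:              # set full: evict one pair
            entries.pop(0)                         # FIFO within the set (an assumption)
        entries.append((tag, word))
```

A SetAssociativeCache(n=9, ways=2) lets two words whose addresses share the same index bits but differ in tag occupy the same set simultaneously, which is exactly the case direct mapping cannot handle.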
COMPARISON

Cache type          Hit ratio   Search speed
Direct mapped       Good        Best
Fully associative   Best        Moderate
Set associative     Very good   Good