
11. Cache Organization, Operation, and Coherency

11.2 Overview of Cache Operations
As described earlier, caches provide fast temporary data storage, and they make the speedup of memory accesses transparent to the user. In general, the processor accesses cache-resident instructions or data through the following procedure (a simplified software model of this flow is sketched after the list):
1. The processor, through the on-chip cache controller, attempts to access the next instruction or data in the primary cache.
2. The cache controller checks to see if this instruction or data is present in the primary cache.
- If the instruction/data is present, the processor retrieves it. This is called a primary-cache hit.
- If the instruction/data is not present in the primary cache, the cache controller must retrieve it from the secondary cache or memory. This is called a primary-cache miss.
3. If a primary-cache miss occurs, the cache controller checks to see if the instruction/data is in the secondary cache.
- If the instruction/data is present in the secondary cache, it is retrieved and written into the primary cache.
- If the instruction/data is not present in the secondary cache, it is retrieved as a cache line (a block whose size is set in the Config register; see the section titled Variable-Length Cache Lines in this chapter for available cache line lengths) from memory and is written into both the secondary cache and the appropriate primary cache.
4. The processor retrieves the instruction/data from the primary cache and operation continues.
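
The hit/miss sequence above can be modeled in a few lines of C. The sketch below is purely illustrative: in the processor this flow is carried out in hardware by the on-chip cache controller, and the direct-mapped organization, the fixed 32-byte line size (in hardware the line size is taken from the Config register), and all names (probe, fill, read_byte, and so on) are assumptions made for the example, not the actual cache geometry.

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    #define LINE_SIZE   32u      /* bytes per line; the real size comes from Config */
    #define PCACHE_SETS 64u      /* toy primary cache:   64 direct-mapped lines     */
    #define SCACHE_SETS 512u     /* toy secondary cache: 512 direct-mapped lines    */

    typedef struct {
        bool     valid;
        bool     dirty;
        uint32_t tag;
        uint8_t  data[LINE_SIZE];
    } line_t;

    static line_t  pcache[PCACHE_SETS];
    static line_t  scache[SCACHE_SETS];
    static uint8_t memory[1u << 20];     /* simulated main memory (1 MB) */

    /* Return the slot that paddr maps to and report whether it holds that line. */
    static line_t *probe(line_t *cache, uint32_t sets, uint32_t paddr, bool *hit)
    {
        uint32_t index = (paddr / LINE_SIZE) % sets;
        uint32_t tag   = (paddr / LINE_SIZE) / sets;
        line_t  *l     = &cache[index];

        *hit = l->valid && l->tag == tag;
        return l;
    }

    /* Install a line, copied from the next level down, into a cache slot. */
    static void fill(line_t *l, uint32_t sets, uint32_t paddr, const uint8_t *src)
    {
        l->valid = true;
        l->dirty = false;
        l->tag   = (paddr / LINE_SIZE) / sets;
        memcpy(l->data, src, LINE_SIZE);
    }

    /* Steps 1-4 above: try the primary cache, then the secondary, then memory. */
    static uint8_t read_byte(uint32_t paddr)
    {
        bool     hit;
        uint32_t base = paddr & ~(LINE_SIZE - 1u);   /* start of the enclosing line */

        line_t *pl = probe(pcache, PCACHE_SETS, paddr, &hit);
        if (hit)                                     /* primary-cache hit */
            return pl->data[paddr % LINE_SIZE];

        line_t *sl = probe(scache, SCACHE_SETS, paddr, &hit);
        if (!hit)                                    /* miss in both levels: fetch  */
            fill(sl, SCACHE_SETS, paddr, &memory[base]);  /* the line from memory   */

        fill(pl, PCACHE_SETS, paddr, sl->data);      /* copy the line into primary  */
        return pl->data[paddr % LINE_SIZE];          /* retrieve from primary       */
    }

    int main(void)
    {
        memory[0x1234] = 0xAB;
        printf("first access:  0x%02X (primary-cache miss)\n", read_byte(0x1234));
        printf("second access: 0x%02X (primary-cache hit)\n",  read_byte(0x1234));
        return 0;
    }

Running the example shows the first access missing in both caches and fetching the line from memory, while the second access to the same address hits in the primary cache.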
It is possible for the same data to be in three places simultaneously: main memory, the secondary cache, and the primary cache. This data is kept consistent through the use of a write-back methodology; that is, modified data is not written back to memory until the cache line is replaced.
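
The write-back policy can be added to the same toy model with a store path (write_byte below, another hypothetical name): a store modifies only the primary-cache copy and sets the line's dirty bit, and the modified data is copied back only when the slot must be reused for a different address. For brevity this sketch pushes the evicted line to both the secondary cache and memory; a real write-back hierarchy would mark the secondary line dirty and update memory only when the secondary cache in turn replaces the line.

    /* Write-back sketch: a store updates only the cached copy; the lower
     * levels see the modified data only when the dirty line is evicted. */
    static void write_byte(uint32_t paddr, uint8_t value)
    {
        bool    hit;
        line_t *pl = probe(pcache, PCACHE_SETS, paddr, &hit);

        if (!hit) {
            if (pl->valid && pl->dirty) {
                /* The victim line was modified: reconstruct its address and
                 * copy it back before the slot is reused (the write-back). */
                uint32_t index       = (paddr / LINE_SIZE) % PCACHE_SETS;
                uint32_t victim_base = (pl->tag * PCACHE_SETS + index) * LINE_SIZE;
                bool     s_hit;
                line_t  *sl = probe(scache, SCACHE_SETS, victim_base, &s_hit);

                if (s_hit)                             /* keep the secondary consistent */
                    memcpy(sl->data, pl->data, LINE_SIZE);
                memcpy(&memory[victim_base], pl->data, LINE_SIZE);
            }
            read_byte(paddr);            /* refill this slot as on a load miss */
        }

        pl->data[paddr % LINE_SIZE] = value;   /* update only the cached copy        */
        pl->dirty = true;                      /* memory is stale until replacement  */
    }

Adding, for example, write_byte(0x1234, 0x5A) followed by another read to main shows the new value being served from the cache while memory still holds 0xAB until the line is eventually replaced.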





