Victim Cache Simulator

One approach to lowering the miss penalty is to remember what was discarded in case it is needed again. Since the discarded data has already been fetched, it can be reused at small cost. Such recycling is possible with a victim cache. The victim cache was originally proposed as a way to reduce the conflict misses of a direct-mapped cache without affecting its fast access time. It is a small fully associative cache, typically 4 to 16 cache lines, residing between a direct-mapped L1 cache and the next level of the memory hierarchy. On a main-cache miss, the victim cache is checked before going to the next level. If the address hits in the victim cache, the desired data is returned to the CPU and also promoted to the main cache, replacing its conflicting competitor; the line evicted from the main cache is transferred to the victim cache. On a miss in the victim cache, the next level of memory is accessed, the arriving data fills the line in the main cache, and the displaced line moves to the victim cache. The entry replaced in the victim cache is discarded and, if dirty, written back to the next level of memory.
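The behavior described above can be sketched as a small simulation. This is a minimal illustration in Python, not the simulator itself: it assumes an LRU-replaced victim cache, models only tags (no data), and ignores dirty-line write-back; the class and parameter names are invented for the example.

```python
from collections import OrderedDict

class VictimCacheSim:
    """Direct-mapped L1 backed by a small fully associative victim cache (LRU)."""

    def __init__(self, l1_lines=8, victim_lines=4, line_size=16):
        self.l1_lines = l1_lines
        self.line_size = line_size
        self.l1 = [None] * l1_lines        # one tag (or None) per direct-mapped set
        self.victim = OrderedDict()        # tag -> True, kept in LRU order
        self.victim_lines = victim_lines
        self.hits = self.victim_hits = self.misses = 0

    def access(self, addr):
        block = addr // self.line_size
        idx = block % self.l1_lines        # direct-mapped index
        tag = block
        if self.l1[idx] == tag:
            self.hits += 1
            return "L1 hit"
        evicted = self.l1[idx]
        if tag in self.victim:
            # Victim hit: promote the line to L1 and swap the
            # conflicting L1 line into the victim cache.
            self.victim_hits += 1
            del self.victim[tag]
            self.l1[idx] = tag
            if evicted is not None:
                self.victim[evicted] = True
            return "victim hit"
        # Miss everywhere: fetch from the next level; the displaced
        # L1 line moves to the victim cache, possibly discarding its LRU entry.
        self.misses += 1
        self.l1[idx] = tag
        if evicted is not None:
            self.victim[evicted] = True
            if len(self.victim) > self.victim_lines:
                self.victim.popitem(last=False)
        return "miss"
```

With the default geometry, addresses 0 and 128 conflict in the same direct-mapped set; alternating between them would thrash a plain direct-mapped cache, but after the two compulsory misses every access hits in the victim cache:

```python
sim = VictimCacheSim()
print([sim.access(a) for a in [0, 128, 0, 128, 0, 128]])
# → ['miss', 'miss', 'victim hit', 'victim hit', 'victim hit', 'victim hit']
```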

